:zap: NEWS - 11/07/2020 :zap:
SceneformMaintained is now part of the SceneView Open Community.
Everyone interested can participate in improvements or make feature requests.
Come talk with us on the Discord channel (Please don't use it for issues, go to the GitHub Issues section instead)
Mail us sceneview@gorisse.com (Please don't use it for issues, go to the GitHub Issues section instead)
:zap: :zap:
:zap: NEWS - 11/01/2020 :zap:
We are currently releasing the successor of Sceneform Maintained, which has great improvements and the latest ARCore features: DepthHit, InstantPlacement, ...
More info: SceneView for Android
If you want to help us, see what it looks like, or be an early user, you can participate here for early access.
:zap: :zap:
- Android gradle dependency in Kotlin/Java
- No OpenGL or Unity needed
- Latest versions of ARCore SDK and Filament
- Latest versions of Android dependencies (Android Build tools, AndroidX,...)
- Available on mavenCentral()
- Supports glTF format
- glTF/glb with animations support
- Augmented Images supported
- Augmented Faces supported
- Depth supported
- Simple model loading for basic usage
This repository is a fork of Sceneform, Copyright (c) 2021 Google Inc. All rights reserved.
:star: Star the repository to help the project get known
Dependencies
app/build.gradle
dependencies {
    implementation("com.gorisse.thomas.sceneform:sceneform:1.23.0")
}
more...
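The library is published on mavenCentral(). If your project does not already declare that repository, a minimal sketch (where this block lives depends on your Gradle version and setup) is:
// settings.gradle(.kts), or the allprojects block of the project-level build.gradle, depending on your Gradle setup
dependencyResolutionManagement {
    repositories {
        google()        // Android / ARCore artifacts
        mavenCentral()  // Sceneform Maintained artifacts
    }
}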
Usage (Simple model viewer)
Update your AndroidManifest.xml
AndroidManifest.xml
<uses-permission android:name="android.permission.CAMERA" />
<application>
    ...
    <meta-data android:name="com.google.ar.core" android:value="optional" />
</application>
more...
Add the View to your layout
res/layout/main_activity.xml
<androidx.fragment.app.FragmentContainerView
    android:id="@+id/arFragment"
    android:name="com.google.ar.sceneform.ux.ArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
sample...
Edit your Activity or Fragment
src/main/java/.../MainActivity.kt
override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // Load model.glb from assets folder or http url
    (supportFragmentManager.findFragmentById(R.id.arFragment) as ArFragment)
        .setOnTapPlaneGlbModel("model.glb")
}
Or
src/main/java/.../MainFragment.kt
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    // Load model.glb from assets folder or http url
    (childFragmentManager.findFragmentById(R.id.arFragment) as ArFragment)
        .setOnTapPlaneGlbModel("https://storage.googleapis.com/ar-answers-in-search-models/static/Tiger/model.glb")
}
kotlin sample...
java sample...
Samples
glTF with animation
full video...
arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        // Create the transformable model and add it to the anchor
        addChild(TransformableNode(arFragment.transformationSystem).apply {
            renderable = model
            renderableInstance.animate(true).start()
        })
    })
}
kotlin sample project...
java sample project...
Depth Occlusion
arFragment.apply {
    setOnSessionConfigurationListener { session, config ->
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            config.depthMode = Config.DepthMode.AUTOMATIC
        }
    }
    setOnViewCreatedListener { arSceneView ->
        // Available modes: DEPTH_OCCLUSION_DISABLED, DEPTH_OCCLUSION_ENABLED
        arSceneView.cameraStream.depthOcclusionMode =
            CameraStream.DepthOcclusionMode.DEPTH_OCCLUSION_ENABLED
    }
}
documentation...
sample project...
Augmented Images
sample project...
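If you don't want to start from the sample project, here is a hedged sketch: register an ARCore AugmentedImageDatabase in the session configuration listener shown above, then anchor a node on each newly tracked image. The asset name "image.png" and the tracking bookkeeping are only illustrative.
arFragment.apply {
    setOnSessionConfigurationListener { session, config ->
        // Register the reference image(s) ARCore should track ("image.png" is a placeholder asset)
        config.augmentedImageDatabase = AugmentedImageDatabase(session).apply {
            addImage("image", requireContext().assets.open("image.png").use { BitmapFactory.decodeStream(it) })
        }
    }
    setOnViewCreatedListener { arSceneView ->
        val anchoredImages = mutableSetOf<Int>()
        // Check every frame for newly tracked images
        arSceneView.scene.addOnUpdateListener {
            arSceneView.arFrame?.getUpdatedTrackables(AugmentedImage::class.java)
                ?.filter { it.trackingState == TrackingState.TRACKING && anchoredImages.add(it.index) }
                ?.forEach { augmentedImage ->
                    // Anchor a node on the center of the detected image and attach your content to it
                    arSceneView.scene.addChild(AnchorNode(augmentedImage.createAnchor(augmentedImage.centerPose)))
                }
        }
    }
}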
Augmented Faces
sample project...
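Again as a hedged sketch (the sample project is the reference): enable ARCore's face mode in the session configuration and attach an AugmentedFaceNode to each tracked face. This assumes your fragment is already set up for the front-facing camera; binding a face texture or regions model to the node is up to you.
arFragment.apply {
    setOnSessionConfigurationListener { session, config ->
        // Enable ARCore face tracking (front-facing camera required)
        config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
    }
    setOnViewCreatedListener { arSceneView ->
        val faceNodes = mutableMapOf<AugmentedFace, AugmentedFaceNode>()
        arSceneView.scene.addOnUpdateListener {
            arSceneView.arFrame?.getUpdatedTrackables(AugmentedFace::class.java)?.forEach { face ->
                when (face.trackingState) {
                    // One AugmentedFaceNode per tracked face
                    TrackingState.TRACKING -> faceNodes.getOrPut(face) {
                        AugmentedFaceNode(face).also { arSceneView.scene.addChild(it) }
                    }
                    TrackingState.STOPPED -> faceNodes.remove(face)?.setParent(null)
                    else -> Unit
                }
            }
        }
    }
}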
Cloud Anchors
// Create a new anchor whose pose ARCore will try to resolve using the ARCore Cloud Anchor service and the provided cloudAnchorId
sceneView.session?.resolveCloudAnchor(cloudAnchorId)?.let { resolvedAnchor ->
    sceneView.scene.addChild(AnchorNode(resolvedAnchor).apply {
        addChild(VideoNode(context, MediaPlayer.create(context, R.raw.restaurant_presentation).apply {
            start()
        }, null))
    })
}
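Resolving assumes the anchor was previously hosted from another device or session. As a hedged sketch of the hosting side, using ARCore's Session.hostCloudAnchor and the session configuration listener shown earlier (Cloud Anchors must be enabled on the session for both hosting and resolving):
arFragment.setOnSessionConfigurationListener { session, config ->
    // Cloud Anchors must be enabled before hosting or resolving
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
}

arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Ask the ARCore Cloud Anchor service to host the tapped pose
    val cloudAnchor = arFragment.arSceneView.session?.hostCloudAnchor(hitResult.createAnchor())
    // Once cloudAnchor?.cloudAnchorState == CloudAnchorState.SUCCESS,
    // share cloudAnchor?.cloudAnchorId so another device can resolve it as shown above
}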
Environment Lights
sample project...
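The environment lighting relies on ARCore's Environmental HDR light estimation. A minimal sketch, if you want to set the light estimation mode explicitly in the session configuration:
arFragment.setOnSessionConfigurationListener { session, config ->
    // Estimate the main directional light, ambient spherical harmonics and reflections
    // from the camera image
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
}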
Video texture
arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        addChild(VideoNode(context, MediaPlayer.create(context, R.raw.video).apply {
            start()
        }, chromaKeyColor, null))
    })
}
sample project...
Dynamic materials/textures
sample project...
Non AR usage
sample project...
Demo
Emulator
Known working configuration
more...
Go further
AR Required vs AR Optional
If your app requires ARCore (AR Required) and is not only AR Optional, use this manifest to indicate that the app requires Google Play Services for AR (AR Required). This results in the app only being visible in the Google Play Store on devices that support ARCore:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>
<application>
    ...
    <meta-data android:name="com.google.ar.core" android:value="required" />
</application>
more...
Nodes
To add one or multiple nodes to the Scene when the user presses on a surface, you can override the onTapPlane function from a BaseArFragment.OnTapArPlaneListener:
arFragment.setOnTapArPlaneListener(::onTapPlane)

arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        // Create the transformable model and add it to the anchor.
        addChild(TransformableNode(arFragment.transformationSystem).apply {
            renderable = model
            renderableInstance.animate(true).start()
            // Add a child model relative to the parent model
            addChild(Node().apply {
                // Define the relative position
                localPosition = Vector3(0.0f, 1f, 0.0f)
                // Define the relative scale
                localScale = Vector3(0.7f, 0.7f, 0.7f)
                renderable = modelView
            })
        })
    })
}
sample...
Remove or Hide a node
Remove an AnchorNode from the Scene
Remove a Model Node, VideoNode, AugmentedFaceNode,... from the Scene
Show/Hide a Node = Don't render it
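A minimal sketch of the three cases (anchorNode and node stand for nodes you previously added; the names are only illustrative):
// Remove an AnchorNode from the Scene: detach its ARCore anchor and remove it from its parent
anchorNode.anchor?.detach()
arFragment.arSceneView.scene.removeChild(anchorNode)

// Remove a model Node, VideoNode, AugmentedFaceNode,... from the Scene
node.setParent(null)

// Show/Hide a Node = don't render it (the node stays in the scene graph)
node.isEnabled = false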
documentation...
Frame Rate (FPS-Bound)
Upper-Bound
The update rate of the rendering is limited by the camera config used by ARCore. For most smartphones it is 30 fps, and for Pixel smartphones it is 60 fps. You can manually change this value (only do so if you know what you are doing).
arFragment.setOnViewCreatedListener { arSceneView ->
    // Set a higher bound for the frame rate
    arSceneView.setMaxFramesPerSeconds(60)
}
The default value is 60.
documentation...
Animations
For now, only RenderableInstance is animatable. Below, model corresponds to a RenderableInstance returned from node.getRenderableInstance()
Basic usage
For a very basic 3D model, like a single infinitely rotating sphere, you should not have to
use ModelAnimator; instead, just call:
model.animate(repeat).start();
Single Model with Single Animation
If you want to animate a single model to a specific timeline position, use:
ModelAnimator.ofAnimationFrame(model, "AnimationName", 100).start();
ModelAnimator.ofAnimationFraction(model, "AnimationName", 0.2f, 0.8f, 1f).start();
ModelAnimator.ofAnimationTime(model, "AnimationName", 10.0f).start();
Where can I find the "AnimationName" ?
The animation names are defined at the 3D model level.
You can compare them to tracks, each playing something corresponding to a particular behavior in your model.
For example, in Blender "AnimationName" can correspond to:
- An action defined inside the Nonlinear Animation view port
- A single object behavior in the Timeline view port
To know the actual animation names of a glb/gltf file, you can drag it on a glTF Viewer like here and find it in the animation list.
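Alternatively, you can list the names at runtime; this is a hedged sketch assuming the animation accessors exposed by RenderableInstance in this fork (getAnimationCount() / getAnimation(i)):
// Hedged sketch: log the animation names embedded in the loaded model
val model = node.renderableInstance
for (i in 0 until model.animationCount) {
    Log.d("Animations", model.getAnimation(i).name)
}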
Values
- A single time, frame, or fraction value will animate from the current position to the desired value
- Two values mean from value1 to value2
- More than two values mean from value1 to value2, then to value3
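For example, with the ofAnimationTime variant shown above:
// One value: animate from the current time position to 3 seconds
ModelAnimator.ofAnimationTime(model, "AnimationName", 3.0f).start()
// Two values: animate from 0 to 3 seconds
ModelAnimator.ofAnimationTime(model, "AnimationName", 0.0f, 3.0f).start()
// More than two values: animate from 0 to 3 seconds, then back to 1 second
ModelAnimator.ofAnimationTime(model, "AnimationName", 0.0f, 3.0f, 1.0f).start()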
Single Model with Multiple Animations
If the model is a character, for example, there may be one ModelAnimation for a walk cycle, a
second for a jump, a third for sidestepping, and so on:
Play Sequentially
AnimatorSet animatorSet = new AnimatorSet();
animatorSet.playSequentially(ModelAnimator.ofMultipleAnimations(model, "walk", "run"));
animatorSet.start();
Auto Cancel
Here you can see that no call to animator.cancel() is required because animator.setAutoCancel(boolean) is set to true by default:
ObjectAnimator walkAnimator = ModelAnimator.ofAnimation(model, "walk");
walkButton.setOnClickListener(v -> walkAnimator.start());
ObjectAnimator runAnimator = ModelAnimator.ofAnimation(model, "run");
runButton.setOnClickListener(v -> runAnimator.start());
Multiple Models with Multiple Animations
For a synchronized animation set, like animating a complete scene with multiple models at the same time or sequentially, consider using an AnimatorSet with one ModelAnimator parameterized per step:
AnimatorSet completeFly = new AnimatorSet();
ObjectAnimator liftOff = ModelAnimator.ofAnimationFraction(airPlaneModel, "FlyAltitude",0, 40);
liftOff.setInterpolator(new AccelerateInterpolator());
AnimatorSet flying = new AnimatorSet();
ObjectAnimator flyAround = ModelAnimator.ofAnimation(airPlaneModel, "FlyAround");
flyAround.setRepeatCount(ValueAnimator.INFINITE);
flyAround.setDuration(10000);
ObjectAnimator airportBusHome = ModelAnimator.ofAnimationFraction(busModel, "Move", 0);
flying.playTogether(flyAround, airportBusHome);
ObjectAnimator land = ModelAnimator.ofAnimationFraction(airPlaneModel, "FlyAltitude", 0);
land.setInterpolator(new DecelerateInterpolator());
completeFly.playSequentially(liftOff, flying, land);
Morphing animation
Assuming a character object has a skeleton, one keyframe track could store the data for the
position changes of the lower arm bone over time, a different track the data for the rotation
changes of the same bone, a third track the position, rotation, or scaling of another bone, and so
on. It should be clear that a ModelAnimation can act on many such tracks.
Assuming the model has morph targets (for example one morph target showing a friendly face
and another showing an angry face), each track holds the information as to how the influence of a
certain morph target changes during the performance of the clip.
In a glTF context, this Animator updates matrices according to the glTF animation and skin definitions.
ModelAnimator can be used for two things
- Updating matrices in
TransformManager
components according to the model animation definitions.
- Updating bone matrices in
RenderableManager
components according to the model skin definitions.
Every PropertyValuesHolder that applies a modification on the time position of the animation
must use ModelAnimation.TIME_POSITION instead of its own Property, so that
any ObjectAnimator operating time modifications on the same ModelAnimation can be cancelled.
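As a hedged sketch (getAnimation(0) is an assumed accessor; ModelAnimation.TIME_POSITION is the property named above):
// Drive the time position of an animation with a raw ObjectAnimator.
// Using ModelAnimation.TIME_POSITION lets conflicting time animators on the
// same ModelAnimation auto-cancel each other.
val animation = model.getAnimation(0) // assumed accessor
ObjectAnimator.ofPropertyValuesHolder(
    animation,
    PropertyValuesHolder.ofFloat(ModelAnimation.TIME_POSITION, 0.0f, 3.0f)
).apply {
    setDuration(3000L)
    start()
}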
more...
License
Please see the
LICENSE
file.
Brand Guidelines
The Sceneform trademark is a trademark of Google, and is not subject to the
copyright or patent license grants contained in the Apache 2.0-licensed
Sceneform repositories on GitHub. Any uses of the Sceneform trademark other than
those permitted in these guidelines must be approved by Google in advance.
Purpose of the Brand Guidelines
These guidelines exist to ensure that the Sceneform project can share its
technology under open source licenses while making sure that the "Sceneform"
brand is protected as a meaningful source identifier in a way that's consistent
with trademark law. By adhering to these guidelines, you help to promote the
freedom to use and develop high-quality Sceneform technology.
Acceptable uses
Because we are open-sourcing the Sceneform technology, you may use the Sceneform
trademark to refer to the project without prior written permission. Examples of
these approved references include the following:
- To refer to the Sceneform project itself;
- To refer to unmodified source code or other files shared by the Sceneform
repositories on GitHub;
- To accurately identify that your design or implementation is based on, is for
use with, or is compatible with the Sceneform technology.
Examples:
- "[Your Product] for Sceneform."
- "[Your Product] is a fork of the Sceneform project."
- "[Your Product] is compatible with Sceneform."
Usage guidelines
- The Sceneform name may never be used or registered in a manner that would
cause confusion as to Google's sponsorship, affiliation, or endorsement.
- Don't use the Sceneform name, or a confusingly similar term, as part of your
company name, product name, domain name, or social media profile.
- Other than as permitted by these guidelines, the Sceneform name should not be
combined with other trademarks, terms, or source identifiers.
- Don't remove, distort or alter the Sceneform name. That includes modifying the
Sceneform name, for example, through hyphenation, combination, or
abbreviation. Do not shorten, abbreviate, or create acronyms out of the
Sceneform name.
- Don't display the Sceneform name using any different stylization, color, or
font from the surrounding text.
- Don't use the term Sceneform as a verb, or use it in possessive form.
Terms & Conditions
By downloading the Sceneform SDK for Android, you agree that the
Google APIs Terms of Service governs
your use thereof.
User privacy requirements
You must disclose the use of Google Play Services for AR (ARCore) and how it
collects and processes data, prominently in your application, easily accessible
to users. You can do this by adding the following text on your main menu or
notice screen: "This application runs on
Google Play Services for AR
(ARCore), which is provided by Google LLC and governed by the
Google Privacy Policy".