Sceneform Maintained SDK for Android

:zap: NEWS - 11/07/2020 :zap:

SceneformMaintained is now part of the SceneView Open Community.

Everyone interested can participate in improvements or make feature requests.

Come talk with us on the Discord channel (Please don't use it for issues, go to the GitHub Issues section instead)

Mail us at sceneview@gorisse.com (Please don't use it for issues, go to the GitHub Issues section instead)

:zap: :zap:



:zap: NEWS - 11/01/2020 :zap:

We are currently releasing the successor of Sceneform Maintained, which has great improvements and the latest ARCore features: DepthHit, InstantPlacement,...

More info: SceneView for Android

If you want to help us, see what it looks like or be an early user, you can participate here for early access.

:zap: :zap:


Maven Central

This repository is a fork of Sceneform, Copyright (c) 2021 Google Inc. All rights reserved.

Maintained and continued by Nikita Zaytsev, Vojta Maiwald, Brigido Rodriguez, Fvito, Marius Kajczuga, Robert Gregat and Thomas Gorisse


:star: Star the repository to help the project get known


Dependencies

app/build.gradle

dependencies {
    implementation("com.gorisse.thomas.sceneform:sceneform:1.23.0")
}
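
The library is published on Maven Central, so that repository must be available to your project. A minimal sketch, assuming a settings.gradle.kts with centralized repository declarations:

dependencyResolutionManagement {
    repositories {
        google()        // ARCore and AndroidX dependencies
        mavenCentral()  // hosts com.gorisse.thomas.sceneform:sceneform
    }
}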

more...

Usage (Simple model viewer)

Update your AndroidManifest.xml

AndroidManifest.xml

<uses-permission android:name="android.permission.CAMERA" />

<application>
    ...
    <meta-data android:name="com.google.ar.core" android:value="optional" />
</application>

more...

Add the View to your layout

res/layout/main_activity.xml

<androidx.fragment.app.FragmentContainerView
    android:id="@+id/arFragment"
    android:name="com.google.ar.sceneform.ux.ArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />

sample...

Edit your Activity or Fragment

src/main/java/.../MainActivity.kt


override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    // Load model.glb from assets folder or http url
    (supportFragmentManager.findFragmentById(R.id.arFragment) as ArFragment)
        .setOnTapPlaneGlbModel("model.glb")
}

Or

src/main/java/.../MainFragment.kt

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)

    // Load model.glb from assets folder or http url
    (childFragmentManager.findFragmentById(R.id.arFragment) as ArFragment)
        .setOnTapPlaneGlbModel("https://storage.googleapis.com/ar-answers-in-search-models/static/Tiger/model.glb")
}

kotlin sample...

java sample...

Samples

glTF with animation

Full Video
full video...

arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        // Create the transformable model and add it to the anchor
        addChild(TransformableNode(arFragment.transformationSystem).apply {
            renderable = model
            renderableInstance.animate(true).start()
        })
    })
}

kotlin sample project...

java sample project...

Depth Occlusion

Depth Occlusion 01 Depth Occlusion 02 Depth Occlusion 03
arFragment.apply {
    setOnSessionConfigurationListener { session, config ->
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            config.depthMode = Config.DepthMode.AUTOMATIC
        }
    }
    setOnViewCreatedListener { arSceneView ->
        // Available modes: DEPTH_OCCLUSION_DISABLED, DEPTH_OCCLUSION_ENABLED
        arSceneView.cameraStream.depthOcclusionMode =
            CameraStream.DepthOcclusionMode.DEPTH_OCCLUSION_ENABLED
    }
}

documentation...

sample project...

Augmented Images

Augmented Images 01
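
A minimal sketch of how a reference image could be registered and detected, assuming the image is bundled in the assets folder as "earth.jpg" (the asset name, image name, and what you attach to the anchor are hypothetical; see the sample project for the complete flow):

val detectedImages = mutableSetOf<String>()
arFragment.apply {
    setOnSessionConfigurationListener { session, config ->
        // Register the reference image(s) ARCore should look for (hypothetical asset name)
        config.augmentedImageDatabase = AugmentedImageDatabase(session).apply {
            addImage("earth", BitmapFactory.decodeStream(requireContext().assets.open("earth.jpg")))
        }
    }
    setOnViewCreatedListener { arSceneView ->
        arSceneView.scene.addOnUpdateListener {
            // Check each frame for newly tracked reference images and anchor content to them once
            arSceneView.arFrame?.getUpdatedTrackables(AugmentedImage::class.java)
                ?.filter { it.trackingState == TrackingState.TRACKING }
                ?.forEach { augmentedImage ->
                    if (detectedImages.add(augmentedImage.name)) {
                        arSceneView.scene.addChild(AnchorNode(augmentedImage.createAnchor(augmentedImage.centerPose)))
                    }
                }
        }
    }
}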

sample project...

Augmented Faces

Augmented Faces 01 Augmented Faces 02 Augmented Faces 03

sample project...

Cloud Anchors


// Create a new anchor whose pose ARCore will try to resolve using the ARCore Cloud Anchor
// service and the provided cloudAnchorId
sceneView.session?.resolveCloudAnchor(cloudAnchorId)?.let { resolvedAnchor ->
    sceneView.scene.addChild(AnchorNode(resolvedAnchor).apply {
        addChild(VideoNode(context, MediaPlayer.create(context, R.raw.restaurant_presentation).apply {
            start()
        }, null))
    })
}
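
The snippet above resolves an anchor that was hosted earlier. Hosting is the opposite operation; a minimal sketch, assuming cloud anchors are enabled in the session config (config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED) and anchor is a local anchor you just created, e.g. from a hit test:

// Ask the ARCore Cloud Anchor service to host the local anchor
val hostedAnchor = sceneView.session?.hostCloudAnchor(anchor)

// Check the hosting state on each frame; once it is SUCCESS, share hostedAnchor.cloudAnchorId
// with other users so they can resolve the same anchor
sceneView.scene.addOnUpdateListener {
    if (hostedAnchor?.cloudAnchorState == Anchor.CloudAnchorState.SUCCESS) {
        Log.d("CloudAnchors", "Hosted anchor id: ${hostedAnchor.cloudAnchorId}")
    }
}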

Environment Lights

Environment Lights 01 Environment Lights 02 Environment Lights 03
Environment Lights 04 Environment Lights 05 Environment Lights 06

sample project...

Video texture

Video texture 01 Video texture 02 Video texture 03
arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        addChild(VideoNode(context, MediaPlayer.create(context, R.raw.video).apply {
            start()
        }, chromaKeyColor, null))
    })
}

sample project...

Dynamic materials/textures

Dynamic materials 01 Dynamic materials 02

sample project...

Non AR usage

Non AR Usage 01
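
A minimal sketch of a non-AR setup, assuming the layout declares a plain com.google.ar.sceneform.SceneView with the hypothetical id sceneView and that model.glb is bundled in the assets folder:

// Load a glTF model and attach it to the scene, one meter in front of the camera
ModelRenderable.builder()
    .setSource(context, Uri.parse("model.glb"))
    .setIsFilamentGltf(true)
    .build()
    .thenAccept { renderable ->
        sceneView.scene.addChild(Node().apply {
            this.renderable = renderable
            localPosition = Vector3(0.0f, 0.0f, -1.0f)
        })
    }

Depending on your setup, you may also need to forward your Activity or Fragment lifecycle to sceneView.resume() and sceneView.pause().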

sample project...

Demo

Get it on Google Play

Youtube Video 01 Youtube Video 02

Emulator

Known working configuration


more...

Go further

AR Required vs AR Optional

If your app requires ARCore (AR Required) rather than using it optionally (AR Optional), use this manifest to indicate that the app requires Google Play Services for AR; the app will then only be visible in the Google Play Store on devices that support ARCore:

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>

<application>
    ...
    <meta-data android:name="com.google.ar.core" android:value="required" />
</application>
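
For AR Optional apps, the usual complement is a runtime check through ArCoreApk before exposing any AR entry point; a minimal sketch (arButton is a hypothetical button that launches the AR screen):

// Only offer the AR feature when Google Play Services for AR is supported on this device
// (checkAvailability() may return a transient UNKNOWN_* state, so re-check if needed)
val availability = ArCoreApk.getInstance().checkAvailability(context)
arButton.isEnabled = availability.isSupported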

more...

Nodes

To add a node or multiple nodes to the Scene when the user presses on a surface, you can override the onTapPlane function from a BaseArFragment.OnTapArPlaneListener:

// Either pass a function reference...
arFragment.setOnTapArPlaneListener(::onTapPlane)

// ...or set the listener inline:
arFragment.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    // Create the Anchor
    arFragment.arSceneView.scene.addChild(AnchorNode(hitResult.createAnchor()).apply {
        // Create the transformable model and add it to the anchor.
        addChild(TransformableNode(arFragment.transformationSystem).apply {
            renderable = model
            renderableInstance.animate(true).start()
            // Add a child model relative to the parent model
            addChild(Node().apply {
                // Define the relative position
                localPosition = Vector3(0.0f, 1f, 0.0f)
                // Define the relative scale
                localScale = Vector3(0.7f, 0.7f, 0.7f)
                renderable = modelView
            })
        })
    })
}

sample...

Remove or Hide a node

Remove an AnchorNode from the Scene

anchorNode.anchor = null

Remove a Model Node, VideoNode, AugmentedFaceNode,... from the Scene

node.parent = null

Show/Hide a Node = Don't render it

node.enabled = false

documentation...

Frame Rate (FPS-Bound)

Upper-Bound

The update rate of the rendering is limited by the ARCore camera config in use. For most smartphones it is 30 fps, and for Pixel smartphones it is 60 fps. You can change this value manually (only if you know what you are doing):

arFragment.setOnViewCreatedListener { arSceneView ->
    // Set a higher bound for the frame rate
    arSceneView.setMaxFramesPerSeconds(60)
}

The default value is 60.

documentation...

Animations

Currently, only RenderableInstance is animatable. Below, model corresponds to a RenderableInstance returned from node.getRenderableInstance().

Basic usage

On a very basic 3D model, like a single infinitely rotating sphere, you should not have to use ModelAnimator; instead, just call:

model.animate(repeat).start();

Single Model with Single Animation

If you want to animate a single model to a specific timeline position, use:

ModelAnimator.ofAnimationFrame(model, "AnimationName", 100).start();
ModelAnimator.ofAnimationFraction(model, "AnimationName", 0.2f, 0.8f, 1f).start();
ModelAnimator.ofAnimationTime(model, "AnimationName", 10.0f).start();

Where can I find the "AnimationName"?

The animation names are defined at the 3D model level.
You can compare them to tracks, each playing something corresponding to a particular behavior of your model.

For example, in Blender, an "AnimationName" can correspond to the name of an Action defined on your model.

To know the actual animation names of a glb/gltf file, you can drag it onto a glTF viewer like here and find them in the animation list.
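
You can also list the names at runtime; a small sketch, assuming the RenderableInstance exposes getAnimationCount() and getAnimation(index) and that ModelAnimation exposes getName() (check the documentation of your version):

// Log every animation name embedded in the loaded glb/gltf file (assumed accessors, see above)
for (i in 0 until model.animationCount) {
    Log.d("Animations", model.getAnimation(i).name)
}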

Values

Single Model with Multiple Animations

If the model is a character, for example, there may be one ModelAnimation for a walk cycle, a second for a jump, a third for sidestepping, and so on:

Play Sequentially

AnimatorSet animatorSet = new AnimatorSet();
animatorSet.playSequentially(ModelAnimator.ofMultipleAnimations(model, "walk", "run"));
animatorSet.start();

Auto Cancel

Here you can see that no call to animator.cancel() is required because animator.setAutoCancel(boolean) is set to true by default.

ObjectAnimator walkAnimator = ModelAnimator.ofAnimation(model, "walk");
walkButton.setOnClickListener(v -> walkAnimator.start());

ObjectAnimator runAnimator = ModelAnimator.ofAnimation(model, "run");
runButton.setOnClickListener(v -> runAnimator.start());

Multiple Models with Multiple Animations

For a synchronized animation set, like animating a complete scene with multiple models at the same time or sequentially, please consider using an AnimatorSet with one ModelAnimator parametrized per step:

AnimatorSet completeFly = new AnimatorSet();

ObjectAnimator liftOff = ModelAnimator.ofAnimationFraction(airPlaneModel, "FlyAltitude", 0, 40);
liftOff.setInterpolator(new AccelerateInterpolator());

AnimatorSet flying = new AnimatorSet();
ObjectAnimator flyAround = ModelAnimator.ofAnimation(airPlaneModel, "FlyAround");
flyAround.setRepeatCount(ValueAnimator.INFINITE);
flyAround.setDuration(10000);
ObjectAnimator airportBusHome = ModelAnimator.ofAnimationFraction(busModel, "Move", 0);
flying.playTogether(flyAround, airportBusHome);

ObjectAnimator land = ModelAnimator.ofAnimationFraction(airPlaneModel, "FlyAltitude", 0);
land.setInterpolator(new DecelerateInterpolator());

completeFly.playSequentially(liftOff, flying, land);

Morphing animation

Assuming a character object has a skeleton, one keyframe track could store the data for the position changes of the lower arm bone over time, a different track the data for the rotation changes of the same bone, a third the position, rotation or scaling of another bone, and so on. It should be clear that a ModelAnimation can act on lots of such tracks.

Assuming the model has morph targets (for example one morph target showing a friendly face and another showing an angry face), each track holds the information as to how the influence of a certain morph target changes during the performance of the clip.

In a glTF context, this Animator (android.animation.Animator) updates matrices according to the glTF animation and skin definitions.

ModelAnimator can be used for two things

Every PropertyValuesHolder that modifies the time position of the animation must use ModelAnimation.TIME_POSITION instead of its own Property, so that any ObjectAnimator applying time modifications to the same ModelAnimation can be cancelled.
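
For illustration, a sketch of scrubbing the time position with a stock ObjectAnimator, assuming animation is a ModelAnimation obtained from the model (e.g. model.getAnimation(0), an assumed accessor) and that its duration is exposed as seconds:

// Drive an existing ModelAnimation from start to end over 3 seconds using the dedicated
// TIME_POSITION property, so concurrent time animators on the same animation can auto-cancel
val animator = ObjectAnimator.ofPropertyValuesHolder(
    animation,
    PropertyValuesHolder.ofFloat(ModelAnimation.TIME_POSITION, 0f, animation.duration)
)
animator.duration = 3000L
animator.setAutoCancel(true)
animator.start()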

more...

License

Please see the LICENSE file.

Brand Guidelines

The Sceneform trademark is a trademark of Google, and is not subject to the copyright or patent license grants contained in the Apache 2.0-licensed Sceneform repositories on GitHub. Any uses of the Sceneform trademark other than those permitted in these guidelines must be approved by Google in advance.

Purpose of the Brand Guidelines

These guidelines exist to ensure that the Sceneform project can share its technology under open source licenses while making sure that the "Sceneform" brand is protected as a meaningful source identifier in a way that's consistent with trademark law. By adhering to these guidelines, you help to promote the freedom to use and develop high-quality Sceneform technology.

Acceptable uses

Because we are open-sourcing the Sceneform technology, you may use the Sceneform trademark to refer to the project without prior written permission. Examples of these approved references include the following:

Examples:

Usage guidelines

Terms & Conditions

By downloading the Sceneform SDK for Android, you agree that the Google APIs Terms of Service governs your use thereof.

User privacy requirements

You must disclose the use of Google Play Services for AR (ARCore) and how it collects and processes data, prominently in your application and easily accessible to users. You can do this by adding the following text to your main menu or notice screen: "This application runs on Google Play Services for AR (ARCore), which is provided by Google LLC and governed by the Google Privacy Policy".
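
As an illustration only (wording, translation, and placement remain your responsibility), such a notice could be surfaced from a menu entry with a simple dialog:

// Show the required Google Play Services for AR (ARCore) data disclosure
AlertDialog.Builder(context)
    .setMessage(
        "This application runs on Google Play Services for AR (ARCore), which is provided " +
        "by Google LLC and governed by the Google Privacy Policy"
    )
    .setPositiveButton(android.R.string.ok, null)
    .show()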