Your First AR App with SceneView
Build an AR app where you tap the floor to place a 3D model in the real world. The model persists, can be moved by tapping again, and scales and rotates with gestures – all in ~50 lines of Jetpack Compose.
- AR camera view with plane detection
- Tap-to-place a 3D model on any flat surface
- Pinch-to-scale and two-finger-rotate gestures
- Visual plane indicator (animated reticle)
Create a new project
In Android Studio, create a new Empty Activity project:
- Language: Kotlin
- Minimum SDK: API 24 (Android 7.0)
- Build configuration: Kotlin DSL
Make sure Jetpack Compose is enabled (it is by default in Empty Activity).
Add the SceneView dependency
In app/build.gradle.kts, add the AR dependency:
// app/build.gradle.kts
dependencies {
    implementation("io.github.sceneview:arsceneview:3.2.0")
}
In app/src/main/AndroidManifest.xml, add AR permissions and metadata:
<!-- AndroidManifest.xml -->
<manifest>
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature
        android:name="android.hardware.camera.ar"
        android:required="true" />

    <application>
        <!-- Required: tells the Play Store this app requires ARCore -->
        <meta-data
            android:name="com.google.ar.core"
            android:value="required" />
        ...
    </application>
</manifest>
Use arsceneview for AR apps. For 3D-only rendering (no camera), use sceneview instead.
Add your 3D model
Download a free .glb model from one of these sources:
- Sketchfab – thousands of free Creative Commons models
- Poly Pizza – low-poly models, great for AR
- Khronos glTF samples – PBR showcase models
Place the file at:
app/src/main/assets/models/object.glb
Write the AR screen
Create a new file ARPlacementScreen.kt in your package:
package com.example.myarapp

import androidx.compose.foundation.layout.*
import androidx.compose.material3.Text
import androidx.compose.runtime.*
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.unit.dp
import androidx.compose.ui.unit.sp
import io.github.sceneview.ar.ARScene
import io.github.sceneview.ar.node.AnchorNode
import io.github.sceneview.node.ModelNode
import io.github.sceneview.rememberEngine
import io.github.sceneview.rememberModelInstance
import io.github.sceneview.rememberModelLoader

@Composable
fun ARPlacementScreen() {
    val engine = rememberEngine()
    val modelLoader = rememberModelLoader(engine)

    // The anchor node where the model is placed
    var anchorNode by remember { mutableStateOf<AnchorNode?>(null) }

    // Load the model (returns null while loading)
    val modelInstance = rememberModelInstance(modelLoader, "models/object.glb")

    // Hint text – disappears once the model is placed
    val hint = if (anchorNode == null)
        "Move your phone to detect surfaces, then tap to place"
    else ""

    Box(modifier = Modifier.fillMaxSize()) {
        ARScene(
            modifier = Modifier.fillMaxSize(),
            engine = engine,
            modelLoader = modelLoader,
            planeRenderer = true, // Animated plane grid
            onSingleTapConfirmed = { hitResult ->
                // Destroy the previous anchor (replace on tap)
                anchorNode?.destroy()
                // Create a new anchor at the tapped position
                anchorNode = AnchorNode(
                    engine = engine,
                    anchor = hitResult.createAnchor()
                ).apply {
                    isEditable = true // pinch-scale + two-finger-rotate
                }
            }
        ) {
            // Place the model at the anchor when both are ready
            anchorNode?.let { anchor ->
                modelInstance?.let { instance ->
                    ModelNode(
                        modelInstance = instance,
                        scaleToUnits = 0.5f, // Scale to 50 cm
                        autoAnimate = true,
                        animationLoop = true
                    )
                }
            }
        }

        // Hint text overlay
        if (hint.isNotEmpty()) {
            Text(
                text = hint,
                color = Color.White,
                fontSize = 14.sp,
                modifier = Modifier
                    .align(Alignment.BottomCenter)
                    .padding(bottom = 48.dp)
            )
        }
    }
}
Set it as the main screen
In MainActivity.kt, set ARPlacementScreen as the content:
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.material3.MaterialTheme

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            MaterialTheme {
                ARPlacementScreen()
            }
        }
    }
}
Run the app
Connect a physical Android device – AR needs a real camera, and standard emulators don't support ARCore.
- Enable Developer Options and USB Debugging on your device
- Click Run ▶ in Android Studio
- Point your phone at a floor or table
- Move slowly to detect the surface – an animated grid appears
- Tap anywhere on the grid to place the model
- Pinch to scale · two-finger drag to rotate
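If you want to fail gracefully on devices without AR support, you can check ARCore availability before showing the AR screen. A minimal sketch using Google's ArCoreApk API (the helper name `isArReady` is ours; real apps should also handle the transient "checking" states and offer to install ARCore via `requestInstall`):

```kotlin
import android.app.Activity
import com.google.ar.core.ArCoreApk

// Returns true only when ARCore is installed and up to date.
// checkAvailability() can return a transient UNKNOWN_CHECKING state
// while it queries the Play Store, so re-checking shortly after is
// recommended in production code.
fun isArReady(activity: Activity): Boolean {
    val availability = ArCoreApk.getInstance().checkAvailability(activity)
    return availability == ArCoreApk.Availability.SUPPORTED_INSTALLED
}
```

Gate your navigation on this check and show a fallback screen (or the 3D-only sceneview renderer) when it returns false.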
How it works
ARScene { } composable
The ARScene composable handles everything automatically:
- Camera permission request
- ARCore session initialization
- Surface tracking and plane detection
- The camera feed as background
You provide the onSingleTapConfirmed callback and the content block. ARCore does the rest.
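If the defaults aren't enough, recent SceneView versions also let you mutate the underlying ARCore session configuration. A sketch – treat the `sessionConfiguration` parameter name as an assumption to verify against the SceneView version you depend on; the `Config` calls themselves are standard ARCore API:

```kotlin
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.ui.Modifier
import com.google.ar.core.Config
import io.github.sceneview.ar.ARScene

// Sketch: customizing the ARCore Config before ARScene applies it.
ARScene(
    modifier = Modifier.fillMaxSize(),
    engine = engine,
    modelLoader = modelLoader,
    sessionConfiguration = { session, config ->
        // Depth-based occlusion, only where the device supports it
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            config.depthMode = Config.DepthMode.AUTOMATIC
        }
        // Realistic lighting on the placed model
        config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    }
)
```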
AnchorNode
An AnchorNode locks a coordinate to a real-world position.
As ARCore refines its understanding of the scene, the anchor updates automatically,
keeping your model locked to the surface even as you move.
Setting isEditable = true enables:
- Pinch → scale the model
- Two-finger drag → rotate around the Y axis
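Gestures aren't the only way to transform the node – you can also set the transform programmatically, e.g. for a "reset" button. A sketch, assuming SceneView's math types (`Scale` and `Rotation` are vector typealiases in `io.github.sceneview.math`):

```kotlin
import io.github.sceneview.math.Rotation
import io.github.sceneview.math.Scale
import io.github.sceneview.node.ModelNode

// Sketch: resetting a gesture-edited node to a known transform.
// Node exposes scale and rotation properties directly in SceneView.
fun resetTransform(node: ModelNode) {
    node.scale = Scale(0.5f)          // uniform scale back to ~50 cm
    node.rotation = Rotation(y = 0f)  // degrees around the Y axis
}
```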
rememberModelInstance
rememberModelInstance loads the GLB file asynchronously on the main thread
(required by Filament). It returns null while loading. The ?.let { }
null check means nothing renders until loading is complete – no crash, no flash.
remember* helpers handle this correctly – always use them in composables.
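For intuition, the same load can be written by hand inside a composable – rememberModelInstance is roughly this plus automatic cleanup. A sketch, assuming ModelLoader exposes a suspending `loadModelInstance` (check the signature in your SceneView version):

```kotlin
// Sketch: manual equivalent of rememberModelInstance.
var modelInstance by remember { mutableStateOf<ModelInstance?>(null) }

// Re-runs whenever the key (the asset path) changes.
// LaunchedEffect runs on the main dispatcher, which is
// what Filament requires for model creation.
LaunchedEffect("models/object.glb") {
    modelInstance = modelLoader.loadModelInstance("models/object.glb")
}
```

The remember* helper is still the better choice in real code; this is only to show what it saves you from writing.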
Add a "Remove" button
Let users delete the placed model. Add this inside your Box, below the ARScene (Button comes from androidx.compose.material3):
if (anchorNode != null) {
    Button(
        onClick = {
            anchorNode?.destroy()
            anchorNode = null
        },
        modifier = Modifier
            .align(Alignment.TopEnd)
            .padding(16.dp)
    ) {
        Text("Remove")
    }
}
The button only appears after a model is placed – the if (anchorNode != null) check
drives the UI reactively from state, just like any other Compose conditional.
Model switching (optional)
Let users choose from multiple models with FilterChip selectors:
val models = listOf(
    "Chair" to "models/chair.glb",
    "Car" to "models/car.glb"
)
var selectedModel by remember { mutableStateOf("models/chair.glb") }
val modelInstance = rememberModelInstance(modelLoader, selectedModel)

// In your Box, add a chip row at the bottom:
Row(
    modifier = Modifier
        .align(Alignment.BottomCenter)
        .padding(bottom = 32.dp)
) {
    models.forEach { (label, path) ->
        FilterChip(
            selected = selectedModel == path,
            onClick = { selectedModel = path },
            label = { Text(label) }
        )
    }
}
When selectedModel changes, rememberModelInstance reloads automatically.
The old model is cleaned up. Zero manual lifecycle management.
Summary
You built a tap-to-place AR app in ~50 lines of Compose. Here's what you used:
| Component | Purpose |
|---|---|
| ARScene { } | AR camera + plane detection, all managed automatically |
| AnchorNode | Lock 3D content to a real-world surface |
| ModelNode | Render the GLB model at the anchor position |
| rememberModelInstance | Async model loading with automatic cleanup |
| isEditable = true | Pinch-to-scale + two-finger-rotate gestures |
Next steps
- ar-model-viewer sample – better gesture UX, plane mesh visualization
- Cloud Anchors – share AR scenes between devices
- TextNode & BillboardNode – add 3D labels to your AR scene
- All 14 sample apps – explore every feature