AR Features Showcase

Everything built into SceneView — from basic 3D rendering to advanced augmented reality. Click any video to watch it on YouTube.

Core

3D Model Viewer

Load and render glTF/GLB 3D models with physically based rendering (PBR) materials powered by Google Filament. Supports full gesture control: pinch to zoom, rotate, and pan. Compatible with both Jetpack Compose and classic Android Views.

glTF / GLB · PBR Materials · Gesture Control · Compose Support
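A minimal Compose sketch of the viewer described above. It assumes SceneView's recent Compose API (`Scene`, `rememberEngine`, `rememberModelLoader`, `rememberNodes` from `io.github.sceneview:sceneview`); exact names can differ between library versions, and the asset path is a placeholder.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import io.github.sceneview.Scene
import io.github.sceneview.node.ModelNode
import io.github.sceneview.rememberEngine
import io.github.sceneview.rememberModelLoader
import io.github.sceneview.rememberNodes

@Composable
fun ModelViewer() {
    val engine = rememberEngine()
    val modelLoader = rememberModelLoader(engine)
    Scene(
        modifier = Modifier,
        engine = engine,
        modelLoader = modelLoader,
        childNodes = rememberNodes {
            // Load a .glb from app assets (hypothetical path); pinch, rotate,
            // and pan gestures are handled by the Scene composable itself.
            add(
                ModelNode(
                    modelInstance = modelLoader.createModelInstance("models/helmet.glb"),
                    scaleToUnits = 1.0f // normalize the model to ~1 meter
                )
            )
        }
    )
}
```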
ARCore

Augmented Images

Detect and track real-world 2D images — QR codes, book covers, logos, posters — and anchor persistent 3D content precisely on them in real time using ARCore's image recognition pipeline.

Image Detection · ARCore · Real-time Tracking
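Under the hood this uses ARCore's standard Augmented Images API. A sketch of the setup, where the image name, bitmap, and physical width are placeholders:

```kotlin
import android.graphics.Bitmap
import com.google.ar.core.AugmentedImage
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun enableImageTracking(session: Session, posterBitmap: Bitmap) {
    val config = Config(session)
    config.augmentedImageDatabase = AugmentedImageDatabase(session).apply {
        // Supplying the physical width (meters) speeds up pose estimation.
        addImage("poster", posterBitmap, 0.30f)
    }
    session.configure(config)
}

fun onFrame(frame: Frame) {
    // Images whose tracking state changed this frame.
    for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
        if (image.trackingState == TrackingState.TRACKING) {
            // Anchor 3D content at the center of the detected image.
            val anchor = image.createAnchor(image.centerPose)
            // Attach a SceneView node to `anchor` here.
        }
    }
}
```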
ARCore

Augmented Faces

Track a 468-point face mesh in real time using ARCore's Augmented Faces API. Apply 3D masks, accessories, make-up overlays, or custom expressions directly to a user's face.

468 Landmarks · Face Mesh · AR Filters
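A sketch of the underlying ARCore calls (face tracking additionally requires a front-camera session configuration, omitted here):

```kotlin
import com.google.ar.core.AugmentedFace
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun enableFaceTracking(session: Session) {
    val config = Config(session)
    config.augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
    session.configure(config)
}

fun onFrame(session: Session) {
    for (face in session.getAllTrackables(AugmentedFace::class.java)) {
        if (face.trackingState == TrackingState.TRACKING) {
            // FloatBuffer of 468 (x, y, z) vertices in the face's local space.
            val vertices = face.meshVertices
            // Named region poses for attaching accessories.
            val nosePose = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP)
            // Position mask geometry from face.centerPose, vertices, and region poses.
        }
    }
}
```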
Multiplayer

Cloud Anchors

Host AR anchors in the cloud and share them across devices in real time. Build persistent multi-user AR experiences in which all participants see the same 3D content at the same physical location, powered by the Google ARCore Cloud Anchors API.

Multi-user · Persistent · Google Cloud
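The host/resolve flow can be sketched with ARCore's async Cloud Anchor calls (available in recent ARCore releases; how the ID is shared between peers, e.g. via your own backend, is up to the app):

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

fun enableCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Host: upload a local anchor; share the returned ID with other users.
fun host(session: Session, localAnchor: Anchor) {
    session.hostCloudAnchorAsync(localAnchor, /* ttlDays = */ 1) { cloudAnchorId, state ->
        if (state == Anchor.CloudAnchorState.SUCCESS) {
            // Send cloudAnchorId to peers over your own networking layer.
        }
    }
}

// Resolve: recreate the same anchor on another device from the shared ID.
fun resolve(session: Session, cloudAnchorId: String) {
    session.resolveCloudAnchorAsync(cloudAnchorId) { anchor, state ->
        if (state == Anchor.CloudAnchorState.SUCCESS) {
            // Attach the shared 3D content to `anchor`.
        }
    }
}
```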
Depth Sensor

Depth API

Access per-pixel depth maps from supported devices to enable photorealistic object occlusion, accurate surface detection, and precise measurement. Virtual objects realistically hide behind real-world surfaces.

Depth Map · Occlusion · Measurement
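A sketch of enabling depth and sampling one depth value with ARCore's Depth API. This simplifies buffer handling (it ignores row stride) and assumes the DEPTH16 pixel layout, where the low 13 bits hold depth in millimeters:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

fun enableDepth(session: Session) {
    val config = Config(session)
    // Not all devices provide depth; always check support first.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

fun depthAtCenterMillimeters(frame: Frame): Int? =
    try {
        frame.acquireDepthImage16Bits().use { image ->
            // DEPTH16: 16 bits per pixel; low 13 bits = depth in mm,
            // high 3 bits = confidence.
            val buffer = image.planes[0].buffer.asShortBuffer()
            val center = (image.height / 2) * image.width + image.width / 2
            buffer.get(center).toInt() and 0x1FFF
        }
    } catch (e: NotYetAvailableException) {
        null // Depth is unavailable for the first few frames of a session.
    }
```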
Rendering

Environment Lights

SceneView automatically estimates the real-world lighting conditions from the camera feed and applies them to your 3D objects via image-based lighting (IBL). Virtual objects blend seamlessly into any environment.

IBL · HDR · Auto-Estimation
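SceneView applies the estimate for you, but the underlying ARCore Environmental HDR data it consumes looks roughly like this sketch:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate
import com.google.ar.core.Session

fun enableLightEstimation(session: Session) {
    val config = Config(session)
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    session.configure(config)
}

fun onFrame(frame: Frame) {
    val estimate = frame.lightEstimate
    if (estimate.state == LightEstimate.State.VALID) {
        // Main directional light: direction (float[3]) and linear RGB intensity.
        val direction = estimate.environmentalHdrMainLightDirection
        val intensity = estimate.environmentalHdrMainLightIntensity
        // Ambient spherical harmonics coefficients for image-based lighting.
        val harmonics = estimate.environmentalHdrAmbientSphericalHarmonics
        // SceneView feeds these into Filament's lights automatically; shown
        // here only to illustrate the data behind the auto-estimation.
    }
}
```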