ARCore sample project that visualizes the human heart on a marker and lets you freely place and transform nerve cells.
This project extends the samples that ship with the Google ARCore for Unity SDK. It combines multiple scenarios:
- Augmented Images (marker tracking): allows creating a 1:1 relationship between a marker and its associated 3D model / prefab. Only instantiates every object once in the world.
  See script: MarkerAnchorController.cs
  Difference to the original Google example: uses the same prefab for every augmented image in the database, sets up the correct object hierarchy in Unity (Anchor -> Prefab), and deletes the item as well as its anchor when tracking is lost. See the first sketch after this list.
- Plane Anchoring: tap on a detected plane to position an item. Not activated by default in the scene.
  See script: PlaneAnchorController.cs
  Difference to the original Google example: checks if the plane has been lost and cleans up the anchor as well as the instantiated prefab (see the sketch after this list).
- Plane Anchoring + Manipulation: tap on a detected plane to position an item, with the manipulation system in place. Allows moving, rotating and scaling objects that have been placed in the scene (see the sketch after this list).
  See script: ManipulatorController.cs
  Difference to the original Google example: checks if the plane has been lost and cleans up the anchor as well as the instantiated prefab.
- Lifecycle: takes care of permissions, screen timeout and quitting the app (see the sketch after this list).
  See script: LifecycleController.cs
  Difference to the original Google example: taken from the SDK's HelloAR example, but these tasks are better kept in a separate script instead of being integrated into a controller that also handles user interaction.
- Autofocus Controller: turn autofocus on / off while the app is running with a two-finger tap on the screen (see the sketch after this list). Google currently still recommends the default setting, which is autofocus off; with this script, you can explore the differences. If the phone camera doesn't support autofocus, this setting is ignored.
  See script: AutofocusController.cs
  Difference to the original Google example: starting with the ARCore for Unity SDK 1.8, Google enables autofocus by default for augmented images.
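A minimal sketch of the marker-tracking flow in MarkerAnchorController.cs, based on the GoogleARCore Unity SDK's Session.GetTrackables and AugmentedImage APIs; the dictionary and member names are illustrative, not the actual implementation:

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class MarkerAnchorController : MonoBehaviour
{
    // Prefab to instantiate once per tracked augmented image.
    public GameObject ArPrefab;

    // Maps the database index of an augmented image to its anchored instance.
    private readonly Dictionary<int, GameObject> _instances = new Dictionary<int, GameObject>();
    private readonly List<AugmentedImage> _updatedImages = new List<AugmentedImage>();

    private void Update()
    {
        // Query all augmented images that changed in this frame.
        Session.GetTrackables<AugmentedImage>(_updatedImages, TrackableQueryFilter.Updated);

        foreach (var image in _updatedImages)
        {
            if (image.TrackingState == TrackingState.Tracking &&
                !_instances.ContainsKey(image.DatabaseIndex))
            {
                // Anchor the prefab to the marker center (hierarchy: Anchor -> Prefab).
                var anchor = image.CreateAnchor(image.CenterPose);
                _instances.Add(image.DatabaseIndex, Instantiate(ArPrefab, anchor.transform));
            }
            else if (image.TrackingState == TrackingState.Stopped &&
                     _instances.ContainsKey(image.DatabaseIndex))
            {
                // Tracking lost: delete the anchor (parent) together with the prefab instance.
                Destroy(_instances[image.DatabaseIndex].transform.parent.gameObject);
                _instances.Remove(image.DatabaseIndex);
            }
        }
    }
}
```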
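PlaneAnchorController.cs follows the tap-to-place pattern of the SDK's HelloAR example, extended with the cleanup described above. A sketch, with the bookkeeping dictionary as an assumption:

```csharp
using System.Collections.Generic;
using System.Linq;
using GoogleARCore;
using UnityEngine;

public class PlaneAnchorController : MonoBehaviour
{
    // Prefab to instantiate on a tapped plane.
    public GameObject PlacedPrefab;

    // Remembers which plane each anchor was created on, for cleanup.
    private readonly Dictionary<Anchor, DetectedPlane> _anchors = new Dictionary<Anchor, DetectedPlane>();

    private void Update()
    {
        Touch touch;
        if (Input.touchCount == 1 && (touch = Input.GetTouch(0)).phase == TouchPhase.Began)
        {
            TrackableHit hit;
            if (Frame.Raycast(touch.position.x, touch.position.y,
                    TrackableHitFlags.PlaneWithinPolygon, out hit) &&
                hit.Trackable is DetectedPlane)
            {
                // Hierarchy: Anchor -> Prefab, as in the HelloAR example.
                var anchor = hit.Trackable.CreateAnchor(hit.Pose);
                var instance = Instantiate(PlacedPrefab, hit.Pose.position, hit.Pose.rotation);
                instance.transform.parent = anchor.transform;
                _anchors.Add(anchor, (DetectedPlane)hit.Trackable);
            }
        }

        // Clean up anchors (and their child prefabs) whose plane the SDK has given up.
        var lostAnchors = _anchors.Where(e => e.Value.TrackingState == TrackingState.Stopped)
                                  .Select(e => e.Key).ToList();
        foreach (var anchor in lostAnchors)
        {
            _anchors.Remove(anchor);
            Destroy(anchor.gameObject); // also destroys the anchored prefab instance
        }
    }
}
```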
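The manipulation variant builds on the SDK's ObjectManipulation example. The sketch below assumes that example's Manipulator base class, its TapGesture type and a separate manipulator prefab that wraps each placed object:

```csharp
using GoogleARCore;
using GoogleARCore.Examples.ObjectManipulation;
using UnityEngine;

public class ManipulatorController : Manipulator
{
    public GameObject ArPrefab;           // the object to place
    public GameObject ManipulatorPrefab;  // makes the object movable, rotatable and scalable

    protected override bool CanStartManipulationForGesture(TapGesture gesture)
    {
        // Only place a new object when the user taps empty space.
        return gesture.TargetObject == null;
    }

    protected override void OnEndManipulation(TapGesture gesture)
    {
        if (gesture.WasCancelled || gesture.TargetObject != null)
        {
            return;
        }

        TrackableHit hit;
        if (Frame.Raycast(gesture.StartPosition.x, gesture.StartPosition.y,
                TrackableHitFlags.PlaneWithinPolygon, out hit))
        {
            // Hierarchy: Anchor -> Manipulator -> Prefab.
            var instance = Instantiate(ArPrefab, hit.Pose.position, hit.Pose.rotation);
            var manipulator = Instantiate(ManipulatorPrefab, hit.Pose.position, hit.Pose.rotation);
            instance.transform.parent = manipulator.transform;

            var anchor = hit.Trackable.CreateAnchor(hit.Pose);
            manipulator.transform.parent = anchor.transform;

            // Select the new object so subsequent gestures manipulate it.
            manipulator.GetComponent<Manipulator>().Select();
        }
    }
}
```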
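LifecycleController.cs mirrors the lifecycle handling of the HelloAR example. Roughly:

```csharp
using GoogleARCore;
using UnityEngine;

public class LifecycleController : MonoBehaviour
{
    private bool _isQuitting;

    private void Update()
    {
        // Quit via the Android back button.
        if (Input.GetKey(KeyCode.Escape))
        {
            Application.Quit();
        }

        // Only keep the screen awake while ARCore is actively tracking.
        Screen.sleepTimeout = Session.Status == SessionStatus.Tracking
            ? SleepTimeout.NeverSleep
            : SleepTimeout.SystemSetting;

        if (_isQuitting)
        {
            return;
        }

        // Quit if the camera permission was denied or the session hit an error.
        if (Session.Status == SessionStatus.ErrorPermissionNotGranted || Session.Status.IsError())
        {
            _isQuitting = true;
            Invoke("DoQuit", 0.5f);
        }
    }

    private void DoQuit()
    {
        Application.Quit();
    }
}
```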
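The autofocus toggle can be pictured like this. It assumes the ARCoreSessionConfig asset used by the ARCore Device is assigned in the Inspector, and that the SDK 1.8+ CameraFocusMode enum has the values Fixed and Auto; check this against the SDK version you use:

```csharp
using GoogleARCore;
using UnityEngine;

public class AutofocusController : MonoBehaviour
{
    // Session config used by the ARCore Device in the scene; assign in the Inspector.
    public ARCoreSessionConfig SessionConfig;

    private void Update()
    {
        // Toggle autofocus when two fingers touch down in the same frame.
        if (Input.touchCount == 2 &&
            Input.GetTouch(0).phase == TouchPhase.Began &&
            Input.GetTouch(1).phase == TouchPhase.Began)
        {
            SessionConfig.CameraFocusMode =
                SessionConfig.CameraFocusMode == CameraFocusMode.Auto
                    ? CameraFocusMode.Fixed
                    : CameraFocusMode.Auto;
        }
    }
}
```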
Further features and notes:
- Augmented Images: the two pictures to use as markers are in the /Assets/Images/ folder. For the best performance, print them on paper; for quick testing, showing the markers on your computer screen works as well.
- ArPrefab: should be part of every AR prefab instantiated in the scene. Stores a reference to the trackable the prefab instance is connected to as a property. Allows quickly checking which trackables have been given up by the ARCore SDK, to properly clean up the game objects in the scene (see the sketch after this list).
- TrackingCheck: regularly checks instantiated prefabs to see whether the associated ARCore trackable reports that tracking has been lost. Automatically covers all controller scripts that implement the IArObjectController interface; this simple interface allows querying the instantiated prefabs (see the sketch after this list).
- Light Estimation: enabled in this scene.
- Screen Auto-Rotation: disabled, as this gives a better user experience on Android and avoids screen flickering while turning the device. Auto-rotation would mainly be needed for static textual content that is not anchored to a plane or an augmented image, or that is otherwise placed in screen space instead of world space.
- GLB / GLTF Import: the 3D models of the human heart and nerve cells were imported from GLB files. This format is not yet natively supported by Unity; the models were imported through the SketchFab unitypackage.
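The ArPrefab component itself can be as small as the following sketch; the property name is an assumption:

```csharp
using GoogleARCore;
using UnityEngine;

// Attached to every AR prefab: links the instance to the trackable it belongs to.
public class ArPrefab : MonoBehaviour
{
    // Set by the controller right after instantiating the prefab.
    public Trackable AttachedTrackable { get; set; }
}
```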
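IArObjectController and the TrackingCheck loop could then look like this sketch; the member names are assumptions, and it presumes all controllers sit on the same GameObject:

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

// Implemented by every controller that instantiates AR prefabs.
public interface IArObjectController
{
    // All prefab instances the controller currently manages.
    IEnumerable<ArPrefab> GetArPrefabs();
}

public class TrackingCheck : MonoBehaviour
{
    private IArObjectController[] _controllers;

    private void Awake()
    {
        // Find all controllers attached to the same GameObject.
        _controllers = GetComponents<IArObjectController>();
    }

    private void Update()
    {
        foreach (var controller in _controllers)
        {
            foreach (var arPrefab in controller.GetArPrefabs())
            {
                // If the SDK gave up the trackable, remove the game object.
                // Destroy is deferred to the end of the frame, so iterating stays safe.
                if (arPrefab != null &&
                    arPrefab.AttachedTrackable != null &&
                    arPrefab.AttachedTrackable.TrackingState == TrackingState.Stopped)
                {
                    Destroy(arPrefab.gameObject);
                }
            }
        }
    }
}
```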
The 3D models used in this scene were created by Microsoft and downloaded from Remix3D. Update: the Remix3D platform is no longer available, but you can still find the models through the 3D model section of Office 365. To avoid dead links, I've removed the links to the original source from the readme.
Released under the MIT license - see the LICENSE file for details.
Developed by Andreas Jakl