An example project showing how to extract anchor geometry from ARMeshAnchor, create a new custom mesh, and color it per face in RealityKit.
Last year, ARKit 3.5 introduced the ability to use the LiDAR sensor to understand more about the scene and its surroundings. At the time, RealityKit lacked procedural geometry, so if you wanted to visualize the scene understanding with custom models you had to use SceneKit. With RealityKit 2 (announced alongside ARKit 5), Apple has introduced a way to create procedural meshes, so we can now use that geometry to create custom model entities directly in RealityKit.
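As a minimal illustration of the procedural geometry API (a sketch only, not this project's code; the triangle data here is made up), a `MeshResource` can be generated from a `MeshDescriptor`:

```swift
import RealityKit

// Minimal sketch: build a one-triangle MeshResource with the
// MeshDescriptor API added in RealityKit 2 (iOS 15 beta).
var descriptor = MeshDescriptor(name: "triangle")
descriptor.positions = MeshBuffers.Positions([
    SIMD3<Float>(0, 0, 0),
    SIMD3<Float>(1, 0, 0),
    SIMD3<Float>(0, 1, 0)
])
// Three vertex indices per triangle face.
descriptor.primitives = .triangles([0, 1, 2])

// generate(from:) throws if the descriptor is invalid.
let mesh = try MeshResource.generate(from: [descriptor])
let model = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
```

The same pattern scales up to the thousands of vertices and faces an ARMeshAnchor provides.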
This project takes the ARMeshAnchor geometry produced by LiDAR scene reconstruction and rebuilds it with the RealityKit MeshBuffers API, new in the iOS 15 beta. It uses the vertices, faces, normals, and classification IDs to display a new mesh anchored at the same position. Each face is then colored according to its classification ID; the example app lets you change the color and blending mode for each classification.
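One way to color per face (a hedged sketch; `groupFaces` and its parameter names are assumptions for illustration, not this project's actual code) is to bucket each face's vertex indices by the face's classification ID, then build one submesh and material per bucket:

```swift
// Group mesh faces by their per-face classification ID so each group
// can become its own MeshDescriptor with a distinct material.
// `faceIndices` holds three vertex indices per face;
// `classificationIDs` holds one raw classification value per face.
func groupFaces(faceIndices: [UInt32],
                classificationIDs: [UInt8]) -> [UInt8: [UInt32]] {
    var groups: [UInt8: [UInt32]] = [:]
    for face in 0..<classificationIDs.count {
        let start = face * 3
        groups[classificationIDs[face], default: []]
            .append(contentsOf: faceIndices[start..<start + 3])
    }
    return groups
}

// Two faces sharing an edge, with different classifications.
let grouped = groupFaces(faceIndices: [0, 1, 2, 2, 3, 0],
                         classificationIDs: [1, 2])
```

Each bucket's index list can then feed a `MeshDescriptor` whose material color is looked up from the classification, which is how the per-classification coloring described above could be wired up.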
Visualizing the ARMeshAnchor geometry.
Changing colors of the classifications.
Max Cobb - Getting Started with RealityKit: Procedural Geometries.