This is my ongoing project about finding metamorphic relations in augmented reality applications. Here, I have implemented the following MRs.
Raycast within boundary
- Test Input: Place a spawn object
- Expected Output: The spawned object should be inside the boundary of the plane
- Transformation: As the AR camera moves, it detects and adds multiple planes. Spawn objects on these newly added planes
- Expected Transformed Output: The positions of the placed objects should remain consistent within the boundaries of the newly detected planes
Implementation-
- Use ARPlaneManager's trackables to get all the detected planes
- For each plane's transform, compute a dot product (the plane's normal against the spawned object's offset from the plane)
- Check whether the dot product is within tolerance (tolerance set to 0.1f)
- Repeat this check for every plane
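The steps above can be sketched engine-independently. This is a minimal Python stand-in for the Unity-side check, not the actual implementation: the plane centers/normals and helper names are assumptions, and in Unity the planes would come from ARPlaneManager's trackables.

```python
# Sketch of the dot-product tolerance check (illustrative, not the Unity code).
TOLERANCE = 0.1  # matches the 0.1f tolerance in the description

def offset_from_plane(obj_pos, plane_center, plane_normal):
    """Signed distance of the object from the plane along its normal,
    i.e. the dot product of the normal with (obj_pos - plane_center)."""
    diff = [o - c for o, c in zip(obj_pos, plane_center)]
    return sum(n * d for n, d in zip(plane_normal, diff))

def is_on_plane(obj_pos, plane_center, plane_normal, tolerance=TOLERANCE):
    return abs(offset_from_plane(obj_pos, plane_center, plane_normal)) <= tolerance

# Iterate over all detected planes, as the trackables loop would:
planes = [
    ((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)),   # horizontal plane at y = 0
    ((0.0, 0.5, 0.0), (0.0, 1.0, 0.0)),   # horizontal plane at y = 0.5
]
obj = (0.2, 0.05, -0.1)
results = [is_on_plane(obj, c, n) for c, n in planes]
```

The dot product here measures how far the object sits off the plane; an in-boundary check along the plane's extents would be an additional 2D test on the projected position.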
Check other object-
- Collision with game object: simulate a touch event where there is an existing gameobject
- Test Input: Touch the screen; a gameobject should be instantiated
- Transformation: After spawning a gameobject, any new spawn should first look for existing objects
- Expected Transformed Output: If there is an existing gameobject at that spot, the new one should not be instantiated.
Implementation-
- Used Unity's layer masking technique
- Defined a tolerance level and cast a ray
- If the ray hits the layer of another spawned object rather than the plane's layer, the placements overlap
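A rough Python stand-in for this layer-based decision (the real check uses Unity's Physics.Raycast with a LayerMask; the fake `raycast` and layer names below are assumptions for illustration):

```python
# Sketch of the overlap decision: spawn only when the ray hits the plane layer.
PLANE_LAYER = "ARPlane"
SPAWN_LAYER = "SpawnedObject"

def raycast(touch_point, scene):
    """Return the layer of the object under the touch point, or None.
    `scene` maps 2D touch positions to layers; a stand-in for a physics raycast."""
    return scene.get(touch_point)

def can_spawn(touch_point, scene):
    hit_layer = raycast(touch_point, scene)
    # Hitting another spawned object's layer (or nothing) means no new spawn;
    # only a hit on the plane layer allows instantiation.
    return hit_layer == PLANE_LAYER

scene = {(1, 2): PLANE_LAYER, (3, 4): SPAWN_LAYER}
```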
Visibility and Occlusion:
- Test Input: Place an object partially behind a real-world object.
- Expected Output: Part of the virtual object is occluded by the real-world object.
- Transformation: Move the camera to view the object from different angles.
- Expected Transformed Output: The occlusion should adjust correctly as the perspective changes.
- Implementation: Use depth sensing and spatial mapping to understand the environment and manage occlusion realistically.
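The depth-based occlusion idea can be sketched as a per-pixel comparison: a virtual fragment is hidden wherever the sensed real-world depth is closer than the fragment. The depth maps below are made-up values, not output from an actual depth sensor:

```python
# Sketch of depth-based occlusion (illustrative values, not sensor data).
def is_occluded(virtual_depth, real_depth):
    """True if the real-world surface is in front of the virtual fragment."""
    return real_depth < virtual_depth

def occlusion_mask(virtual_depths, real_depths):
    return [[is_occluded(v, r) for v, r in zip(vrow, rrow)]
            for vrow, rrow in zip(virtual_depths, real_depths)]

virtual = [[2.0, 2.0], [2.0, 2.0]]   # virtual object 2 m away at every pixel
sensed  = [[1.5, 3.0], [3.0, 3.0]]   # a real obstacle at 1.5 m covers one pixel
mask = occlusion_mask(virtual, sensed)
```

Moving the camera changes both depth maps, so the mask, and hence the occlusion, adjusts with the perspective.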
Object Scaling with Distance:
- Test Input: Place a spawn object at a known distance.
- Expected Output: The object appears at a size inversely proportional to its distance.
- Transformation: Change the position of the camera, moving closer to or further from the object.
- Expected Transformed Output: The object should scale up or down depending on the camera's distance, maintaining a realistic appearance.
- Implementation: Calculate the distance between the camera and the object and adjust the object's scale accordingly.
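A minimal sketch of the distance relation this MR relies on: for a fixed world-space size, the apparent (on-screen) size falls off as 1/distance. The positions and sizes are example values, not from the project:

```python
# Sketch: apparent size is inversely proportional to camera-object distance.
import math

def distance(cam_pos, obj_pos):
    return math.dist(cam_pos, obj_pos)

def apparent_size(world_size, cam_pos, obj_pos):
    return world_size / distance(cam_pos, obj_pos)

cam_near = (0.0, 0.0, 0.0)
cam_far = (0.0, 0.0, -2.0)           # transformed input: camera moved back
obj = (0.0, 0.0, 1.0)
near = apparent_size(0.5, cam_near, obj)   # object seen from 1 m
far = apparent_size(0.5, cam_far, obj)     # same object seen from 3 m
```

The metamorphic oracle is the ratio: tripling the distance should shrink the apparent size to one third, within a tolerance.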
Orientation Consistency Relative to Gravity:
- Test Input: Place an object with a specific orientation relative to gravity.
- Expected Output: Object maintains its orientation (e.g., upright) relative to gravity.
- Transformation: Rotate or tilt the AR device.
- Expected Transformed Output: The object should adjust its orientation to maintain consistency relative to gravity.
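One way to express this oracle is a dot-product alignment test: the object's up vector should stay (nearly) opposite to the gravity direction after the device is tilted. Vectors and the tolerance below are assumptions for illustration:

```python
# Sketch of the gravity-alignment check (illustrative vectors).
TOLERANCE = 0.1

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def upright_relative_to_gravity(object_up, gravity, tolerance=TOLERANCE):
    """Object counts as upright when its up vector is (nearly) opposite
    the gravity direction, i.e. dot(up, -gravity) is close to 1."""
    neg_g = tuple(-g for g in gravity)
    return abs(dot(object_up, neg_g) - 1.0) <= tolerance

gravity = (0.0, -1.0, 0.0)   # unit gravity direction
up_ok = (0.0, 1.0, 0.0)      # still upright after the device tilt
up_bad = (1.0, 0.0, 0.0)     # tipped over: orientation not maintained
```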
Detected MR (virtual physics and interaction)
- Varying rotation at constant speed-
- Test Input: Simulate a rotation to evaluate whether it moves at a constant rate
- Expected Output: The rotation angle per step should be the same
- Transformation: Change the initial angle and the speed
- Expected Transformed Output: If the rotation is constant, the per-step rotation angle should stay within a specific threshold.
Check correct object instantiation-
- Input: Touch on screen
- Expected Output: The object targeted by the touch and the instantiated object should be the same
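A tiny stand-in for this oracle: whatever prefab the touch selects must be the one that actually gets instantiated. The touch-to-prefab mapping below is entirely hypothetical, not the project's selection logic:

```python
# Sketch of the instantiation-correctness oracle (hypothetical mapping).
def prefab_for_touch(touch):
    # Stand-in selection logic, e.g. which screen region was touched.
    return "Cube" if touch[0] < 100 else "Sphere"

def matches_touch(touch, spawned_name):
    """Oracle: the spawned object must match the prefab the touch selects."""
    return spawned_name == prefab_for_touch(touch)
```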