According to Apple, “The basic requirement for any AR experience—and the defining feature of ARKit—is the ability to create and track a correspondence between the real-world space the user inhabits and a virtual space where you can model visual content. When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world.”
At TestGrid.io, we love innovation and solving complex problems. Automated testing for ARKit applications is one such problem, and we are getting ready for it.
We started building an ARKit application to understand all the details that go into building an Augmented Reality application. If we can build one, we can better understand how to test one. See the gif below to see how the team is testing the ARKit application.
After having maybe too much fun stuck inside during the last hurricane, we learned some important things to test in any ARKit application.
1. Scene or Sprite object responsiveness
At its most basic, a sprite consists of a set of coordinates and a texture that is rendered to the canvas, or in this case, the image being captured by your camera. It is important that the objects are not too heavy for the device you are testing on and that the frame rate stays smooth.
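One lightweight way to watch responsiveness during manual testing is to overlay SceneKit's rendering statistics on the AR view. A minimal sketch (the class name is illustrative, not from the TestGrid app):

```swift
import ARKit

class SceneStatsViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(sceneView)

        // Overlay FPS, draw-call, and node counts so testers can spot
        // scenes that are too heavy for the device under test.
        sceneView.showsStatistics = true

        // Target 60 fps; older devices falling short of this is exactly
        // what a responsiveness test pass is meant to catch.
        sceneView.preferredFramesPerSecond = 60
    }
}
```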
2. Plane detection testing
Plane detection is a session option specifying whether and how ARKit attempts to automatically detect flat surfaces in the camera-captured image. To test this, you must make sure objects are actually placed on the detected plane.
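A sketch of what enabling plane detection looks like, with a delegate callback that logs each plane ARKit finds (useful for verifying that surfaces in your test environment are being picked up; the class name is illustrative):

```swift
import ARKit

class PlaneDetectionDelegate: NSObject, ARSCNViewDelegate {
    func startSession(on sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        // Ask ARKit to detect flat horizontal surfaces (tables, floors).
        configuration.planeDetection = .horizontal
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit anchors a newly detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected plane with extent: \(planeAnchor.extent)")
    }
}
```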
3. Anchor point testing
This can be tricky. Apple’s ARKit is pretty good at anchoring an object where it was dropped, but under changing ambient lighting and other environmental conditions, it can lose the anchor point.
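One way to make anchor stability measurable is to record each anchor's position when it is added, then log how far ARKit has moved it on later updates. A sketch, with the class name and the 5 cm drift threshold as illustrative assumptions:

```swift
import ARKit

class AnchorDriftObserver: NSObject, ARSessionDelegate {
    private var initialPositions: [UUID: simd_float3] = [:]

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            // The translation lives in the last column of the transform.
            initialPositions[anchor.identifier] = anchor.transform.columns.3.xyz
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let start = initialPositions[anchor.identifier] else { continue }
            let drift = simd_distance(start, anchor.transform.columns.3.xyz)
            // Under poor lighting, large drift here signals a lost anchor.
            if drift > 0.05 {
                print("Anchor \(anchor.identifier) drifted \(drift) m")
            }
        }
    }
}

private extension simd_float4 {
    var xyz: simd_float3 { simd_float3(x, y, z) }
}
```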
4. Illumination and Reflection testing
Illuminating a three-dimensional scene is not an easy task. Your AR object behaves differently under different lighting arrangements. Design it for predictable lighting conditions if possible, but make sure to test it thoroughly under different light settings.
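ARKit's built-in light estimation can help make these lighting test passes repeatable: log the estimated ambient intensity for each frame and you know which lighting regime a test run actually exercised. A sketch (function names are illustrative):

```swift
import ARKit

func runWithLightEstimation(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    // Ask ARKit to estimate scene lighting from the camera image.
    configuration.isLightEstimationEnabled = true
    sceneView.session.run(configuration)
}

func logLighting(for frame: ARFrame) {
    guard let estimate = frame.lightEstimate else { return }
    // Roughly 1000 corresponds to well-lit indoor conditions; values far
    // below that are worth a dedicated dim-light test pass.
    print("Ambient intensity: \(estimate.ambientIntensity)")
    print("Color temperature: \(estimate.ambientColorTemperature) K")
}
```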
5. Gesture testing
Gestures can be tricky for AR objects. There are a limited number of gestures available and a ton of actions to be performed. Typically, apps use tap for object placement, swipe to relocate or drag, pinch to zoom, and two-finger rotation to rotate. Test these gestures on an ARKit app to make sure that your user experience is not compromised.
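The tap-to-place pattern above can be sketched as follows: hit-test the tap location against detected planes so the object lands on a real surface (the view controller name and box geometry are illustrative):

```swift
import ARKit

class GestureHandlingViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: sceneView)
        // Hit-test against detected planes so the object lands on a surface.
        guard let result = sceneView.hitTest(location, types: .existingPlaneUsingExtent).first
        else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.simdTransform = result.worldTransform
        sceneView.scene.rootNode.addChildNode(box)
    }
}
```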
6. Animation and overlay smoothness
iPhone X gave us a new word on Tuesday: “Animoji.” Animoji are animated versions of the classic emoji that can track a person’s face and mirror whatever it is doing. As in the iPhone X demo, object overlays and transitions should be super smooth, without jitter.
7. Device Performance
ARKit apps can wreak havoc on your users’ device performance. Be sure to track:
- Battery drain: Does the ARKit app kill the battery?
- Network: Make sure that the network calls are tailored for the app.
- CPU: Make sure that the CPU is not pushed too hard.
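XCTest's performance metrics (available from Xcode 11 / iOS 13) can put numbers on the CPU and memory cost of an AR session. A sketch; the test class and the idea of simply letting the AR screen render for a few seconds are illustrative assumptions:

```swift
import XCTest

class ARPerformanceTests: XCTestCase {
    func testARSceneResourceUsage() {
        let app = XCUIApplication()
        // Collect CPU and memory metrics across repeated launches.
        measure(metrics: [XCTCPUMetric(), XCTMemoryMetric()]) {
            app.launch()
            // Navigate to the AR screen here, then let it render a while.
            sleep(5)
            app.terminate()
        }
    }
}
```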
8. Device Compatibility Testing
iPhone X and iPhone 8 use the A11 Bionic chip and Apple’s latest GPU, designed in-house. The processors and cameras on these new devices are made for AR perfection. As testers, we need to make sure we test on all the devices that support AR apps, including the iPhone 7 and iPhone 6S.
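Rather than hard-coding a device list, the app itself can gate AR features on ARKit's own capability check, which returns false on hardware older than the A9 chip. A minimal sketch (the function name is illustrative):

```swift
import ARKit

// World tracking requires an A9 processor or newer; this check covers
// every current and future device in one place.
func canRunARExperience() -> Bool {
    return ARWorldTrackingConfiguration.isSupported
}
```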
9. iOS Compatibility Testing
ARKit apps are available for iOS 11 and above only. If a user downloads your app and their OS version is less than iOS 11, there needs to be a workaround.
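That workaround usually takes the form of availability checks that route pre-iOS 11 users to a non-AR fallback. A sketch under that assumption (the function name and fallback flow are illustrative):

```swift
import ARKit
import UIKit

func presentContent(from viewController: UIViewController) {
    if #available(iOS 11.0, *), ARWorldTrackingConfiguration.isSupported {
        // Safe to start the AR experience.
    } else {
        // Fall back to a non-AR flow, e.g. a static product gallery,
        // so older-OS users still get a working app.
    }
}
```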
Automation Testing an ARKit Application
AR apps are tricky to automate: they might use the front or rear camera of the phone. With the devices sitting in a grid, it would be hard to access the camera and build meaningful automated tests.
For automated testing of an ARKit application, test scenarios can be added against a basic UIView instead of the Scene view or Sprite view rendered on top of the camera feed. TestGrid.io’s plug-and-play automation is an easy solution for these new situations. Feel free to share your thoughts in the comments section below.
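One way this can work in practice: mirror AR state into plain UIKit elements with accessibility identifiers, so a standard XCUITest can assert on app behavior without needing the camera feed. A sketch; the identifiers, label text, and test class here are all hypothetical:

```swift
import XCTest

class ARAutomationTests: XCTestCase {
    func testObjectPlacementUpdatesStatusLabel() {
        let app = XCUIApplication()
        app.launch()
        // Tap where the AR scene container sits in the view hierarchy.
        app.otherElements["ar_scene_container"].tap()
        // Assert against a plain UILabel that mirrors the AR state.
        let status = app.staticTexts["placement_status_label"]
        XCTAssertTrue(status.waitForExistence(timeout: 5))
    }
}
```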
CTO at TestGrid