One of the key challenges in achieving a scalable, robust AR solution is delivering an accurate and reliable visual experience, which depends on precise positioning/tracking and visual processing. Tracking is how the augmented reality algorithm recognizes real-world objects in the camera feed so that it can accurately overlay virtual objects onto them. Most common commercial solutions that perform tracking today, such as ARCore by Google and ARKit by Apple, use vision algorithms that inspect visual features in the camera feed to determine the exact position of the camera in space, enabling the algorithm to place the corresponding 3D model. These solutions offer flexibility and simplicity but are sensitive to environmental factors, such as lighting and line of sight, and require sufficiently distinct visual features to achieve accuracy.
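To make the mechanism concrete, the sketch below shows the final step of feature-based tracking in simplified form: once the tracker has estimated the camera pose, overlaying a virtual object reduces to projecting its 3D anchor point into the camera image. This is a minimal illustration using a pinhole camera model with a single yaw rotation, not the actual ARCore or ARKit API; the function name and parameters are hypothetical.

```python
import math

def project_point(p_world, yaw, cam_center, f, c):
    """Project a 3D world point into the image of a camera rotated by
    `yaw` radians about the vertical (y) axis and located at `cam_center`,
    using a pinhole model with focal length `f` (pixels) and principal
    point `c` = (cx, cy). Hypothetical helper for illustration only."""
    # Translate the point into the camera frame, then undo the camera's yaw.
    x, y, z = (p_world[i] - cam_center[i] for i in range(3))
    cos_y, sin_y = math.cos(-yaw), math.sin(-yaw)
    xc = cos_y * x + sin_y * z
    yc = y
    zc = -sin_y * x + cos_y * z
    # Pinhole projection: pixel = f * (camera coords / depth) + principal point.
    u = f * xc / zc + c[0]
    v = f * yc / zc + c[1]
    return u, v

# A point two meters straight ahead of an unrotated camera at the origin
# lands on the principal point of the image.
pixel = project_point((0.0, 0.0, 2.0), 0.0, (0.0, 0.0, 0.0), 500.0, (320.0, 240.0))
```

In a real system the pose (here reduced to `yaw` and `cam_center`) comes from the vision algorithm matching visual features across frames, which is exactly the step that degrades under poor lighting or occluded line of sight.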
This white paper describes the design and implementation of a robust industrial augmented reality assembly instruction and validation solution. It reviews various tracking methodologies that address environmental challenges such as lighting, as well as the prerequisite of having fixed visual features in the environment. The resulting solution is being tested on a gas-insulated switchgear manufacturing line at the Siemens Gas and Power factory in Berlin, Germany.