There are many articles out there about applications of Extended Reality (XR) in all walks of life, from FMCG experiences to mining practices and from furniture visualizations to aircraft simulations. A simple Google search shows a plethora of augmented reality applications in the market today.
There’s no doubt XR will be ubiquitous in the coming years, but the question is: how can individuals, startups, businesses and enterprises leverage these capabilities to give their customers a superior product and brand experience over a sustained period of time? Most people’s exposure to augmented reality (AR) is currently limited to Pokémon Go, where you view Pokémon in 3D space and take photos with them. This colours users’ perception of AR’s capabilities (and, wrongly, its limits): a way to place life-size 3D replicas of real and virtual characters and objects, get a sense of scale, and boost social media likes with cool photos. While that’s an excellent start, it gives a limited view, since AR has evolved to do much more across a wide variety of application areas. Here are some examples:
- Healthcare — from practice to diagnostics and surgery
- Product experiences (both visual and functional) — apparel, FMCG, lifestyle products, etc.
- Equipment lifecycle management — new equipment installation and maintenance of existing machines
- Process training/simulation — step-by-step repair guidance and verification with collaborative assistance
- Real-world navigation
- Terrain mapping — structural view of real-world terrains
- And more — education, tourism, furniture, among others
Some examples in more detail:
Experience product capabilities

Let’s say you want to purchase a smart home product like Google Home or Amazon’s Alexa, but you have no clue how it works. You can experience these and other smart home products before purchasing them: place them in your house, ask them for funny jokes, place grocery orders, play YouTube videos, control ambient lighting, and see how they behave on your smartphone, tablet or head-mounted augmented reality glasses.

Troubleshoot at multiple levels

Let’s say you are driving on the highway and your car breaks down. Before you call the towing agency, you use Fabrik to connect with your car, understand what’s wrong and resolve the issue yourself with AR walkthroughs. If you are unable to resolve it yourself, you can place a call with an expert in the same AR experience, and the remote genius will see what you are seeing and walk you through diagnosing the problem. The goal is to get you out of the mess as soon as possible. If all else fails, the tow truck it is :( but the likelihood of that is very low.
Resolve issues and verify in real-time

Think of areas like aviation or power distribution, where a single part can bring down an entire network, with significant impact on end users: thousands of travelers, or millions of households consuming electricity. Every minute of downtime is painful for all stakeholders, and resolution is the highest priority. But these parts are complex, expensive and have lifespans of decades, so any misstep may lead to further delays and escalating costs. Consequently, every step needs clear directions and verification mechanisms. AR, integrated with the equipment’s technology backbone, can verify in real time that the process followed by the user is accurate and guide the user through subsequent steps. This is even more critical when multiple technicians are working together in different locations on the same site.

On a side note — who creates the walkthroughs?
All the above examples sound great, but somebody has to create these walkthroughs. There are multiple online and offline tools available to create quick AR/VR walkthroughs (see Spark AR or Reality Composer). These tools are great for quickly creating XR experiences. We at Fabrik have created an editor to quickly build AR walkthroughs, much like you would build slides in PowerPoint. You no longer need SOPs (Standard Operating Procedures) like user manuals, videos or articles to explain all the product features that startups take so much pain to build. One can compose 3D walkthroughs and let customers discover a product through interactive AR experiences.
So AR is no longer a 3D nice-to-have but a critical 3D digital twin that saves time, money and effort for all stakeholders in the value chain. It becomes the core of any product experience, any design activity, and all collaborative maintenance and equipment lifecycle management. To make things easier, extended reality companies are going the extra mile of integrating with existing technology providers to introduce solutions (* NOT capabilities) that let customers quickly reap the benefits of emerging trends. Integration with the existing technology backbone means working with customers’ enterprise systems (SAP, Oracle, Microsoft Azure, etc.), job management, procurement, supply chain, and so on. This makes the solution a one-stop shop with simple plug-and-play.

Deeper integrations with complementary technologies

In addition, a slew of complementary emerging technologies will seamlessly merge with XR over the next few years to provide an unprecedented level of capability for end users in a multitude of sectors. This lets customers pick a solution that is far more comprehensive than purchasing smaller pieces of the puzzle from various vendors and trying to manage them. Here are a couple of examples to illustrate in more detail:
Internet of Things — IoT hardware companies have sensors for everything; they utilise the same backend infrastructure as XR and produce significant real-time data for multiple activities and use cases. However, this data is rarely visualised efficiently, and as a consequence IoT is limited as an end-to-end solution. XR comes in handy for visualising inferences from IoT devices in real time, overlaid on the machines themselves.
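To make the IoT-plus-XR idea concrete, here is a minimal sketch of how a raw sensor reading might be turned into an AR overlay annotation. All names, thresholds and the payload format are hypothetical, invented for illustration; a real integration would depend on the specific IoT backend and AR framework in use.

```python
# Hypothetical sketch: converting an IoT temperature reading into an
# AR overlay annotation. Thresholds and field names are illustrative only.

def overlay_label(sensor_id: str, temperature_c: float, limit_c: float = 80.0) -> dict:
    """Turn one temperature reading into an annotation payload that an
    AR client could attach to the 3D anchor representing the sensor."""
    status = "ALERT" if temperature_c > limit_c else "OK"
    return {
        "anchor": sensor_id,  # the 3D anchor the label attaches to
        "text": f"{temperature_c:.1f} C ({status})",
        "color": "red" if status == "ALERT" else "green",
    }

# Example: a pump bearing running hot gets a red alert label,
# while a sensor within limits gets a green one.
hot = overlay_label("pump-7/bearing", 92.4)
ok = overlay_label("pump-7/inlet", 25.0)
```

The point of the sketch is the shape of the pipeline, not the specifics: sensor data is mapped into something spatially anchored and glanceable, which is exactly the step plain dashboards leave out.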
Computer Vision — SLAM and the recognition of objects and environments around users help overlay relevant information for the user, providing actionable intelligence for a multitude of B2B, B2B2B and B2C use cases.
Edge Computing — A large amount of computing currently happens in the cloud, adding unnecessary latency to an XR experience. Much of this computing can be offloaded from client devices (smartphones, tablets or HMDs) to multiple compute layers, starting with the edge.
5G — As broadband infrastructure improves dramatically, real-time GPU offloading becomes a reality, and devices in remote areas gain connectivity good enough for complex XR visualizations.

Conclusion

Now think of the sectors that will undergo rapid transformation over the next 3–5 years as emerging technologies make inroads. Aviation, automotive, engineering, oil & gas, energy, shipping, retail, lifestyle products, real estate, mining, healthcare, insurance, humanities, education, sports, etc. are well on their way to harnessing these capabilities. And there are many more application areas where extended reality will make a big splash and reshape the way we visualize our day-to-day activities.
Stay tuned for the next article: Virtual Reality and its applications.
* Capabilities are novel pieces of technology that do not by themselves solve a problem for customers. Solutions are built end-to-end to solve a specific customer problem.