NexTech AR Solutions upgrades UX for web-enabled AR platform
With the newly improved UX, users position a virtual 3D AR object in a location and then “place” it. As soon as the object is “placed”, users can move their camera left and right while the object remains stable, as if it were physically in the room. This gives consumers a realistic experience, enabled by a technique known as simultaneous localisation and mapping (SLAM).
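The “place and stay” behaviour described above comes down to anchoring the object at fixed world coordinates and re-projecting it into the camera's frame as the camera moves. The sketch below is a deliberately simplified, hypothetical illustration (2D, translation plus yaw only); a real SLAM system estimates the camera pose continuously from camera and sensor data.

```python
# Minimal sketch of the "place and stay" idea behind SLAM-style AR.
# Hypothetical and simplified to 2D; real SLAM estimates cam_pos/cam_yaw
# from sensor data rather than receiving them as arguments.
import math

def world_to_camera(point_world, cam_pos, cam_yaw):
    """Transform a fixed world-space point into the camera's frame.

    As the camera moves (cam_pos / cam_yaw change), the object's
    camera-space coordinates change, but its world coordinates do not --
    which is why a "placed" AR object appears anchored in the room.
    """
    dx = point_world[0] - cam_pos[0]
    dy = point_world[1] - cam_pos[1]
    # Rotate by the inverse of the camera's yaw to enter its frame.
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

# A virtual object "placed" 2 metres in front of the starting camera.
obj = (0.0, 2.0)

# Camera at the origin: the object sits straight ahead.
print(world_to_camera(obj, (0.0, 0.0), 0.0))   # (0.0, 2.0)

# Camera steps 1 m to the right: the object now appears 1 m to the left,
# i.e. it stayed put in the world while the camera moved.
print(world_to_camera(obj, (1.0, 0.0), 0.0))   # (-1.0, 2.0)
```

The key design point is that the object is stored once in world space; only the camera pose is updated per frame.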
Upgrading the UX is NexTech’s latest initiative to further strengthen its offerings as a comprehensive end-to-end solution for the thriving e-commerce sector.
Last month, the company launched its ‘Try-It-On’ AR experience for online retail. According to the company release, this AR experience is the first of several sentiment-based technology solutions being developed for its patent-pending web-enabled AR eCommerce platform and site optimisation. The new solution uses facial tracking and AI-assisted computer vision to deliver a realistic experience, allowing consumers to see how they look wearing apparel items such as jewellery, glasses, and headwear.
The current iteration, which tracks eyes, enables an AR preview for items like glasses, goggles, and other eyewear. Support for lips, nose, ears, mouth, and other zones above the shoulders will follow shortly.
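To give a sense of how tracked eyes can drive an eyewear preview, here is a hypothetical sketch: given the two eye positions in image pixels, it derives a placement (centre, rotation, scale) for a glasses overlay. The function name and the `frame_width_factor` ratio are assumptions for illustration, not NexTech's actual implementation.

```python
# Hypothetical sketch: anchor a glasses overlay to two tracked eye
# positions (in image pixels). Not NexTech's actual pipeline.
import math

def glasses_placement(left_eye, right_eye, frame_width_factor=2.0):
    """Return (centre, angle_degrees, width_px) for an eyewear overlay.

    frame_width_factor is an assumed ratio of glasses width to the
    inter-pupillary distance; a real system would tune it per product.
    """
    cx = (left_eye[0] + right_eye[0]) / 2
    cy = (left_eye[1] + right_eye[1]) / 2
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = math.degrees(math.atan2(dy, dx))  # head tilt in the image
    ipd = math.hypot(dx, dy)                  # inter-pupillary distance
    return (cx, cy), angle, ipd * frame_width_factor

# Eyes level and 60 px apart: overlay centred between them, no tilt.
centre, angle, width = glasses_placement((100, 120), (160, 120))
print(centre, angle, width)   # (130.0, 120.0) 0.0 120.0
```

Because the placement is recomputed from the landmarks every frame, the overlay follows the face as the user moves or tilts their head.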
Yet the company’s augmented reality experience has not exactly drawn wall-to-wall positive reviews. Writing for the KnowTechie publication last month, in a piece that described NexTech as “the Microsoft Paint of AR apps”, Joe Rice-Jones decried the ‘Try-It-On’ experience. “Think of it as if Snapchat and Shopify had a baby,” he wrote. “The feature needs a bit of finessing. Right now, items appear like stickers over the world and don’t really line up to any of your furniture, walls or floors.
“While you can rotate or resize them, it would be nice if the AR system could figure out where the lines of your home are and adjust the size and placement accordingly. That’s likely needed for AR shopping to take off.”