Premier #AR object tracking meets premier real-time #high-quality graphics
Right in time for AWE, one of the largest and most important get-togethers for Augmented and Mixed Reality, we are delighted to announce that VisionLib is now also available for Epic's Unreal Engine. VisionLib development is now available on both industry-leading real-time 3D platforms, Unity and Unreal.
We are excited to be the first to offer model tracking for Unreal. Model Tracking is considered the de-facto standard for detecting and tracking objects in AR: it uses 3D models to localize objects in the video stream.
It has become the first choice for businesses that want to augment valuable information directly onto entire products, assemblies, or other parts. For good reason: it is the only technique for enterprise-grade marker-less object tracking. It is reliably precise, can handle object movement, and keeps working under changing or even adverse lighting conditions.
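To illustrate the core idea behind model tracking, here is a toy sketch (this is not VisionLib's actual API, just a generic illustration of the underlying principle): given a 3D model and a candidate camera pose, the tracker projects model points into the image and measures the reprojection error against features detected in the video frame, then iteratively adjusts the pose to minimize that error.

```python
# Toy sketch of the principle behind model tracking (NOT VisionLib's API).
# A candidate pose is scored by projecting the 3D model into the image and
# measuring pixel distance to the observed feature locations.
import math

def project(point, pose_t, focal=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point after translating it by pose_t."""
    x, y, z = (p + t for p, t in zip(point, pose_t))
    return (focal * x / z + cx, focal * y / z + cy)

def reprojection_error(model_pts, observed_px, pose_t):
    """Mean pixel distance between projected model points and observations."""
    errs = []
    for p3d, p2d in zip(model_pts, observed_px):
        u, v = project(p3d, pose_t)
        errs.append(math.hypot(u - p2d[0], v - p2d[1]))
    return sum(errs) / len(errs)

# Toy example: four model points, "observed" at the true pose (0, 0, 5).
model = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
true_pose = (0.0, 0.0, 5.0)
observed = [project(p, true_pose) for p in model]

# The correct pose yields (near-)zero error; a wrong guess does not.
print(reprojection_error(model, observed, true_pose))    # ~0.0
print(reprojection_error(model, observed, (0.5, 0, 5)))  # clearly > 0
```

A real tracker replaces the toy translation-only pose with a full 6-DoF pose, matches model edges or contours against the camera image, and runs this minimization every frame.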
With this step, we also increase the range of real-time 3D development tools that can be used with VisionLib.
“That’s good news for XR developers, as this brings more flexibility and enables them to choose the best platform for their AR applications”, says Michael Schmitt, CTO at Visometry. Previously, developers who wanted to create an AR experience had only Unity or native development as options. Bringing VisionLib to Unreal increases the freedom of choice, because the question of which platform to use for AR no longer depends solely on the specifications of the tracking SDK.
Supporting More Platforms. Simplifying Workflows.
This is an important point for us at Visometry because, depending on the use case, businesses have diverse and not necessarily congruent requirements for rendering and 3D engines. Photorealism, for instance, is secondary when presenting digital twins in production. In marketing, however, visual quality is key. That’s why we aim to support more platform options.
Unreal Engine is a state-of-the-art engine and editor with photorealistic rendering and lifelike animation, all in real time. These are just some of the reasons why leading communication and marketing agencies like to use Unreal to showcase their clients’ products and services.
“For many, not only from the creative sector, bringing VisionLib to Unreal simplifies the workflow of complex projects and productions. It eliminates the need to switch assets and production pipelines to present (existing) 3D or VR projects in XR”, concludes Schmitt. A point that is often overlooked or underestimated.
German mixed reality studio NSYNK has been using virtual production technologies to stage products such as Porsche’s Taycan for some time now. Virtual production is the process of shooting a film in front of a large LED screen. Compared to green- and bluescreen shoots, things become more tangible: all participants on location already have a better understanding of the scene, and the main actors are illuminated in real time by the LEDs. The camera movement is synchronized with the background in such a way that a parallax shift suggests a real depth effect; in the camera shots, foreground and background merge naturally.
To do so, NSYNK started working with Unity but eventually switched to Unreal, a step that was especially important for virtual productions and broadcast-like projects. In early 2021, the agency launched the Porsche Taycan AR Event App, a broadcast-quality AR app for journalists, using VisionLib for tracking. It brought the visual quality and precision otherwise only known from broadcast solutions to a mobile phone. When looking for tracking options, NSYNK had soon chosen VisionLib and started building the AR app, which, at that time, had to be done in Unity.
No big deal in the beginning. But once the studio stepped into Unreal development, working on a project would have required shifting assets across different development platforms, especially when deployment is planned across multiple channels, from virtual productions to XR, from desktop to mobile. So there is a natural desire to stay on the platform where a project started and avoid duplicating projects and assets across several development tools.
Optimising assets for different devices, say from desktop to mobile, can already be a time-consuming process. Moving from one development platform to another is an even more difficult story. Animations, app logic, and other elements sometimes can’t just be moved; at worst, they have to be re-created from scratch.
“We’re excited that VisionLib now supports Unity and Unreal, so there is no need to switch platforms just because we switch the medium for different communication purposes”, says NSYNK’s CEO, Eno Henze.
Join us at AWE – In-person & Virtually
Want to learn more? At AWE we’ll show the first public preview of VisionLib for Unreal.
Save the date for our joint talk with Epic Games and NSYNK, where we present insights into high-quality product experiences in XR (including an exclusive look at the creation of the Taycan AR event app and its production workflow). The talk is on Nov. 9 at 4:05pm (PST).
And don’t forget to swing by our booth #719 for a hands-on demo. Get all the details, plus discounts on tickets, on our AWE event page. AWE 2021 is a hybrid event: if you can’t make it in person, look up virtual participation options at awe.live.
Shoutout to Epic & NSYNK
We’d like to thank NSYNK for trusting and using VisionLib. While the SDK is used for a variety of purposes, seeing such creative people using it is truly inspiring.
We are also happy to have received one of Epic Games’ coveted MegaGrants for bringing VisionLib to Unreal. We benefited not only from the financial support, but also from the profound exchange on a personal and technical level. We’re honoured to work with such a dedicated and amazing team at Epic.
About Unreal Engine
About NSYNK and its virtual production system Hyperbowl
About VisionLib’s Model Tracking
The post VisionLib for Unreal – First Public Preview Available appeared first on Visometry.