Bridge Engine makes it simple to author and deploy mind-bending mixed reality experiences to the mobile device you already own. Coupled with Structure Sensor's ability to capture dense 3D meshes of scenes, you can create magical experiences where it's impossible to tell the virtual from the real. Available now for iOS.
With a sensing range of over 3.5 meters and a depth map that updates at 30fps, the Structure Sensor attached to every Bridge enables robust 6-DoF inside-out positional tracking for room-scale VR experiences. There are no external cameras or fiducial markers required, and near-instant relocalization prevents extended tracking loss in challenging circumstances.
Bridge spans the spectrum from mixed to virtual reality, and you can cross back and forth between the two. Step from your living room into a spaceship — or open a window into another real (scanned) location! Any combination of VR and MR worlds is possible with Bridge Engine portal support.
Using the depth from the Structure Sensor, content is smoothly composited onto your environment. Virtual objects can interact with and respond to the real world with object occlusion, physics, and lighting/shadows.
// Let a virtual cube collide with both the scanned room mesh and other virtual objects.
SCNPhysicsBody *cubeBody = [SCNPhysicsBody dynamicBody];
cubeBody.collisionBitMask = BECollisionCategoryRealWorld | BECollisionCategoryVirtualObjects;
Bridge Engine runs at 60 Hz and uses advanced tracking techniques along with predictive rendering to keep digital objects locked to the real world.
While making Bridge we began to learn what it means to have virtual characters that live in the world with us. BRIDGET is what came out of our experiments. BRIDGET's a curious little robot whose core functionality is open source. You can pull features from BRIDGET into your own projects, or contribute and expand her capabilities.
Integrate our pathfinding methods with your character’s behavior for increased believability. Your game characters will successfully navigate your real 3D scanned environment to destination points or to other NPCs.
// Find the 3D point on the scanned mesh under a screen tap, then send BRIDGET there.
SCNVector3 tapNormal;
SCNVector3 tapLocation = [_mixedReality mesh3DFrom2DPoint:tapPointInView outputNormal:&tapNormal];
BOOL couldFindPath = [_bridget moveTo:tapLocation]; // will try to path find around obstacles
By capturing a dense map of the scene, Bridge lets real world objects and features subtly present themselves to users as they navigate through virtual worlds. That adds an extra measure of safety, and also gives users awareness of where they are in the real world while still immersed in a virtual one.
[_mixedReality setRenderStyle:BERenderStyleSceneKitAndCollisionAvoidance withDuration:1.0];
Adding room scale inside-out positional tracking to your Unity mobile VR app is now as simple as replacing the standard Unity camera with Bridge Engine’s camera.
Bridge uses several techniques to make mobile mixed reality immersive. Along with a wide-angle lens that doubles the camera's normal field of view, Bridge Engine uses view-dependent rendering to create a unique image for each eye. This enables stereo vision that extends into the near periphery.
[_mixedReality setRenderStyle:BERenderStyleSceneKitAndColorCamera withDuration:1.0];