Mapbox brings vision and augmented reality to location-based apps


Mapbox has announced new tools for developers looking to take their location-based apps to the next level. The company unveiled native AR capabilities and a Vision SDK at its Locate conference in San Francisco this week.

The new native AR capabilities come in the form of two SDKs: an Apple SceneKit SDK and a React Native AR SDK. The React Native AR SDK lets developers build location-based AR into their cross-platform applications, while the SceneKit SDK enables iOS developers to build AR experiences leveraging Apple’s native toolkit.

The new Vision SDK is designed to give developers the ability to build richer driving experiences. The SDK works with the company’s live traffic and navigation solutions to bring heads-up displays into apps.

“Equipped with better navigation, paired with augmented reality, and powered by high-performance computer vision, the SDK turns the mobile camera into a powerful sensor — developers have the key to the car,” Eric Gundersen, CEO of Mapbox, wrote in a post.

The Vision SDK features neural networks for real-time segmentation of the environment, the ability to provide live context on the edge, and the ability to detect things like other vehicles, pedestrians, speed limits, construction signs, crosswalks, and vegetation, the company explained.

Mapbox also unveiled a new partnership with Microsoft Azure. As part of the partnership, the Vision SDK will integrate with Microsoft Azure IoT Hub. “The intelligent cloud and intelligent edge bring a wide range of possibilities for the future of smart cities, transportation, public safety and more. By integrating Mapbox’s Vision SDK with Azure IoT Hub, developers will have the power of Microsoft’s global-scale cloud platform and advanced AI services to ingest data in real-time,” Tara Prakriya, group product manager for Microsoft Azure, said in an announcement.

The company is also partnering with Arm to deliver the Vision SDK on Arm hardware such as CPUs, GPUs, and machine-learning and object-detection processors. “The Vision SDK puts developers in control of the driving experience. Our partnership with Arm will extend the Vision SDK’s reach to their hundreds of millions of devices. As new data is detected, the SDK classifies road boundaries, lane markings, curbs, crosswalks, traffic signs, and more. This data is then all used to update the map live to ensure the most up-to-date information,” Gundersen wrote.

Other announcements from the conference included a new technology partnership with Sumo Logic to provide new location-based visualization capabilities.

“Improving the customer experience is at the core of all business and technology decisions today, and has become a major competitive advantage,” said Michael Marfise, senior director of product management at Sumo Logic. “With the ability to integrate Mapbox technology into the Sumo Logic platform, our users can easily visualize all of their data on interactive maps to identify anomalous behavior, solve problems faster and improve their overall business operations.”

The post Mapbox brings vision and augmented reality to location-based apps appeared first on SD Times.
