Snap is not just the company behind the popular social app Snapchat. It has also built a powerful platform for augmented reality developers called Snap AR, which extends beyond Snapchat thanks to Camera Kit, an SDK that lets developers integrate Snap's camera capabilities into other apps.
In Snap jargon, Lenses are essentially augmented reality apps that you can access in Snapchat. Over the past few years, the company has built a huge "app store" of Lenses. Some are funny, some are useful, and some help you communicate in a whole new way. It has become a rich augmented reality ecosystem.
"Already over 250,000 creators have created over 2.5 million Lenses that have been viewed over 5 trillion times," Snap CEO Evan Spiegel said during his Snap Partner Summit keynote.
Today, the company is launching another component of its developer platform: Lens Cloud. As the name suggests, this is a server-side component that will help developers create experiences such as fast-paced multiplayer games.
Lens Cloud has three components. First, developers can take advantage of multi-user services. These let a group of friends join the same instance of a Lens so they can interact together, at the same time, in the same Lens.
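To make the shared-instance idea concrete, here is a toy sketch of the concept. This is not Snap's actual multi-user API; the class and method names are invented for illustration only.

```python
# Toy model of a shared Lens session: a group of friends joins one instance,
# and a state change made by one participant becomes visible to everyone.
# This is an illustrative sketch, NOT Snap's real multi-user service.

class LensSession:
    """A shared instance of a Lens that a group of friends joins."""

    def __init__(self, lens_id):
        self.lens_id = lens_id
        self.participants = {}  # user name -> that user's view of shared state

    def join(self, user):
        self.participants[user] = {}

    def broadcast(self, sender, key, value):
        # A change from one participant is replicated to all participants.
        for user in self.participants:
            self.participants[user][key] = value

session = LensSession("paint-together")
session.join("alice")
session.join("bob")
session.broadcast("alice", "brush_color", "red")
print(session.participants["bob"]["brush_color"])  # red
```

The point of the sketch is simply that synchronization happens server-side: each participant sees the same shared state without the Lens itself managing peer-to-peer connections.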
Second, location-based services allow developers to anchor Lenses to physical locations, starting with central London. For example, a museum could use this to activate a specific Lens when you point the camera at a particular landmark.
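The landmark-triggered behavior can be pictured as a simple geofence lookup. The coordinates, Lens names, and radius below are illustrative assumptions, not part of Snap's actual location services.

```python
import math

# Hypothetical sketch of location-triggered Lenses: a Lens activates when the
# camera is within a small radius of a registered landmark. Landmark data and
# the 150 m threshold are made up for this example.

LANDMARK_LENSES = {
    # (lat, lon) -> Lens name; central London examples
    (51.5007, -0.1246): "big-ben-lens",
    (51.5081, -0.0759): "tower-of-london-lens",
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_lenses(lat, lon, radius_m=150):
    """Return the Lenses whose landmark is within radius_m of the camera."""
    return [name for (llat, llon), name in LANDMARK_LENSES.items()
            if haversine_m(lat, lon, llat, llon) <= radius_m]

print(active_lenses(51.5005, -0.1248))  # ['big-ben-lens']
```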
And finally, there are storage services. Developers can store assets on Snap's servers and download them on demand. The service can also act as a kind of memory card: users can exit a Lens and pick up where they left off later.
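The "memory card" behavior amounts to saving per-user Lens state server-side and restoring it on the next launch. The sketch below models that with a plain dictionary; the function names are invented and do not correspond to a real Snap API.

```python
# Illustrative sketch of persisted Lens state: save a user's progress to
# per-user cloud storage and restore it later. The dict stands in for Snap's
# storage service; this is not an actual Snap API.

USER_STORE = {}

def save_state(user, lens, state):
    USER_STORE[(user, lens)] = dict(state)

def resume_state(user, lens):
    # Returns an empty state the first time a user opens the Lens.
    return USER_STORE.get((user, lens), {})

save_state("alice", "garden-lens", {"level": 3, "flowers_planted": 12})
print(resume_state("alice", "garden-lens")["level"])  # 3
```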
"Storage services allow developers to expand beyond 8 megabytes. They do this by storing the assets they're about to bring into the Lens in real time in our cloud," Snap's Head of AR Platform Partnerships Sophia Dominguez told TechCrunch's Sarah Perez.
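The idea behind that 8-megabyte figure can be sketched as a small on-demand loader: the bundled Lens package stays under the cap, while heavyweight assets live in cloud storage and are fetched and cached only when needed. The "cloud" below is just a dictionary; nothing here reflects Snap's real implementation.

```python
# Toy model of on-demand asset storage: keep the Lens package under the 8 MB
# cap and stream larger assets from the cloud, caching them after first use.
# The CLOUD_STORE dict is a stand-in for Snap's storage service.

CLOUD_STORE = {
    "hires_model.glb": b"\x00" * 20_000_000,  # a 20 MB asset, too big to bundle
}

class AssetLoader:
    LENS_PACKAGE_LIMIT = 8 * 1024 * 1024  # 8 MB cap on the bundled Lens

    def __init__(self):
        self.cache = {}

    def fetch(self, name):
        if name not in self.cache:      # download once, reuse on later frames
            self.cache[name] = CLOUD_STORE[name]
        return self.cache[name]

loader = AssetLoader()
asset = loader.fetch("hires_model.glb")
print(len(asset) > AssetLoader.LENS_PACKAGE_LIMIT)  # True: exceeds the 8 MB cap
```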
Storage services are not yet available, but the company plans to launch them in the coming months. This collection of back-end services will be available free of charge to Snap AR developers.
Creators who want to build a Snapchat Lens start by downloading Lens Studio. They can import 2D and 3D assets, use the 3D Face Mesh, add their own shaders, write scripts, and take advantage of Snap's machine learning models with SnapML. Snap is releasing a new version of Lens Studio today with some new features.
Lens Studio already allows you to adjust Lenses dynamically using an API. For example, you can change the look of a Lens when it's raining. The company is adding new API partners to open up more possibilities: thanks to AstrologyAPI and Sportradar, you can create content that is customized with astrological or sports data.
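The rain example above boils down to selecting a Lens look from an external data source. The sketch below stubs out the weather call; the function names are hypothetical and real integrations go through Lens Studio's API partners rather than code like this.

```python
# Hypothetical illustration of API-driven Lens content: choose a Lens look
# based on external data. fetch_weather is a stub standing in for a real
# weather API; the look names are invented for this sketch.

def fetch_weather(city):
    # Canned response in place of a real API call.
    return {"london": {"condition": "rain"}}.get(city.lower(), {"condition": "clear"})

def choose_lens_look(city):
    condition = fetch_weather(city)["condition"]
    return {"rain": "umbrella_overlay", "clear": "sunny_overlay"}.get(condition, "default")

print(choose_lens_look("London"))  # umbrella_overlay
```

The same pattern applies to the new partners: swap the weather stub for astrological or sports data and key the Lens's appearance off that response instead.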
The company is also working on support for ray tracing, which should greatly improve reflections and surface rendering in general. Analytics have also been improved with event tracking, which should help with debugging in particular.
Credit: techcrunch.com