
On Tuesday, Snap took the stage at the Augmented World Expo to show an early version of its real-time, on-device image diffusion model, a technology that could make augmented reality (AR) experiences far more vivid and immersive by generating visuals on the fly.

A Small but Mighty Model

Snap’s co-founder and CTO, Bobby Murphy, said the company has been working to accelerate machine learning models for AR applications. According to Murphy, the model is small enough to run on a smartphone and fast enough to re-render frames in real time, guided by a text prompt.
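
Snap has not published the technical details of its on-device model, but the general pattern Murphy describes, re-rendering a camera frame under the guidance of a text prompt, can be sketched with an open-source image-to-image diffusion pipeline. The example below uses the Hugging Face diffusers library and a public Stable Diffusion checkpoint purely as stand-ins; the checkpoint name, file paths, and parameter values are illustrative assumptions, not Snap's implementation.

```python
# A minimal sketch of text-guided frame re-rendering with an off-the-shelf
# image-to-image diffusion pipeline (Hugging Face diffusers). This is NOT
# Snap's proprietary on-device model; checkpoint name, file paths, and
# parameter values are illustrative assumptions.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load a public checkpoint (assumed available locally or via the Hub).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A single camera frame stands in for the live AR feed (hypothetical path).
frame = Image.open("camera_frame.png").convert("RGB").resize((512, 512))

# Re-render the frame in the style described by the prompt. A low strength
# keeps the output close to the input frame, and fewer denoising steps
# reduce latency, which is the central constraint for real-time AR.
styled = pipe(
    prompt="the scene re-imagined as a watercolor painting",
    image=frame,
    strength=0.4,            # how strongly the frame is altered
    num_inference_steps=10,  # fewer steps: lower latency, lower fidelity
    guidance_scale=6.0,      # adherence to the text prompt
).images[0]

styled.save("styled_frame.png")
```

Snap's model presumably relies on a much smaller network and heavy optimization to hit smartphone frame rates, but the interface idea is the same: a camera frame plus a text prompt in, a restyled frame out.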

The Emergence of Generative AI Image Diffusion Models

Murphy acknowledged that the emergence of generative AI image diffusion models has been exciting, but emphasized that they need to become significantly faster to matter for AR: rendering at interactive frame rates leaves a budget of only tens of milliseconds per frame, while typical diffusion models take seconds to produce a single image. By building a model that runs in real time on the device itself, Snap is taking a step toward making generative AR experiences practical and widely accessible.

Snapchat Users to Enjoy Enhanced Lenses

In the coming months, Snapchat users will start to see Lenses powered by the new generative model, and the company plans to put the technology in creators' hands by the end of the year so they can build their own generative AR experiences.

Lens Studio 5.0: A Game-Changer for AR Creators

As part of its push to speed up AR development, Snap has also launched Lens Studio 5.0 for developers. The latest version includes new generative AI tools intended to help creators build AR effects far faster than is currently possible.

Generative AI Tools: Unlocking New Possibilities

With Lens Studio 5.0, AR creators can now generate highly realistic ML face effects, custom stylization effects, and even 3D assets in minutes. The tool also includes an AI assistant that can answer questions and provide guidance to creators.

Face Mesh Technology: A Powerful Tool for Creators

Snap’s Face Mesh technology lets creators generate characters such as aliens or wizards from a text or image prompt, giving them a faster route to more complex and interactive AR experiences.

The Future of Augmented Reality

Murphy framed the announcement as part of Snap’s broader push into generative AR. As the company continues to improve on-device performance and creator tooling, more ambitious AR applications and experiences are likely to follow.

Conclusion

Snap’s real-time, on-device image diffusion model could change how augmented reality is experienced, replacing pre-authored effects with visuals generated on the fly. If the company delivers the performance Murphy described on everyday smartphones, the technology should open new possibilities for both creators and users as it rolls out through Lenses and Lens Studio later this year.

About the Author

Aisha Malik is a consumer news reporter at TechCrunch. She has a strong background in technology and journalism, with an honors bachelor’s degree from the University of Toronto and a master’s degree in journalism from Western University.
