Qualcomm Technologies and Meta are working to optimize the execution of Meta's Llama 2 large language models directly on devices, without relying solely on cloud services.
The ability to run generative AI models such as Llama 2 on smartphones, PCs, VR/AR headsets, and vehicles lets developers save on cloud costs while offering users experiences that are private, more reliable, and personalized.
To that end, Qualcomm plans to make Llama 2-based AI implementations available on-device, enabling customers, partners, and developers to build use cases such as intelligent virtual assistants, productivity apps, content creation tools, entertainment, and more.
These new on-device AI experiences, powered by Snapdragon, can work in areas without connectivity or even in airplane mode.
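To give a rough sense of what fully local, offline inference looks like in practice, the sketch below loads a quantized Llama 2 model with the open-source llama-cpp-python bindings and generates text without any network access. This is an illustration only, not the Qualcomm or Meta on-device toolchain described in the announcement, and the model file path and generation parameters are placeholders.

```python
from llama_cpp import Llama

# Placeholder path to a locally stored, quantized Llama 2 model file (GGUF format).
# The model runs entirely on the device; no cloud service is contacted.
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Generate a short completion locally.
result = llm(
    "Q: What are some benefits of running AI models on-device? A:",
    max_tokens=64,
    stop=["Q:"],
)

print(result["choices"][0]["text"])
```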
“We applaud Meta’s approach to open and responsible AI and are committed to driving innovation and lowering barriers to entry for developers of any size by bringing generative AI on-device,” said Durga Malladi, senior vice president and general manager of technology, planning and edge solutions businesses, Qualcomm Technologies, Inc. “To effectively scale generative AI into the mainstream, AI will need to run in the cloud and on edge devices such as smartphones, laptops, vehicles and IoT devices.”
Meta and Qualcomm Technologies have a long history of working together to drive technological innovation and deliver the next generation of premium device experiences. The companies' current collaboration to support the Llama ecosystem encompasses research and product engineering efforts.