MediaTek and Meta Collaborate to Propel On-Device AI Evolution with Llama 2 Integration

Semiconductor leader MediaTek revealed a collaboration with Meta centered on Llama 2, Meta’s next-generation open-source Large Language Model (LLM), with the aim of bringing generative AI to edge devices.
This strategic alliance seeks to combine Meta’s cutting-edge LLM technology with MediaTek’s latest AI Processing Units (APUs) and NeuroPilot AI Platform, with the overarching goal of constructing a comprehensive edge computing ecosystem. This ecosystem is designed to accelerate the progress of AI application development across various sectors, including smartphones, IoT devices, vehicles, smart homes, and more.

Traditionally, the processing power behind Generative AI has resided largely in cloud computing infrastructure. MediaTek’s use of Llama 2 models, however, aims to enable generative AI applications to run directly on edge devices. This shift offers an array of benefits to developers and users alike, including seamless performance, heightened privacy, improved security and reliability, lower latency, operability in low-connectivity environments, and reduced operational costs.

To fully capitalize on the potential of on-device Generative AI technology, manufacturers of edge devices will need to adopt high-compute, low-power AI processors and establish faster, more reliable connectivity. MediaTek has already moved in this direction, as evidenced by the inclusion of APUs in each of its 5G smartphone systems-on-chip (SoCs), which support Generative AI features such as AI Noise Reduction, AI Super Resolution, and AI MEMC.

Adding to these advancements, MediaTek’s upcoming flagship chipset, slated for unveiling later this year, will incorporate a software stack optimized to run Llama 2. The chipset will also integrate an enhanced APU featuring Transformer backbone acceleration, designed to reduce memory footprint and DRAM bandwidth usage and thereby improve the performance of both large language models and generative AI computing. These developments are expected to hasten the creation of practical use cases for on-device Generative AI applications.
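
To put the memory and bandwidth concern in perspective, here is a rough back-of-the-envelope sketch of how much storage an LLM’s weights alone occupy at different precisions. The 7-billion-parameter figure matches the smallest public Llama 2 variant, and the bit widths are common quantization targets; neither is a MediaTek or Meta specification for the upcoming chipset.

# Illustrative weight-memory estimate for an on-device LLM (weights only,
# excluding the KV cache and activations). Parameter count and bit widths
# are assumptions for illustration, not MediaTek or Meta specifications.

def weight_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate storage for model weights, in gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

LLAMA2_7B_PARAMS = 7e9  # smallest public Llama 2 variant

for bits in (16, 8, 4):
    gb = weight_memory_gb(LLAMA2_7B_PARAMS, bits)
    print(f"{bits:>2}-bit weights: ~{gb:.1f} GB")

# Roughly 14 GB at 16-bit, 7 GB at 8-bit, and 3.5 GB at 4-bit, which is why
# reducing footprint and DRAM traffic matters so much on a smartphone.

Because each generated token requires streaming most of these weights from memory, DRAM bandwidth rather than raw compute is often the bottleneck for on-device LLM inference.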

JC Hsu, Corporate Senior Vice President and General Manager of the Wireless Communications Business Unit at MediaTek, expressed his enthusiasm, stating, “The surging popularity of Generative AI marks a transformative trend within the digital landscape. Our vision is to empower the dynamic Llama 2 developer and user community with the essential tools to drive innovation within the AI sphere. Through our collaboration with Meta, we are uniquely positioned to provide hardware and software solutions with unparalleled edge computing capabilities.”

MediaTek projects that AI applications based on the Llama 2 model will soon be accessible to users of smartphones powered by the upcoming flagship SoC. The highly anticipated release is expected to hit the market by year’s end, signaling a remarkable advancement in on-device Generative AI technology.

This marks Meta’s second major chipmaker collaboration in recent months, highlighting the strategic importance of generative AI in the company’s future plans. Just last month, Meta announced a similar partnership with Qualcomm to bring Llama 2-based AI to Snapdragon-powered devices.

MediaTek, a prominent player in the semiconductor industry, will integrate Meta’s Llama 2 model into its processors to enable on-device AI capabilities. One of Llama 2’s distinct advantages over competitors such as GPT-4 is that its smaller variants can be packaged compactly enough to run on lighter, more portable devices like smartphones. By dividing data processing between local devices and cloud resources, this integration not only makes AI more accessible but also promises cost savings, personalization, and enhanced privacy.
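
As a minimal sketch of what such a split between local and cloud processing could look like, the routing logic below keeps small or privacy-sensitive requests on the device and sends heavier ones to a hosted service. The class, method names, and threshold are hypothetical illustrations, not part of MediaTek’s NeuroPilot tooling or any announced Meta API.

# Hypothetical hybrid router: on-device inference for small or sensitive
# prompts, cloud inference otherwise. All names and thresholds are illustrative.

class HybridLLMRouter:
    def __init__(self, local_model, cloud_client, max_local_words: int = 256):
        self.local_model = local_model      # e.g. a quantized Llama 2 runtime on the device
        self.cloud_client = cloud_client    # e.g. a hosted inference endpoint
        self.max_local_words = max_local_words

    def generate(self, prompt: str, sensitive: bool = False) -> str:
        # Keep private data and short prompts on-device for privacy and latency;
        # fall back to the cloud when the request needs more capacity.
        if sensitive or len(prompt.split()) <= self.max_local_words:
            return self.local_model.generate(prompt)
        return self.cloud_client.generate(prompt)

In practice the split could also be made per feature rather than per request, but the basic trade-off stays the same: privacy and latency on-device, capacity in the cloud.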

Anisha Bhatia, Senior Technology Analyst at GlobalData, underscores the significance of this collaboration in Meta’s effort to rebuild its reputation. Given recent concerns about data privacy and the use of personal information, Meta’s alignment with a technology that prioritizes on-device processing could help restore trust and confidence in the company.

However, it’s important to note that the field of Large Language Models (LLMs) is still in its early stages, and model capabilities can shift unpredictably as the technology evolves. That variability poses a reputational risk for companies building products on top of these models. Despite these uncertainties, investor enthusiasm for generative AI remains robust, indicating growing interest in the technology’s potential applications.

As Meta and MediaTek move forward with their collaboration, the integration of Llama 2-based LLMs into MediaTek’s chipsets could mark a significant step in democratizing AI and making it an integral part of various consumer-facing technologies. The partnership holds the potential to reshape the AI landscape, making it more accessible, customizable, and secure for users across the globe.
