Qualcomm Backs Hybrid AI Model at India AI Impact Summit 2026

At the India AI Impact Summit 2026, industry leaders highlighted India’s ambition to transition from being primarily a consumer of artificial intelligence technologies to becoming a global developer of sovereign AI infrastructure. Echoing this vision, Qualcomm advocated a hybrid AI deployment model that combines on-device intelligence with cloud computing to ensure scalability, efficiency, and affordability.

Sahil Arora, Business Head – AI & Data Centres at Qualcomm, outlined the company’s perspective on how AI in India is moving beyond research and model training toward real-world, citizen-centric deployment. He emphasized that AI solutions must operate closer to users rather than relying entirely on distant cloud infrastructure.

According to Arora, Qualcomm’s expertise in edge computing, built up over years of smartphone development, is now expanding into data centres. The company’s core philosophy, however, remains centered on performance per watt: running large language models efficiently both on devices and in the cloud, minimizing power consumption while maximizing performance.

He pointed out that models in the 7B to 10B parameter range can already function effectively on devices, alongside capabilities such as Automatic Speech Recognition (ASR) and Text-to-Speech (TTS). Running smaller AI models directly on devices and transmitting only processed text to the cloud can significantly reduce latency, connectivity demands, and overall compute requirements—an approach especially relevant at India’s massive scale.
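The bandwidth argument behind this architecture can be illustrated with a minimal sketch. The function names below (`transcribe_on_device`, `route`) are hypothetical stand-ins, not real Qualcomm APIs; the ASR step is stubbed out, since the point is only to compare what would be uploaded in each case.

```python
# Hypothetical sketch of hybrid on-device/cloud routing.
# transcribe_on_device and route are illustrative stubs, not real APIs.

def transcribe_on_device(audio_samples: list) -> str:
    """Stand-in for an on-device ASR model mapping audio to text.
    Returns a fixed transcript purely for illustration."""
    return "turn on the fan"

def route(audio_samples: list) -> dict:
    """Hybrid routing: run ASR locally, then upload only the text.
    Reports payload sizes to show the bandwidth saving."""
    text = transcribe_on_device(audio_samples)
    raw_bytes = len(audio_samples) * 2       # raw audio as 16-bit PCM
    text_bytes = len(text.encode("utf-8"))   # what actually reaches the cloud
    return {"cloud_payload": text, "raw_bytes": raw_bytes, "text_bytes": text_bytes}

# One second of 16 kHz audio (~32 KB raw) collapses to a short text string.
result = route([0] * 16000)
```

At India's scale, the difference between shipping raw audio and shipping a few bytes of transcribed text per request compounds into substantial savings in connectivity and cloud compute.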

Drawing parallels with India’s digital public infrastructure initiatives like Unified Payments Interface (UPI) and Aadhaar, Arora suggested that the country may now be on the brink of its next transformative shift driven by AI. However, he stressed that affordability will be crucial to ensuring widespread adoption.

For India’s diverse and large-scale use cases, Arora suggested that extremely large AI models may not be necessary. Instead, models in the 40 to 50 billion parameter range could offer an optimal balance between capability and efficiency for most practical applications.

He further emphasized that the future of AI in India lies in hybrid architecture—where GPUs and NPUs embedded in devices such as smartphones, automobiles, and smart home systems work in tandem with centralized cloud infrastructure. This blended approach, he said, represents the most viable pathway for India-scale AI deployment.
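A hybrid architecture implies a routing decision: which requests stay on the device NPU and which go to cloud GPUs. A toy decision rule, using the parameter ranges Arora cited (7–10B on device, up to 40–50B in the cloud), might look like the following. This is an illustrative policy, not Qualcomm's actual scheduler.

```python
# Illustrative routing policy (not Qualcomm's actual implementation):
# choose an execution target from the model's parameter count,
# using the size ranges mentioned in the article.

def pick_target(params_billion: float) -> str:
    """Models of up to ~10B parameters run on the device NPU;
    larger models are dispatched to cloud GPUs."""
    return "device" if params_billion <= 10 else "cloud"

# A 7B assistant model stays local; a 45B model goes to the cloud.
```

In practice such a rule would also weigh battery state, connectivity, and latency targets, but the parameter count captures the basic device-versus-cloud split.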

His remarks come at a time when the Indian government is expanding GPU infrastructure under the IndiaAI Mission, while private players continue to invest in building domestic data centre capacity. Together, these efforts signal India’s growing determination to establish itself as a key player in the global AI ecosystem.
