Perplexity founder and CEO Aravind Srinivas believes India is on the wrong path when it comes to building AI models
Srinivas believes that instead of finetuning a foundational model, Indian companies should focus on training their models from scratch
Currently, a majority of Indian AI companies are building products finetuned on open-source foundational models
AI search engine Perplexity founder and CEO Aravind Srinivas believes India is on the wrong path when it comes to building AI models.
Taking to X, Srinivas said, “Re India training its foundation models debate: I feel like India fell into the same trap I did while running Perplexity.”
Notably, Perplexity AI uses almost all of the major foundational models to provide real-time answers to users’ queries on the platform.
Srinivas believes that instead of finetuning a foundational model, Indian companies should focus on training their models from scratch.
He said that while the thinking models are going to be costly to train, India “must show the world that it’s capable of ISRO-like feet (sic) for AI”.
“I think that’s possible for AI (to train models frugally), given the recent achievements of DeepSeek. So, I hope India changes its stance from wanting to reuse models from open-source and instead trying to build muscle to train their models that are not just good for Indic languages but are globally competitive on all benchmarks,” he added.
DeepSeek is a China-based AI company that develops large language models (LLMs). On January 20, the company launched its latest reasoning models, DeepSeek-R1 and DeepSeek-R1-Zero, to take on models like OpenAI's o1.
DeepSeek, which has raised a little over $4 Mn, is being touted as a competitor to OpenAI.
“I’m not in a position to run a DeepSeek-like company for India, but I’m happy to help anyone obsessed enough to do it and open-source the models,” said Srinivas.
Currently, a majority of Indian AI companies are building products finetuned on open-source foundational models. However, notable exceptions include Ola’s Krutrim AI and the Indian government-backed BharatGen.
It is pertinent to note that Infosys cofounder Nandan Nilekani, in October last year, said that Indians shouldn’t be focusing on building foundational models from scratch. Instead, they should create synthetic data and smaller models quickly.
“We will use it (open-source foundational models) to create synthetic data, build small language models quickly, and train them using appropriate data,” he said at the time.
One of the major AI companies in India probably following his advice is Sarvam AI. The Lightspeed-backed startup is focusing on training its AI models using synthetic data.
However, Srinivas doesn’t agree with Nilekani. “… He’s (Nilekani) wrong in pushing Indians to ignore model training skills and just focus on building on top of existing models. (It is) essential to do both,” Srinivas wrote in a separate post.
Industry veterans like Google India research director Manish Gupta have previously shared similar views, saying that India would benefit from building foundational models.