Daybreak - DeepSeek cracked open AI. India’s AI plumbers are loving it

Episode Date: March 4, 2025

Up until recently, for most enterprises the default choice ended up being ChatGPT maker OpenAI's models. That was mainly because, for a long time, there were no serious alternatives. Then, in came DeepSeek R1. It proved that other models could compete with and even beat OpenAI, and at a fraction of its price. So now it's the one nudging enterprises to think twice before paying OpenAI for its services. And as a byproduct of that, over the last few months, the entire AI ecosystem has been moving from a one-size-fits-all approach to picking the best tools for the job. Basically, that means getting multiple models to work together. There is a huge opportunity here. Not for the consumer-AI startups that once dominated funding charts, but for LLMOps businesses. These are companies that glue together large language models, or LLMs, and optimise hardware and software to speed up computation. In the near future, these companies could grow faster than ever before. Tune in.

Daybreak is produced from the newsroom of The Ken, India’s first subscriber-only business news platform. Subscribe for more exclusive, deeply-reported, and analytical business stories.

The Ken is hosting its first live subscriber event! Join two long-term and contrarian CEOs, Nithin Kamath of Zerodha and Deepak Shenoy of Capitalmind, as they discuss the mental models, decision-making frameworks, and potential outcomes related to a very real possibility: an extended stock market winter that lasts 24 months or more. Click here to buy your tickets.

Transcript
[00:00:01] Hi, this is Rohin Dharmakumar. If you've heard any of The Ken's podcasts, you've probably heard me, my interruptions, my analogies, and my contrarian takes on most topics. And you might rightly be wondering why I am interrupting this episode too. It's for a special announcement. For the last few months, Seetharaman, my colleague and The Ken's deputy editor, and I have been working on an ambitious new podcast. It's called Intermission.
[00:00:28] We want to tell the secret-sauce stories of India's greatest companies. Stories of how they were born, how they fought to survive, how they built their organisations and culture, how they managed to innovate and thrive over decades, and, most importantly, how they're poised today. To do that, Seetharaman and I have been reading books, poring over reports, going through financial statements, digging up archives, and talking to dozens of people. And if that wasn't enough, we also decided to throw video into the mix. Yes, you heard that right. Intermission has also had to find its footing in the world of multi-camera shoots in professional studios, laborious editing, and extensive post-production. Seetharaman and I are still reeling from the intensity of our first studio recording.
[00:01:21] Intermission launches on March 23rd. To get alerted as soon as we release our first video episode, please follow Intermission on Spotify and Apple Podcasts, or subscribe to The Ken's YouTube channel. You can find all of the links at the ken.com slash IM. With that, back to your episode. About 70 years ago, a 25-year-old British athlete named Roger Bannister managed to do the impossible. He became the first person to run a mile in under four minutes. It was, and to a great extent still is, considered to be one of the greatest sporting feats of the last century. Up until then, no one could have even imagined humans running that fast.
[00:02:12] But then Bannister did it. And suddenly, it just wasn't so inconceivable anymore. Soon enough, lo and behold, multiple runners followed in his footsteps. And a new record was set. Now, there's an interesting parallel to draw here with all that's been going on in the world of artificial intelligence infrastructure. The Ken reporter Abhirami recently spoke to Arko Chattopadhyay, the co-founder and chief executive of an AI startup called Pipeshift. He said that the Chinese open-source AI model DeepSeek R1
[00:02:44] in many ways was like the Bannister of the AI world. Its entry back in January heralded what Chattopadhyay called the four-minute mile of AI. Because until then, for most enterprises, the default choice ended up being ChatGPT maker OpenAI's models. And that was mainly because, for a long time, there were no serious alternatives. Then, in came DeepSeek R1. It proved that other models could compete with and even beat OpenAI, and at a fraction of its price. That made enterprises think twice before paying OpenAI for its services. And as a byproduct of that, over the last few months, the entire AI
[00:03:25] ecosystem has been moving from the one-size-fits-all approach to picking the best tools for the job. Basically, that means getting multiple models to work together. Now, that is where a massive opportunity lies for smaller companies in this space, the likes of Pipeshift. Or just take the AI company Lyzr AI, for instance. It's been focusing on adding DeepSeek's functions to its AI agents. Now, these are essentially autonomous programs that can understand and perform specific tasks without human intervention. It's pretty cutting-edge stuff. And of course now, India wants in on the action.
[00:04:03] It's announced plans for its own foundational AI model within the next eight to ten months. Opinions are divided among AI builders about the practical relevance of these, but one thing is certain. There is a huge opportunity here. Not for the consumer-AI startups that were once dominating funding charts, but instead for LLMOps businesses.
[00:04:24] These are the companies that glue together large language models, or LLMs, and optimise hardware and software to speed up computation processes. In the near future, these companies could potentially start growing faster than ever before. Welcome to Daybreak, a business podcast from The Ken. I'm your host, Rahel Philipose, and I don't chase the news cycle. Instead, every day of the week, my colleague Snigdha Sharma and I will come to you with
[00:04:50] one business story that is worth understanding and worth your time. Let's talk about the tremendous opportunity that lies before LLMOps firms like Lyzr AI. The biggest impact of DeepSeek entering the picture was that its cheaper model was also open source, which means the software is openly available for use, study, modification, and sharing by absolutely anybody. It even pushed OpenAI to start rethinking its strategy of keeping its models closed source. But more than that, it also provides an incentive for other AI companies to crank up their R&D efforts and release open-source models of their own. Anirudh Narayan, the co-founder and chief growth officer of Lyzr AI, said that in the next 12 months,
[00:05:50] pretty much all major players will come out with models that are even better than DeepSeek. Remember, the Bannister of the AI ecosystem? Because R1 is open source, everyone can see how it works and can therefore come up with their own better versions. For LLMOps firms like Lyzr AI, it means a greater choice of models to work with and also a whole new category of clients, especially the ones who were unconvinced or unable to implement these solutions earlier.
[00:06:19] You see, during the initial AI boom, investors just wanted in on the technology by hook or by crook. That's why AI app startups, the kind that were simpler to build and easier for people to understand, were all the hype. The current AI infra companies were only starting out a couple of years ago. Before that, some of them worked with machine learning operations and vision language models. Even open-source LLMs weren't very common then. It was only after Llama 2's release in mid-2023 that people started using open-source models at scale. Now that DeepSeek is as good as it is, despite being open source, the market is ripe with opportunity for these companies.
[00:06:57] Just take Bengaluru-based Simply Smart, for instance. Now, this is an Accel- and Titan Capital-backed firm that builds Gen AI-based solutions. And generally, the enterprise clients it caters to have one of three very different types of privacy concerns. Some don't particularly care, like an AI chat app for entertainment purposes. Some want all of their data hosted on servers in one particular country, like a SaaS company. And the third type, typically large banks or companies with valuable IP, choose to host their models locally, on their own premises. But across the spectrum of privacy concerns,
[00:07:34] companies are seeing more opportunity in DeepSeek because it is open source. In fact, people in the industry say it really isn't even a choice anymore. Why would customers use closed-source models if they can get better results, a cheaper price, and complete data privacy using an open-source model? Chattopadhyay says there will also be a shift in the allocation of resources. You see, up until now, most companies that were using Microsoft Azure, the cloud computing platform, were putting the bulk of their spending into API access. That's what allows developers, applications, or systems to communicate with other software systems
[00:08:09] and access their data or features. But with DeepSeek in the picture, companies are discussing the possibility of shifting their budgets towards GPU hours. Now, that's a metric used in high-performance computing to measure how long a graphics processing unit is used to process tasks. That's what's required to run open-source models. There's also an opportunity here for LLMOps companies to double up as advisors to enterprises on how they can enhance their use of Gen AI. More on that in the next segment. The goal for a lot of LLMOps firms is to help companies replace human functions with agentic AI functions. Let me explain how that works with the help of an example.
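That budget shift, from per-token API access to GPU hours, ultimately comes down to arithmetic an enterprise can run itself. A minimal sketch of the break-even comparison, using made-up placeholder prices and throughput numbers rather than any vendor's actual rates:

```python
# Break-even sketch: per-token closed-source API pricing vs. renting GPU
# hours to self-host an open-source model. All numbers are illustrative
# placeholders, not actual OpenAI or cloud-provider rates.

def api_cost(tokens: int, price_per_million_tokens: float) -> float:
    """Monthly cost of serving `tokens` through a per-token API."""
    return tokens / 1_000_000 * price_per_million_tokens

def gpu_cost(tokens: int, tokens_per_gpu_hour: int,
             price_per_gpu_hour: float) -> float:
    """Monthly cost of serving the same tokens on rented GPUs."""
    gpu_hours = tokens / tokens_per_gpu_hour
    return gpu_hours * price_per_gpu_hour

monthly_tokens = 2_000_000_000  # hypothetical 2B-token monthly workload

via_api = api_cost(monthly_tokens, price_per_million_tokens=10.0)
via_gpus = gpu_cost(monthly_tokens, tokens_per_gpu_hour=5_000_000,
                    price_per_gpu_hour=3.0)

print(f"API: ${via_api:,.0f}/month vs GPUs: ${via_gpus:,.0f}/month")
# → API: $20,000/month vs GPUs: $1,200/month
```

Where the crossover actually lands depends on throughput and GPU utilisation, which is exactly the kind of advisory work LLMOps firms can take on.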
[00:08:54] Now, Lyzr's Narayan explained that a company may approach him and say they want to send automatic emails to their prospective clients in multiple languages from certain email databases. In which case, Lyzr will hook them up with one of its flagship products, an AI sales development representative agent called Jazon. Here, one can feed a data set of emails into the agent, which then proceeds to personalise emails based on its online search for the email ID holder's information. Later, the agent sets up conversations with relevant stakeholders. It also allows the company to do that across WhatsApp and Facebook. So all the salesperson has to do is show up for the scheduled call. Similarly, say you wanted to write a blog post based on some statistics with the help of an AI agent.
[00:09:40] You would essentially end up using three LLMs. The first would be Claude, the Gen AI assistant from San Francisco-based Anthropic. That does all the calculations. Then Google's Gen AI chatbot, Gemini, would write the blog, and one of OpenAI's models would help make it more creative. It naturally ends up making the whole process of writing a blog post both much easier and almost instantaneous. But that begs the question. How do you pick the right model for the job?
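Under the hood, that three-model workflow is a routing problem: each step of the task goes to whichever model the operator has designated for it. A minimal sketch with stubbed-out model calls standing in for Claude, Gemini, and an OpenAI model (the function names and routing table here are illustrative, not any vendor's actual SDK):

```python
# Best-model-for-the-job routing, sketched with stubs. In a real LLMOps
# stack, each stub would call a vendor API or locally hosted weights.

def claude_stub(prompt: str) -> str:
    # stands in for the model doing the number-crunching step
    return f"[analysis of: {prompt}]"

def gemini_stub(prompt: str) -> str:
    # stands in for the model writing the first draft
    return f"[draft from: {prompt}]"

def openai_stub(prompt: str) -> str:
    # stands in for the model polishing the prose
    return f"[polished: {prompt}]"

# Routing table: task step -> model. Swapping in, say, DeepSeek R1 for
# any one step is a one-line change here.
ROUTES = {
    "analyze": claude_stub,
    "draft": gemini_stub,
    "polish": openai_stub,
}

def write_blog_post(stats: str) -> str:
    analysis = ROUTES["analyze"](stats)
    draft = ROUTES["draft"](analysis)
    return ROUTES["polish"](draft)

print(write_blog_post("Q3 revenue statistics"))
# → [polished: [draft from: [analysis of: Q3 revenue statistics]]]
```

The routing-table design is the point: the glue code stays the same while models underneath it are swapped on cost or quality.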
[00:10:09] More times than not, it ends up coming down to cost. That's what Amit Sachin, vice president of AI at telecom giant Reliance Jio, told us. Sachin focuses on Gen AI applications and research. And he said that for use cases like content generation, an OpenAI model works just fine, especially on a smaller scale. But if you want to deploy models at a larger scale, an open-source model makes far more sense. For instance, JioCinema, the erstwhile form of JioHotstar, the latest over-the-top streaming platform from Reliance, uses LLMs to connect words a user types in the search box to their watch or search history,
[00:10:45] providing a more personalised and accurate result. DeepSeek R1 has basically opened up greater possibilities for AI infra companies to build agents on top of open-source models. And while these agents can now summarise and mimic human conversations, there is still a long way to go before you let them take over something like, say, bank transactions. Stay tuned. Now, we are likely to see a lot more DeepSeek moments going forward. So what's the next four-minute mile in the AI race going to look like?
[00:11:23] Well, first, for the convenience of their users, larger builders like OpenAI and Anthropic will eventually take over many of the functions that dedicated LLMOps firms are currently undertaking. The second, broader implication is that the nature of work itself will change once AI agents come into the wider market. The belief is that as adoption of agents goes up, machines will be able to take up a lot of the tasks humans are
[00:11:49] currently doing. But new jobs will also be created in the wake of various job functions being automated. Narayan told us, though, that it is still too soon to say. No one knows exactly what these will look like. That said, the government's priorities with the IndiaAI Mission may be more about the perception of having a model built for the country, and also about addressing certain concerns around data integrity. All things considered, it's still a boost for stakeholders here who want to get into the model-building business. And the more the models, the better the environment for enterprises. As for the near future, at least in the next six months or so, enterprises are sure to shift from using OpenAI to DeepSeek, and that too on a much more extensive scale. In other words,
[00:12:33] it's the moment for LLMOps firms to shine, at least for another year or so, before the next Bannister enters the picture. Daybreak is produced from the newsroom of The Ken, India's first subscriber-focused business news platform. What you're listening to is just a small sample of our subscriber-only offerings. A full subscription unlocks daily long-form feature stories, newsletters, and podcast extras.
[00:13:03] Head to the ken.com and click on the red subscribe button at the top of the website. Today's episode was hosted by Rahel Philipose and edited by Rajiv Sien.
