April 9, 2025

Modal Founder on Simplifying AI Infrastructure with Oracle Cloud Infrastructure


With offices in New York, Stockholm, and San Francisco, Modal is redefining how developers build and scale AI-based applications, providing a serverless platform that eliminates the need to manage infrastructure. 

Leveraging Oracle Cloud Infrastructure (OCI) AI infrastructure, Modal enables developers to run demanding AI workloads, such as model training, fine-tuning, and large-scale batch processing, without having to manage the underlying infrastructure or build higher-level tooling themselves.

In this conversation, Mahesh Thiagarajan, executive vice president, Oracle Cloud Infrastructure, sat down with Modal chief executive officer, Erik Bernhardsson—who previously led engineering teams at Spotify building its music recommendation system—to discuss the challenges of AI infrastructure, why Modal chose OCI, and what’s next for AI development at scale.

Thiagarajan: Could you give us a little background on Modal and what problems you’re solving? 

Bernhardsson: Modal is a serverless platform for AI, ML, and data developers. We make it easy to build and scale AI workloads in the cloud without dealing with infrastructure headaches. A lot of our work stems from frustration with existing platforms and tools, and with the developer experience they offer. Scaling AI infrastructure is hard: provisioning GPUs, managing containers, handling networking. So we built a platform where developers don’t have to think about any of that. Instead of spending weeks setting up infrastructure, they can just write code and deploy it instantly.
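
To make that workflow concrete, here is a minimal sketch of what a GPU-backed function looks like with Modal’s public Python SDK. The model, image packages, and GPU type are illustrative placeholders, and exact parameter names may vary between SDK versions.

```python
# Minimal sketch of a serverless GPU function on Modal (illustrative only;
# the model, packages, and GPU type are placeholders, not Modal defaults).
import modal

# Container image with the dependencies the function needs.
image = modal.Image.debian_slim().pip_install("transformers", "torch")

app = modal.App("inference-example", image=image)


@app.function(gpu="A100")  # request a GPU; Modal provisions it on demand
def generate(prompt: str) -> str:
    from transformers import pipeline

    pipe = pipeline("text-generation", model="gpt2")  # placeholder model
    return pipe(prompt, max_new_tokens=50)[0]["generated_text"]


@app.local_entrypoint()
def main():
    # `modal run this_file.py` executes this entrypoint locally and the
    # function remotely; there are no servers or clusters to set up first.
    print(generate.remote("Serverless GPUs let developers"))
```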

Thiagarajan: As a startup, what are your top challenges—tech, hiring, funding?  

Bernhardsson: Hiring and growth. Right now, capital isn’t a bottleneck for us; it’s about supporting our customers and bringing in the right people to scale effectively. I’ve worked at startups my whole career, 15 to 20 years, and the core thing is your people and who you hire. Hire scrappy, commercially minded, smart people, and you’re 80% of the way there.

Thiagarajan: On the business side, what are some of the problems Modal is solving? 

Bernhardsson: Customers want to focus on what they’re good at, which is training models or building great consumer experiences, while we handle all the scaling, infrastructure management, provisioning, and containerization. Take one of our customers, Suno. Instead of hiring a massive team to manage their infrastructure, they use Modal. We handle the compute, the scaling, and the deployment, so they can focus on their core product. Our biggest value proposition to customers is that their engineers can move faster with Modal. They don’t have to build and maintain their own platform or worry about capacity planning. With Modal, developers who are using generative AI to create new applications, such as those used to compose music or design novel protein structures, can access a platform that delivers fast and reliable GPUs. Because Modal runs on OCI AI infrastructure, developers building on Modal can ship code, get their data and AI models into production, and test applications much faster. Customers, especially AI startups, also love our serverless business model and pricing, which make it easy for them to get started with a small team and build something really cool without breaking the bank.
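
As a sketch of the scaling side of that value proposition, the snippet below fans a batch job out across many containers with Modal’s map primitive. The per-item work is a stand-in, and it assumes the same SDK surface as the earlier example.

```python
# Hedged sketch of large-scale batch processing on Modal: one function,
# fanned out across many parallel containers. The per-item work is a
# placeholder; real jobs might run inference or preprocessing per input.
import modal

app = modal.App("batch-example")


@app.function()
def process(item: int) -> int:
    # Stand-in for per-item work (e.g., embedding or transcoding one file).
    return item * item


@app.local_entrypoint()
def main():
    # map() runs the function over the inputs in parallel containers;
    # Modal handles provisioning, scaling, and teardown.
    results = list(process.map(range(1000)))
    print(sum(results))
```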

Thiagarajan: Can you share details about why you chose OCI and how it’s helping Modal solve problems for customers?

Bernhardsson: We’ve been very impressed with the hardware and the fact that Oracle offers bare metal instances, along with super-fast networking, disks, and CPUs, and then the GPUs themselves, of course. OCI’s bare metal instances have been phenomenal at delivering value for our customers. We can provide the capacity our customers need, often with very short lead times. But beyond that, the agility of Oracle’s sales process has been a game changer. If I need to do a large scale-out of my GPU environment, I can literally text someone and get the capacity provisioned quickly. That kind of flexibility and partnership is rare.

Thiagarajan: AI infrastructure is getting crowded. How do you see the market? Where does OCI fit in, and how do you differentiate? 

Bernhardsson: I think what Oracle is doing extremely well is managing the hardware, the compute, the data centers, and doing that at a massive scale. Modal, on the other hand, is focused on the developer experience. We provide a serverless platform with clean APIs, Python SDKs, and instant execution. It’s a different part of the stack. There’s plenty of room for multiple players in this space. OCI AI infrastructure provides the raw power, and we make it easy for developers to access and scale it without friction.

Thiagarajan: Everyone’s talking about AI, but what’s underreported or overlooked?

Bernhardsson: I think audio and music are interesting. Historically, music tech has often been ahead of the curve—whether it was the gramophone, MP3s, or music streaming. Big shifts in technology tend to show up in music before spreading to other areas. We’ve seen a lot of AI applications already, but I still feel like we’re very early in understanding where this is all going. Fast forward 10 or 20 years and I think AI-generated music and audio will be far beyond where we are today. It’s exciting to watch because I think there’s a lot of untapped potential that people aren’t talking about as much as other AI breakthroughs.

Thiagarajan: What broader AI trends do you find interesting? 

Bernhardsson: One of the most exciting shifts is the realization that AI breakthroughs don’t have to rely on massive compute budgets. It’s been a little frustrating for me as someone with a math, computer science, and physics background, because I want to believe in small, lean, scrappy teams solving very hard problems. There are very few technological breakthroughs in the past where capital was the limiting factor; most were driven by lean teams making clever innovations. A small, smart team that’s focused can make a lot of progress. If AI is going to break through and change the world, we need it to be available to small teams.

Thiagarajan: I think the point you’re illustrating is that continuous innovation can mean finding creative ways to use hardware to get the same outcome a lot more efficiently, right? 

Bernhardsson: Exactly. We’re going to see a major shift toward efficiency in AI infrastructure. Fifteen or even ten years ago, you didn’t need a billion dollars to make a breakthrough; now you do. Over the next 10 to 15 years, I expect AI training costs to drop by orders of magnitude as we refine models, improve algorithms, and develop more efficient hardware. We still have a long way to go, and AI training is wildly inefficient compared to the human brain. There’s so much room for improvement, and that’s where the next wave of innovation will come from.

OCI helps startups like Modal scale AI faster—without breaking the bank. With dedicated servers, full control, and built-in security, OCI provides reliability for AI innovators. See how OCI powers AI workloads here.


