Server room filled with GPU racks powering AI cloud services for Runpod. Photo by Brett Sayles on Pexels.

Runpod, a cloud platform for AI applications, has reached a $120 million annual revenue run rate. The company started four years ago when two friends in New Jersey repurposed their cryptocurrency mining computers into AI servers and found their first users by posting about it on Reddit. Founders Zhen Lu and Pardeep Singh built the business without outside money at first, and it now serves 500,000 developers around the world.

Background

Zhen Lu and Pardeep Singh worked as developers for Comcast in late 2021. As a hobby, they mined Ethereum cryptocurrency on machines set up in their basements. These machines used graphics processing units, or GPUs, which excel at the parallel calculations mining requires. But mining was not paying off: the hardware was expensive, and the returns did not cover the cost. On top of that, Ethereum was preparing a network update called The Merge, which would end the proof-of-work mining they did.

The two friends talked about what to do next. They both knew machine learning from their jobs. They saw that developers had a hard time using GPUs for AI work. The software and setup were messy and slow. So they decided to turn their mining rigs into servers for AI tasks. This was before tools like ChatGPT became popular, but they could see AI was growing fast.

They set out to fix the problems they had faced themselves. They wanted a platform that was easy to use, fast to set up, and simple for developers. In early 2022, they put up a post on Reddit sharing what they had built. People noticed. Developers started using it. That post brought in the first users, and within months the platform had real traction.

At first, Runpod was just a side project. Lu and Singh ran it from their homes. They took none of the loans or free-credit offers that similar companies relied on. They made sure the business paid for itself from day one. No free tiers for users. Every customer paid something. This kept things lean.

Key Details

Runpod offers cloud services focused on AI. Developers can rent GPU servers to train models, run tests, or deploy apps. The platform has tools like APIs and command-line interfaces to make work easier. There is also a serverless option that handles setup automatically. Users can scale from one GPU to thousands and back down quickly, with start times under 500 milliseconds.
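The serverless flow described above can be sketched as a handler function the platform invokes per job, so developers never manage servers directly. This is a minimal, hypothetical illustration of the pattern only; the function names and payload shape are assumptions, not Runpod's actual SDK:

```python
# Hypothetical sketch of a serverless GPU handler pattern.
# Names and payload shape are assumptions, not Runpod's actual SDK.

def handler(job):
    """Receive a job payload and return a result.

    On a serverless GPU platform, a function like this is invoked
    on demand, and workers scale up and down automatically.
    """
    prompt = job["input"]["prompt"]
    # A real handler would run model inference here; we echo instead.
    return {"output": f"processed: {prompt}"}

# Local simulation of how the platform might invoke the handler.
if __name__ == "__main__":
    job = {"id": "job-1", "input": {"prompt": "hello"}}
    print(handler(job))
```

The appeal of this model is that the developer writes only the function body; provisioning, scaling, and teardown are the platform's job.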

Growth came fast. In six months, businesses wanted to use it for real production work. But customers said they could not trust servers in basements. So the founders partnered with data centers to expand. By May 2024, Runpod had 100,000 developers. That caught the eye of investors.

Radhika Malik, a partner at Dell Technologies Capital, saw their Reddit posts and reached out. She led a $20 million seed funding round along with Intel Capital. Julien Chaumond, co-founder of Hugging Face, became an investor too. He found them through the support chat while using the product.

Customer Base and Reach

Today, Runpod has 500,000 developers as customers. Some are solo hobbyists. Others are big companies like Replit, Cursor, OpenAI, Perplexity, Wix, and Zillow. Fortune 500 teams spend millions a year on it. The cloud spans 31 regions worldwide, letting users pick servers close to them to reduce latency.

The company stays profitable. It bootstrapped to over $1 million in revenue before the seed round. Now at a $120 million run rate, it plans a Series A raise. The founders say they enter talks from a strong position, with real growth numbers behind them.

"The actual experience of developing software on top of GPUs was just hot garbage," Zhen Lu said.

Runpod keeps a focus on developers. It builds what users need based on feedback. The platform handles different kinds of compute and storage to make one smooth cloud for AI work like training, fine-tuning, and running models.

What This Means

Runpod shows how a small idea can grow big in AI infrastructure. The market is crowded. Big clouds like AWS, Google Cloud, and Microsoft Azure offer GPUs. Newer players like CoreWeave, Lambda Labs, and Crusoe Cloud focus on AI too. Runpod stands out with easy tools built by developers for developers. It skips heavy operations work and stays close to the community it started from.

Demand for AI computing keeps rising. More apps need GPUs to run models. Runpod's path from basement to global scale suggests others might follow. Bootstrapping first builds discipline. Charging customers from the start proves value. Now, with backing from Dell and Intel, it can expand faster.

The company bets on AI changing coding. Developers will build and run AI agents more. Platforms like Runpod make that possible. Its serverless tools let teams deploy without managing hardware. This could pull more users from bigger clouds seeking simpler options.

Enterprise deals are key. Multi-million-dollar contracts show trust. As AI moves to production, reliable scale matters, and Runpod's 31 regions help with that. Plans for a Series A mean more data centers and features ahead.

Competition will heat up. Hyperscalers are adding AI services. Specialists chase the same customers. Runpod's edge is speed and ease. No debt, no free tiers: it stays focused on paying users. This model supports long-term growth without burning cash.

The Reddit start matters. It built a user base organically. Today, word spreads the same way. Developers share tips and builds. This community keeps the platform sharp. As AI booms, Runpod aims to be the go-to cloud for custom systems.

Author

  • Tyler Brennan

    Tyler Brennan is a breaking news reporter for The News Gallery, delivering fast, accurate coverage of developing stories across the country. He focuses on real time reporting, on scene updates, and emerging national events. Brennan is recognized for his sharp instincts and clear, concise reporting under pressure.