Data center servers powering AI cloud infrastructure for the Runpod platform. Photo by Brett Sayles on Pexels.

Runpod, a cloud platform for AI applications, has reached a $120 million annual revenue run rate. The company, founded four years ago by Zhen Lu and Pardeep Singh, started when the two friends repurposed their basement computers from cryptocurrency mining to AI servers. Based in New Jersey, they grew the business through online posts and word of mouth, and now serve 500,000 developers worldwide, including big names like OpenAI, Perplexity, Replit, Cursor, Wix, and Zillow.

Background

Zhen Lu and Pardeep Singh were working as developers at Comcast when they started their side project in late 2021. They had spent around $50,000 on specialized computers with powerful graphics processing units, or GPUs, to mine Ethereum from their basements in New Jersey. They made some cryptocurrency, but not enough to cover costs, and after a few months they found it boring. Mining also had an expiration date: Ethereum's network upgrade, The Merge, would move the network to proof of stake and end GPU mining.

Having convinced their wives to let them buy the equipment, the two needed a new use for it to keep the peace at home. At work, they handled machine learning tasks, so they decided to turn the mining rigs into AI servers. This was before tools like ChatGPT or DALL-E 2 became widely known.

As they worked on the hardware, they noticed problems with the software tooling for GPUs: the setup felt slow and hard to use. Lu said the experience of building software on GPUs was poor, and they set out to fix that. Runpod grew out of their wish to make GPU computing easier for developers.

In early 2022, they launched the platform. It lets people host AI apps with fast setup, simple hardware choices, and tools like APIs and command-line options. There is also a serverless option that handles provisioning automatically. At first, it supported basics like hosting Jupyter notebooks and web apps.

Key Details

To find early users, Lu and Singh posted in AI groups on Reddit, offering free server access in exchange for feedback. As first-time founders, they did not know much about marketing, but the posts brought beta testers, then paying customers. Within nine months, they had quit their jobs and hit $1 million in revenue.

Growth sped up after ChatGPT launched, with users arriving from Reddit and Discord. The challenge was keeping GPUs available: Singh said that if capacity runs out, users switch providers.

Venture capitalists noticed them on Reddit, too. Radhika Malik of Dell Technologies Capital saw the posts and reached out, in what became their first conversation with an investor. Lu did not know how to pitch; Malik walked him through how investors think and stayed in touch.

They ran the business for almost two years without outside money. Runpod charged from the start and covered its own costs, unlike some rivals that took on debt or offered free tiers. Many other AI clouds also began as crypto mining operations, but Runpod avoided that debt-fueled path.

Growth and Funding

By May 2024, Runpod had 100,000 developers. That month, the company raised $20 million in seed funding, co-led by Dell Technologies Capital and Intel Capital. Other backers joined, including Nat Friedman and Hugging Face co-founder Julien Chaumond, who had discovered the company through its support chat while using the product.

Today, Runpod serves 500,000 developers, from solo users to Fortune 500 teams spending millions of dollars yearly. The cloud spans 31 regions globally and handles AI training, fine-tuning, and inference across a range of compute and storage configurations.

The serverless endpoints let users scale to thousands of GPUs or down to zero, with cold starts under 500 milliseconds. Recent updates improved GitHub integration for faster container builds.

"We felt that the actual experience of developing software on top of GPUs was just hot garbage," Lu said.

Runpod keeps a small team in which people wear multiple hats: sales staff understand the technology, engineers think about product, and operations staff handle sales tasks. The team talks often with the 300,000-plus developers who run production workloads on the platform.

What This Means

Runpod shows how a simple online post can spark fast growth in AI infrastructure. Demand for GPU compute has boomed alongside AI applications, and the company meets the need for reliable, fast scaling without users having to manage hardware.

The founders plan a Series A raise soon, and their revenue run rate positions them for strong terms. Customers rely on them for critical workloads, a significant signal of trust.

The platform helps bridge supply gaps in AI compute. With users like OpenAI and Perplexity, it powers real production applications. This growth comes as AI shifts from experiments to everyday tools.

Runpod focuses on speed and cost, giving developers what they need without added complexity. As the AI field changes fast, the company aims to ship updates quickly while sticking to basics like affordable compute.

Enterprise teams now spend heavily on the platform, showing trust in the infrastructure, and its reach across 31 regions helps serve users anywhere. The story, from basement mining rigs to a $120 million run rate, highlights timing and execution in a hot market.

Lu and Singh built for developers like themselves. They care about user needs and move fast, an approach that has drawn major backers and clients. Runpod positions itself as a foundation for future AI companies.

Author

  • Lauren Whitmore

    Lauren Whitmore is an evening news anchor and senior correspondent at The News Gallery. With years of experience in broadcast style journalism, she provides authoritative coverage and thoughtful analysis of the day’s top stories. Whitmore is known for her calm presence, clarity, and ability to guide audiences through complex news cycles.