Reflection AI announced Thursday it has closed a $2 billion funding round, catapulting the year-old startup into elite territory with an $8 billion valuation as investor enthusiasm for artificial intelligence shows no signs of slowing.
The funding round was led by chipmaker Nvidia, with participation from heavyweight investors including former Google CEO Eric Schmidt, Citi, and Donald Trump Jr.-backed private equity firm 1789 Capital. Existing investors Lightspeed, Sequoia, DST, B Capital, and CRV also joined the round, according to the company's announcement.
Founded in March 2024 by former Google DeepMind researchers Misha Laskin and Ioannis Antonoglou, Reflection AI originally focused on autonomous coding agents.
The company is now expanding its ambitions to build open-source frontier AI models that can compete with both Western closed labs like OpenAI and Anthropic, and Chinese AI firms such as DeepSeek.
Laskin, who led reward modeling for DeepMind's Gemini project, serves as CEO, while Antonoglou, co-creator of the legendary AlphaGo AI system that defeated the world Go champion in 2016, serves as CTO.
The duo has assembled a team of approximately 60 AI researchers and engineers working across infrastructure, training data, and algorithm development.
The valuation leap is dramatic. Just seven months ago, Reflection raised $130 million at a $545 million valuation, according to PitchBook data. The new $8 billion valuation represents one of the largest jumps in recent AI startup history.
Building America's answer to DeepSeek
Reflection's pivot comes as Chinese AI models have demonstrated that cutting-edge performance doesn't require massive budgets. DeepSeek's R1 model stunned the industry by matching the capabilities of much more expensive models with training costs reportedly as low as $6 million.
In an interview with TechCrunch, Laskin framed the competitive landscape in stark terms, calling DeepSeek and other Chinese models a wake-up call: if American companies don't act, he said, the global standard of intelligence won't be built by America.
Laskin argued that this puts the U.S. and its allies at a competitive disadvantage, since enterprises and sovereign states often avoid Chinese models over potential legal repercussions.
The choice, he said, is between accepting that disadvantage or rising to the challenge.
The company's mission has garnered support from U.S. government officials. David Sacks, the White House AI and Crypto Czar, posted his endorsement on X, expressing enthusiasm for more American open-source AI models and emphasizing the desire for U.S. leadership in this category.
Open-source strategy with proprietary elements
Reflection's approach to being "open" focuses on accessibility rather than full transparency. According to Laskin, the company will release model weights for public use while keeping datasets and full training pipelines proprietary.
Laskin explained the rationale to TechCrunch, noting that model weights are the most impactful element because anyone can use and tinker with them, while infrastructure stacks can only be utilized by a select handful of companies.
The business model centers on serving large enterprises and governments.
Researchers will access the models freely, but revenue will come from enterprises building products on Reflection's models and governments developing sovereign AI systems—AI models developed and controlled by individual nations.
Laskin told TechCrunch that large enterprises by default want open models they can own, run on their own infrastructure, control costs for, and customize for various workloads, particularly given the significant financial investment AI requires.
What's next for Reflection
The company has not yet released its first model, which will initially be text-based with multimodal capabilities planned for the future.
The funding will primarily support the computing resources needed to train new models. Reflection has secured a compute cluster and aims to release a frontier language model trained on tens of trillions of tokens in early 2026.
In a statement on X, the company highlighted its achievement of building what was once thought possible only inside the world's top labs: a large-scale LLM and reinforcement learning platform capable of training massive Mixture-of-Experts models at frontier scale.
The investment arrives as the AI sector continues attracting massive capital. Global venture funding in the third quarter of 2025 rose 38% year-over-year to $97 billion, with approximately 46% of that total directed toward AI firms, according to industry data.