US AI Startups Are Embracing China's Open-Source Models

Bitsfull · 2026/03/24 11:13


Over three years, the United States imposed four rounds of export controls, covered 24 categories of semiconductor equipment, and listed over 140 entities in an attempt to cut off China's access to advanced AI chips. Yet according to a report released by the U.S.-China Economic and Security Review Commission (USCC) on March 24, 80% of U.S. AI startups are using Chinese open-source models.


The wall is built at the hardware level. The door is open at the software level.


This contradiction is not an abstract policy debate. Just last week, Cursor, the AI programming tool valued at $29.3 billion, was found to have built its flagship feature, Composer 2, on Kimi K2.5 from Moonshot AI. A Chinese company's model is driving the top AI development tool in the U.S.


Meanwhile, the Pentagon has slapped the "supply chain risk" label on Anthropic, a U.S. company.


The direction of control and the direction of actual reliance are completely opposite.


U.S. chip controls have escalated continuously since the first round of export restrictions on A100/H100-class chips in October 2022. In 2023, the H800 loophole was closed and the performance-density control metric was expanded. December 2024 brought another round: controls on 24 categories of semiconductor equipment, 140 Chinese entities blacklisted, and even high-bandwidth memory (HBM) and DRAM brought within scope. In January 2025, the Commerce Department introduced an "AI Diffusion Framework" that attempted to establish a global control regime at the model level, but the framework was withdrawn two days before taking effect. By December 2025, Trump had reversed course and allowed H200 chips to be exported to approved customers in China.



In the second half of this control timeline, the release pace of Chinese open-source models accelerated. In 2024, the DeepSeek-V2 and Qwen 2.5 series were open-sourced in succession. On January 20, 2025, DeepSeek-R1 and Kimi K1.5 were released on the same day, the former briefly topping the U.S. App Store download chart ahead of ChatGPT. In the second half of 2025, Kimi-K2 and GLM-4.5 followed. By early 2026, ByteDance's Doubao 2.0 had 155 million weekly active users, and Kimi K2.5 had been adopted directly by Cursor. The tighter the controls, the more models emerged.


According to official data from HuggingFace, the market share of Chinese open-source models in global downloads surged from around 1.2% at the end of 2024 to about 30% in early 2026. The cumulative download count of Alibaba's Qwen series surpassed 700 million times in January 2026, officially overtaking Meta's Llama. Chip restrictions have not hindered China's AI software output but may have instead accelerated a strategic shift toward the open-source path.


This is no coincidence of data. The USCC report describes the phenomenon with a precise framework: "dual circulation." In the hardware circulation, China is constrained by chip supply bottlenecks. In the software circulation, China penetrates global AI infrastructure through open-source models, creating downstream dependencies. The two circulations pull in opposite directions yet reinforce each other. The controls limited China's access to top-tier computing power, but they also drove a technological path that accomplishes more with less compute. DeepSeek-R1 reaching frontier performance at an inference cost far below GPT-4o's is a product of this path.



The change on HuggingFace is visible to the naked eye. According to platform statistics, at the end of 2024 Llama-derived models accounted for about 60% of new language models, with Qwen at just over 10%. By mid-2025 a crossover had occurred: according to the official HuggingFace blog, the share of Qwen derivatives soared to over 40% while Llama fell to around 15%. By early 2026, Qwen derivatives approached half of new models, and Llama had shrunk further to about 12%.


The speed of this crossover exceeded most expectations. Two years ago, open-source AI was almost synonymous with Meta's Llama ecosystem: global developers fine-tuned, deployed, and built products on Llama. Now the same thing is happening in the Qwen ecosystem, only faster and at broader scale.


This means that when global developers build AI applications, they are increasingly choosing Chinese models as the underlying foundation, not out of political stance but because of performance and openness. The Qwen 2.5 series spans parameter counts from 0.5B to 72B, letting developers fine-tune and deploy on their own hardware without paying for API calls to OpenAI or Anthropic. Open source eliminates vendor lock-in and erases borders.


An important detail: according to a February report by MIT Technology Review, Chinese AI companies are differentiating their open-source strategies. DeepSeek focuses on extreme cost efficiency, Kimi emphasizes long context and coding ability, and Qwen pursues full parameter coverage. This multi-track advance is enriching global developers' choices. China's open-source models are redefining the global AI supply chain on the strength of their capabilities.


But what does the end of this supply chain look like?


On March 19, developer @fynnso found the model ID accounts/anysphere/models/kimi-k2p5-rl-0317-s515-fast in the Cursor codebase. Cursor co-founder Aman Sanger later acknowledged that Composer 2 is built on Kimi K2.5. According to Cursor VP Lee Robinson, "The base model only accounts for about a quarter of the compute; the rest comes from our own training." But the base is still the base. For a product valued at $29.3 billion, the base model comes from Moonshot AI, a Chinese company funded by Alibaba and HongShan (formerly Sequoia China).



Putting this dependency chain next to the Pentagon's moves makes the absurdity even more apparent. On March 5, the Pentagon officially labeled Anthropic a "supply chain risk." According to NPR, the reason was that Anthropic CEO Dario Amodei refused to compromise on two red lines: the use of AI in autonomous weapons and in mass surveillance of American citizens. Trump gave the military six months to phase out Claude, which was deeply embedded in military and national security platforms. Anthropic sued the Pentagon on March 9.


On one side, the U.S. government slaps a "supply chain risk" label on its own company; on the other, 80% of U.S. startups run on Chinese models. The former is political theater, the latter technological reality. The two never intersect.


80% of U.S. startups are running on Chinese models, and the Pentagon's risk label is stuck on a U.S. company. Controls pile up at the hardware level; dependencies quietly grow at the software level. On the other side of the three-year chip wall, a new fact is taking shape: Chinese open-source AI is no longer a "follower" but a supplier of global AI infrastructure.