Author: Carbon Chain Value
The AI+Crypto trend is unfolding rapidly, but not in the way most people imagined. This time it is unfolding as one side crushing the other: AI first hammers the traditional capital markets, and then the crypto market.
On January 27, downloads of DeepSeek, a dark-horse Chinese AI large model, surpassed ChatGPT for the first time, topping the US App Store chart and drawing intense attention and coverage from the global technology and investment communities as well as the media.
The event not only raised the possibility that the future pattern of US-China technological development could be rewritten; it also sent a wave of short-term panic through the US capital markets. Nvidia fell 5.3%, ARM 5.5%, Broadcom 4.9%, and TSMC 4.5%, while Micron, AMD, and Intel posted declines of their own. Nasdaq 100 futures dropped around 400 points, on track for the largest single-day fall since December 18. By incomplete estimates, more than $1 trillion in US stock market value was expected to evaporate in Monday's trading, roughly one-third of the total market capitalization of the crypto market.
The crypto market, which closely tracks US equities, dropped sharply in the wake of DeepSeek's surge. Bitcoin fell below $100,500, down 4.48% over 24 hours; ETH fell below $3,200, down 3.83%. Many people were still scratching their heads over why the crypto market plunged so quickly, guessing it might be tied to reduced expectations of Federal Reserve rate cuts or other macro factors.
So where did the market panic come from? DeepSeek did not grow by amassing capital and graphics cards the way OpenAI, Meta, or even Google did. OpenAI was founded 10 years ago, has 4,500 employees, and has raised $6.6 billion to date. Meta is spending $60 billion on an artificial intelligence data center nearly the size of Manhattan. By contrast, DeepSeek was founded less than two years ago, has 200 employees, and its development cost was under $10 million; it never spent vast sums piling up Nvidia GPUs.
One can't help but ask: How can they compete with DeepSeek?
What DeepSeek breaks is not only the cost advantage built on capital and technology, but also the assumptions and ways of thinking people have long taken for granted.
Dropbox's vice president of product remarked on X that DeepSeek is a classic disruption story: incumbents optimize existing processes, while disruptors rethink the fundamentals. DeepSeek asked: what if we just did this smarter instead of throwing more hardware at it?
Keep in mind that training top AI models is currently extremely expensive. Companies like OpenAI and Anthropic spend upwards of $100 million on compute alone, and they need large data centers packed with thousands of $40,000 GPUs, much as you would need an entire power plant to run a factory.
DeepSeek showed up and said, "What if we did this for $5 million?" And they didn't just say it, they did it. Their models match or beat GPT-4 and Claude on many tasks. How? They rethought everything from scratch. Traditional AI is like writing every number with 32 decimal places; DeepSeek asked, "What if we only used 8? It's still accurate enough!" That alone means 75% less memory.
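The "32 versus 8 decimal places" line is a metaphor for numerical precision. Here is a minimal sketch of the underlying arithmetic, assuming the comparison is simply 32-bit versus 8-bit storage of the same weights (the parameter count is made up, and the sketch ignores the scaling factors real quantization schemes need):

```python
# Rough illustration of why lower numerical precision cuts memory:
# storing the same weights in 8 bits instead of 32 bits uses 1/4 of the bytes.
import numpy as np

n_params = 10_000_000  # illustrative parameter count, not a real model size

weights_fp32 = np.random.randn(n_params).astype(np.float32)  # 4 bytes per value
weights_int8 = np.zeros(n_params, dtype=np.int8)             # 1 byte per value

mb = 1024 ** 2
print(f"32-bit storage: {weights_fp32.nbytes / mb:.1f} MB")
print(f" 8-bit storage: {weights_int8.nbytes / mb:.1f} MB")
print(f"memory saved:   {1 - weights_int8.nbytes / weights_fp32.nbytes:.0%}")
```

Storing the same values in a quarter of the bits is exactly the 75% memory saving the quote refers to.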
The results, the Dropbox VP says, are astounding: training cost down from $100 million to $5 million; GPUs required cut from 100,000 to 2,000; API costs reduced by 95%; and it can run on gaming GPUs, with no data-center hardware required. Most importantly, it is all open source. This isn't magic, just incredibly clever engineering.
Others argued that DeepSeek has completely overturned the conventional wisdom in the field of artificial intelligence:
China only does closed source/proprietary technology.
Silicon Valley is the center of global artificial intelligence development and has a huge lead.
OpenAI has an unparalleled moat.
You need to spend billions or even tens of billions of dollars to develop a SOTA model.
The value of the model will keep accumulating (the "fat model" hypothesis).
The scaling assumption: model performance improves in proportion to what is poured into training (compute, data, GPUs). All of this conventional wisdom was shaken, if not completely overturned, overnight.
Archerman Capital, a well-known American equity investment firm, commented on DeepSeek in a briefing. First, DeepSeek represents a victory of open source over closed source: its contribution to the community will quickly translate into prosperity for the entire open-source ecosystem, and open-source players, Meta included, are likely to build further on this foundation. Open source grows because everyone keeps adding fuel to the fire.
Secondly, OpenAI's brute-force "scale produces miracles" path looks somewhat crude and simplistic for now, but it cannot be ruled out that a new qualitative leap will occur once a certain scale is reached, widening the gap between closed source and open source again; it is hard to say. Judging from 70 years of AI history, computing power has been crucial, and it may remain so in the future.
Third, DeepSeek makes open-source models as capable as closed-source ones, and more efficient, so there is less need to pay for OpenAI's API. Private deployment and independent fine-tuning give downstream applications far more room to grow. Over the next year or two we will most likely see a richer supply of inference chips and a more prosperous LLM application ecosystem.
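As an illustration of what private deployment can look like in practice, here is a minimal sketch using the Hugging Face transformers library; the model identifier is only an assumed example of an open-weights checkpoint, and the snippet shows plain local inference, not DeepSeek's actual serving stack:

```python
# Minimal local (private) deployment of an open-weights LLM with Hugging Face
# transformers. Requires: pip install transformers accelerate torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed/illustrative checkpoint; substitute any open-weights model you have locally.
model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Explain in one sentence why cheaper training can increase total compute demand."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the weights run entirely on local hardware, the same setup can be fine-tuned on private data without sending anything to a closed API.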
Finally, the demand for computing power will not shrink. The Jevons paradox observes that improvements in steam-engine efficiency during the First Industrial Revolution actually increased total coal consumption. It is like the shift from the "brick phone" era to the mass adoption of Nokia handsets: phones became popular because they were cheap, and that popularity pushed total consumption in the market up.
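A back-of-the-envelope sketch of that Jevons logic, using hypothetical numbers for compute cost and demand elasticity (none of these figures come from the article or from real market data):

```python
# Toy Jevons-paradox arithmetic: if efficiency gains cut the cost per unit of
# compute and demand is sufficiently elastic, total spending on compute rises
# even though each unit is cheaper. All numbers are hypothetical.

def total_spend(cost_per_unit: float, elasticity: float,
                base_cost: float = 1.0, base_demand: float = 100.0) -> float:
    """Constant-elasticity demand: demand = base_demand * (cost/base_cost) ** (-elasticity)."""
    demand = base_demand * (cost_per_unit / base_cost) ** (-elasticity)
    return demand * cost_per_unit

before = total_spend(cost_per_unit=1.0, elasticity=1.5)   # pre-efficiency baseline
after = total_spend(cost_per_unit=0.05, elasticity=1.5)   # 95% cheaper per unit
print(f"spend before: {before:.0f}, spend after: {after:.0f}")  # after > before
```

With elasticity above 1, a 95% drop in unit cost more than quadruples total spending in this toy model, which is the Jevons-style outcome the article anticipates for computing power.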