Zhu Xiaohu of Jinsha River Ventures: DeepSeek is Making Me Believe in AGI
DeepSeek's open-source model is disrupting AI, challenging closed-source companies. Zhu Xiaohu predicts a shift toward open-source, reshaping the global AI landscape.
This article is an excerpt from a recent interview with Zhu Xiaohu, the Managing Partner of Jinsha River Ventures, conducted by Tencent News, with slight adjustments.
In just the past 20 days, Zhu Xiaohu’s attitude has undergone a surprising reversal.
One year ago, at the beginning of 2024, in our report titled "Zhu Xiaohu Told a Story of Chinese Realist AIGC," Zhu's views vividly presented a real-world Chinese AI story. He used phrases like "We could tell right away this was definitely not going to work," "We said from the start that I'm not optimistic about large models," and "I didn't even want to discuss it, you know? It's meaningless," clearly stating he would never invest in any of China's six large-model startups.
Even by the end of 2024, his attitude toward so-called AGI (Artificial General Intelligence) remained cold—during a meeting, he firmly declared, “Now everyone more or less agrees with our judgment, right? Anyone still talking about AGI today has their own agenda—it's impossible! How could it possibly happen? AGI... it’s impossible to achieve with the current architecture.”
However, just after the 2025 Chinese New Year, DeepSeek burst onto the global stage: without any promotional effort, this ChatGPT-like Chinese conversational AI product saw explosive growth. Even this representative of China's AI market, a spokesperson for realism, unexpectedly began to examine AI closely and appreciate its beauty.
One year later, in February 2025, Zhu Xiaohu once again accepted our interview. In this interview, he used words like "It really opened my eyes," "Very stunning," "Very surprised," "I'm shocked," and "Wow!" to express the profound impact he felt. He repeatedly used the words "So beautiful" and "Very deep" to describe his interactions with DeepSeek, emphasizing these two phrases 16 times in total. This investor, who had once called AGI "a big scam," even stated: "DeepSeek is almost making me believe in AGI."
The Price Doesn’t Matter Much Anymore, What Matters is Participating
Zhang Xiaojun: What’s your opinion of Liang Wenfeng?
Zhu Xiaohu: How can he write so beautifully? The product itself represents the team’s DNA. He probably enjoys beautiful language, philosophy, and deep thoughts on quantum mechanics, which is why he chose those particular datasets, influencing the entire response. It's really very human, very beautiful, and with depth.
Zhang Xiaojun: Many people think of Liang Wenfeng as a "representative of idealism and romanticism." Do you think of him as your opposite?
Zhu Xiaohu: Not necessarily! I also enjoy these kinds of words, right? When I see these words, I really think, "Wow!" It truly amazes me—these are things that resonate universally with humanity.
Zhang Xiaojun: Of course, they’re not raising funds today. If they opened funding rounds, would you invest?
Zhu Xiaohu: I would definitely invest! I would definitely invest! I think this is really meaningful. And today it’s already very clear—this kind of open-source ecosystem, like Android, is already emerging. With its momentum, it’ll be very hard for others to catch up!
Zhang Xiaojun: What conditions would you be willing to invest under?
Zhu Xiaohu: I think... (pauses for 3 seconds) the price doesn't really matter much anymore. The key thing is to participate in it. To really witness the emergence of AGI in humanity, to witness the birth of AI consciousness—these things are all very meaningful.
Zhang Xiaojun: (pauses for 2 seconds) Wow, your perspective has really changed a lot. Last year you said you wouldn’t invest in any large model companies.
Zhu Xiaohu: Yes, that's true! (laughs) This has really surprised me. At least with DeepSeek, I see a path to AGI, and I really feel like, at the very least, there’s a possibility of AI consciousness emerging.
Zhang Xiaojun: So no matter how much it costs, you’d be willing to invest?
Zhu Xiaohu: I think these things are extremely valuable.
Zhang Xiaojun: What’s the maximum amount you’d be willing to invest?
Zhu Xiaohu: The price is related to the amount you’re investing, right? If the price is too high, I’d just put in a little money to participate, right? (laughs)
Zhang Xiaojun: So no matter the price, you’re willing to participate?
Zhu Xiaohu: Yes, I’m willing to participate. Witnessing a change in human history is very interesting.
Zhang Xiaojun: Have you looked into DeepSeek’s recent technical reports and achievements? What do you think is the key breakthrough?
Zhu Xiaohu: The core is that human intervention is no longer needed. It used to be RLHF (Reinforcement Learning from Human Feedback); now it's direct RL (Reinforcement Learning), so costs can be kept very low. There are many innovative details, and added together they are why the cost is so low today.
But the most important thing is that it no longer requires human intervention. Human intervention is hard to scale and expand quickly. With machines, you just need to give it some high-quality initial data and guide it on how to think in a particular field, and it can keep going on its own. Scaling up is much easier. Though the initial data is also important and very hard to get, at least it’s a lot easier than before—this step is the most crucial.
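The scalability argument here, that an automatic reward check can replace a human judge, can be illustrated with a toy reward function. This is a hypothetical sketch, not DeepSeek's actual code; the `<think>` tags, the `\boxed{}` answer convention, and the reward values are assumptions borrowed from common reasoning-model setups.

```python
# Illustrative sketch (not DeepSeek's real implementation) of why pure RL
# with rule-based rewards scales more easily than RLHF: the reward needs
# no human judge, only an automatic check of the model's final answer.

import re

def rule_based_reward(completion: str, ground_truth: str) -> float:
    """Score a model rollout without any human feedback.

    Reward = correctness of the boxed final answer plus a small bonus
    for following the expected <think>...</think> format. Both checks
    are pure string rules, so millions of rollouts can be scored cheaply.
    """
    reward = 0.0
    # Format bonus: did the model show its reasoning in the expected tags?
    if re.search(r"<think>.*?</think>", completion, re.DOTALL):
        reward += 0.1
    # Accuracy reward: does the final answer match the verifiable ground truth?
    match = re.search(r"\\boxed\{([^}]*)\}", completion)
    if match and match.group(1).strip() == ground_truth:
        reward += 1.0
    return reward

# A correct, well-formatted rollout earns the full reward...
good = "<think>2+2 is 4</think> The answer is \\boxed{4}"
# ...while a wrong answer earns only the small format bonus.
bad = "<think>guessing</think> The answer is \\boxed{5}"
```

Because scoring is fully automatic, scaling up training means generating more rollouts, not hiring more annotators; the human effort is concentrated in the small set of high-quality cold-start examples Zhu mentions.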
Zhang Xiaojun: Do you think today’s DeepSeek is more of a follower or an innovator?
Zhu Xiaohu: It has already made innovations in many areas. Of course, OpenAI claims DeepSeek replicated many of o1's core ideas and methods, and that's possible; but OpenAI is closed-source, so we don't know whether it actually uses those methods. What's clear is that DeepSeek reproduced these techniques independently.
In any case, it’s basically caught up now, right?
Zhang Xiaojun: To some extent, has DeepSeek changed your view and perception of China’s technological innovation and progress? Because you’ve always been a “representative of realism,” thinking this approach suits China more, suits the local context. Has your view changed today?
Zhu Xiaohu: I’ve always thought that China’s open-source efforts could catch up. As long as OpenAI hits a wall and can’t move forward, China will definitely catch up! I just didn’t expect it to be this fast, and at such a low cost! The effect is so good! —This result really amazed me.
I used to think it would be like OpenAI, cold and robotic, but this time the effect is really stunning.
Zhang Xiaojun: Since you’re the "representative of realism," when you see someone like Liang Wenfeng, who represents technological idealism and romanticism, achieving success in China, what are your thoughts? —What do “Zhu Xiaohus” think of “Liang Wenfengs”?
Zhu Xiaohu: He's not a typical entrepreneur. He already has substantial financial strength at High-Flyer and is holding many cards, so his isn't a typical startup. But indeed, that financial power allows him to pursue some ideals. This makes him a very different kind of new-generation entrepreneur.
Zhang Xiaojun: Have you made any efforts to invest?
Zhu Xiaohu: I’ve chatted with him, of course, hoping to get recognized and have a chance to participate, right? (laughs)
Zhang Xiaojun: Have you gotten anywhere with that?
Zhu Xiaohu: Not yet. We haven’t discussed it in depth yet. (laughs)
The Curse of the Leader
Zhang Xiaojun: In terms of impact, how much has the recent surge of DeepSeek affected OpenAI? How big of an impact has it had on the AGI narrative in the U.S.?
Zhu Xiaohu: A huge impact! If GPT-5 or a 100,000-card cluster doesn’t come out this year, or even if it does, if the performance and intelligence don’t improve significantly—at least 2 to 3 times—I think the open-source movement will definitely win.
If you’re spending 10 times the cost for only a 10% or 20% improvement in performance, who would keep spending so much to use a closed-source model? Everyone will move to open-source.
Zhang Xiaojun: Recently, Sam Altman (CEO of OpenAI) stated that, in terms of open-source, they have "always been on the wrong side of history." He also mentioned in an internal meeting that OpenAI’s conservative strategy on open-source over the past 5 years was a strategic mistake. What do you think of his statement? What are the chances of OpenAI going open-source in the future?
Zhu Xiaohu: That’s the curse of being a leader. When you’re in the lead, you definitely don’t want to open-source, you want to keep it closed; but once others catch up and you open-source, honestly, it becomes very difficult.
Also, with the costs involved today, they’ve already spent a lot of money—like 10 times the cost—yet they haven’t recouped those initial costs. If they open-source now, their business model completely changes.
So today, it’s a real test even for the big tech companies in the U.S., and even for American VCs. If you spend 10 times the cost to develop a foundational model, and Chinese companies can catch up in about 12 months, spending only a tenth of the cost—do you still want to spend all that money to keep pushing forward? It’s a very, very tough question.
Zhang Xiaojun: AI professionals in Silicon Valley say that the area is currently in a state of panic.
Zhu Xiaohu: Indeed, it’s a complete disruption. At first, everyone thought AI had high barriers to entry, high walls, but now it doesn’t seem like that. This gives the latecomers a huge advantage, right?
Zhang Xiaojun: So you don’t think OpenAI will go open-source?
Zhu Xiaohu: They’re already too late. Today, DeepSeek has already reached over 20 million DAU, more than 20% of OpenAI’s, and its daily download numbers are already surpassing OpenAI’s.
Its ecosystem could grow very quickly. If all the world’s programmers are already developing on DeepSeek’s open-source architecture and ecosystem, OpenAI opening up later won’t really matter.
Zhang Xiaojun: What kind of ripple effects will DeepSeek’s rise have on the global AI landscape, especially in the U.S. and China? How far will its impact reach?
Zhu Xiaohu: The very existence and value of closed-source models are now being severely questioned. If closed-source models are costly and don’t have a significant performance edge, why would anyone use them?
Zhang Xiaojun: Does OpenAI also face this existential question?
Zhu Xiaohu: It’s the same for them. Whether OpenAI still has value in the future is a very difficult question, right?
If GPT-5 and a 100,000-card cluster don’t provide a significant improvement—based on the latest info we have, about 2 or 3 companies in the U.S. have been training a 100,000-card cluster for about six months, but there’s no significant performance improvement—then why continue spending so much money on it?
Zhang Xiaojun: You said that "the Android of the AI era has already arrived," and you’re referring to DeepSeek.
Zhu Xiaohu: Exactly. Its growth rate is unbelievable! It’s something we’ve never seen before—20 days to 20 million DAU, and its daily download numbers are massive.
And the best part? It didn’t spend a single cent on advertising—unlike many companies in China that pour huge amounts of money into ads. DeepSeek spent zero on advertising. It’s all user-driven word-of-mouth.
If you search for DeepSeek on Xiaohongshu (Little Red Book), you’ll see how the beauty and depth of its language really blow users away. It’s so beautiful, and it has so much depth, right? Without spending any money on ads, its user retention is fantastic.
I use DeepSeek every day now, asking deep and difficult questions to see what it responds with—whether it can offer any insights for humanity.
Zhang Xiaojun: You mentioned that the Android of the AI era has arrived. Do you think OpenAI and Anthropic still have a chance to become the iOS of the AI era? Between the two, who do you think is more likely to become the iOS of the AI era?
Zhu Xiaohu: The key point is this: will closed-source models, like a 100,000-card cluster or GPT-5, still be able to improve several times over GPT-4? Will they have a 2-3x improvement in intelligence? That’s the only small window of opportunity.
If a 100,000-card cluster only improves by 10% or 20%, closed-source models won’t really have a chance—at least not to be widely used like these general-purpose models.
When I asked DeepSeek, it gave the same response—it said closed-source models might still have a place in certain verticals, fields that require proprietary data or even proprietary hardware.
But if you’re spending 10 times the cost to only improve by 10% or 20%, it’s better to use a free open-source model. Open-source models are already good enough in many scenarios. And in many cases, even in six months to a year, they’ll likely surpass humans—right?
Today, you can already see DeepSeek writing articles better than 99% of people. In fields like programming, physics, chemistry, and even medicine, within six to twelve months, it might surpass the vast majority of humans—this is already becoming evident.
Zhang Xiaojun: Why are China and the U.S. more likely to have two “Android” systems?
Zhu Xiaohu: On the open-source side, Llama will definitely keep moving forward. And DeepSeek is thoroughly open-source, so people will quickly follow suit.
China and the U.S. won’t use the same open-source system. Even if they have two separate open-source systems, they’ll likely be highly compatible, with similar underlying structures.
Zhang Xiaojun: Just to add, OpenAI, since it's called "OpenAI," originally had a strong open-source ethos. What do you think has caused them to become increasingly closed-off over the past few years, effectively turning into “CloseAI”?
Zhu Xiaohu: They felt that their technology was far ahead of their competitors, and the investment was substantial. If you don’t close-source, it’s hard to recoup the upfront costs. So they probably wanted to see if they could build a closed-source company and make the business model smoother.
Zhang Xiaojun: Is this the dilemma of an innovator?
Zhu Xiaohu: Yes, indeed. It’s a very tough decision for any temporary leader.
Zhang Xiaojun: How do you think OpenAI will develop from here? Will it be able to maintain independent growth? What do you think the future holds for OpenAI?
Zhu Xiaohu: The Deep Research product they released today is also a great product. Up to now, they’ve always been at least a few months ahead of their competitors. But when they can’t move forward anymore, that’s when the real question about the future arises.
Their costs are very high, and if they can’t maintain a lead, the company will face significant challenges.
Zhang Xiaojun: How do you view DeepSeek’s impact on Nvidia? What’s the potential long-term effect?
Zhu Xiaohu: With AI’s strong capabilities and low costs, compute power will certainly be needed in the long run.
But first, it’s not necessarily dependent on Nvidia cards. Second, even if foreign companies are willing to spend money, buying Nvidia cards at such a high price doesn’t necessarily mean they’ll grow quickly. Nvidia’s stock price has already made very aggressive assumptions, and everyone expects that big companies will continue to ramp up their capital expenditure (CAPEX). But with the current speed, growth may not be as fast as initially expected.
People need to think about this: if I spend 10 times as much money today, others might catch up in a year—at most a year—by spending just 1/10 of the cost. So who will continue to push forward and invest that money?
Closed-Source Companies Now Face a Severe Test
Zhang Xiaojun: What impact do you think DeepSeek will have on China's AI and tech ecosystem moving forward? Will it be a key turning point in China's AI development?
Zhu Xiaohu: I think applications on top of it might explode massively. It’s already usable in many scenarios, and the costs are low enough. It’s even open-source, so I can recreate it at a very low cost—without worrying about building on someone else’s foundation. This is a huge liberation for many application companies, and there will definitely be a big explosion at the application layer.
Zhang Xiaojun: It’s expected that AI applications will explode in 2025.
Zhu Xiaohu: Definitely, definitely. I already feel that training closed-source models in China is completely meaningless now. Even OpenAI isn’t several times better than DeepSeek. Even if you’re 10-20% better than DeepSeek, it doesn’t matter; no one will use your closed-source model anymore.
Maybe only a few big companies will. The big companies might continue with closed-source models for their own barriers or specialized scenarios.
Zhang Xiaojun: It’s similar to chips—whether you want to do in-house development.
Zhu Xiaohu: Exactly. Big companies, because they have proprietary data, specific scenarios, and unique user needs, might continue with closed-source models. But I feel that Chinese big companies will also learn from DeepSeek’s framework and continue iterating on their own. That’s more likely. There’s no need to build everything from scratch.
Of course, Doubao might be different, because Doubao started from scratch, right? (Note: Doubao is the AI application developed by ByteDance, a Chinese internet giant.)
Zhang Xiaojun: It seems the cost of replicating isn’t high.
Zhu Xiaohu: Right, it’s very low! After the New Year, everyone is catching up very quickly. This year, AI teams in China might be working overtime.
Zhang Xiaojun: Earlier, you mentioned that AI application companies will benefit the most. So, who will be hurt the most?
Zhu Xiaohu: Closed-source companies today are facing a very severe test: should they continue down their own path, or should they rethink?
Zhang Xiaojun: A year ago, you said, “Open-source is a generation behind closed-source now, but in the long run, open-source will definitely catch up.” Why do you continue to firmly believe in the open-source path?
Zhu Xiaohu: The core question is: Does the Scaling Law hold? If the Scaling Law doesn’t hold, and there’s already a ceiling ahead, then closed-source will hit a wall, and open-source will definitely catch up.
Of course, I didn’t expect it to happen this quickly, and with such low costs! —That was really unexpected. But back in May-June last year in Silicon Valley, I had discussions with many Chinese engineers there, and at that time, people were already starting to question the Scaling Law. But back then, the 100,000-card cluster had just been built, and we didn’t know yet if the training results would be good or if progress could continue. Today, it’s clearer. After training for 6-7 months, the 100,000-card cluster’s performance might indeed be quite average.
Zhang Xiaojun: Is AGI still a "computational power game"?
Zhu Xiaohu: The requirements for computational power and algorithms are not that high; the key is high-quality data.
DeepSeek has proven this. Why does it perform better than other models? Often it’s because the quality of its initial training data is higher. In the future, models might be like chefs: the training data you use and your parameter weights will shape the result, so one model might be Sichuan cuisine and another Cantonese.
Why is DeepSeek’s text so beautiful, and especially in fields like philosophy and quantum mechanics, its answers are so deep? That might be due to the team’s DNA.
In the future, high-quality training data will be extremely important, especially in areas where the rules aren’t clear. First, you need to guide the AI on how to strengthen its learning. Your initial data really needs to be labeled by PhD-level experts from various fields.
That’s also why the CEO of Scale AI was so anxious and said some harsh things—his low-quality labels are now worthless! They have no meaning! To move forward now, you need extremely high-quality data labels.
Zhang Xiaojun: Do you have more information on how DeepSeek does its data labeling and ensures high-quality data?
Zhu Xiaohu: That’s understandable. People following behind can rely on others’ knowledge to train data and get high-quality data. Everyone does that. Besides Doubao, all AI model companies in China are doing it this way.
It’s not just about cost; it also speeds things up. But what data and text you select is different for each company.
The only thing DeepSeek hasn’t disclosed is the pre-training corpus—that’s the one thing they haven’t made public.
Zhang Xiaojun: Right.
Zhu Xiaohu: Of course, it performs very well. This might be some of their core secrets—exactly what text they used, which reflects their team’s DNA and preferences.
Zhang Xiaojun: DeepSeek’s responses are very emotionally intelligent, with high emotional value.
Zhu Xiaohu: Yes, exactly!
It really feels like a human response now! Unlike other models, which used to respond like cold machines.
When I used domestic models before, it felt like a simple replacement for a search engine, but it was still a machine, very cold. This time, DeepSeek’s responses feel like a human, and a very emotionally intelligent and intellectually deep human. I really love using DeepSeek—asking it tough questions and seeing how it responds.
Zhang Xiaojun: Did you have this feeling when you used ChatGPT?
Zhu Xiaohu: No, it was just a cold machine!
Zhang Xiaojun: How do you see the future development of China’s other large model companies?
Zhu Xiaohu: They all need to think about whether to keep training their own closed-source models, contribute to the DeepSeek ecosystem, or pivot entirely to applications, like what Mr. Kai-Fu Lee (founder of Sinovation Ventures) did.
Or, based on open-source models, see if they can dive deeper into certain verticals? Like Baichuan, which has always wanted to focus on healthcare—can they leverage open-source models to excel in healthcare and make it better?
This is just speculative, but everyone is facing a very important decision. The earlier you make this decision, the better; the later you wait, the more passive you’ll be.
Zhang Xiaojun: It’s about transformation.
Zhu Xiaohu: Otherwise, it’s meaningless! What’s the point of continuing with closed-source? What’s the point?
Zhang Xiaojun: With DeepSeek’s momentum, if ByteDance makes a strong push, can they catch up?
Zhu Xiaohu: It’s not easy for ByteDance either. If they shift to open-source, it’s not that easy.
Why did ByteDance rise so quickly in the beginning? It was because they had such momentum that the big companies couldn’t catch up. Now, AI has momentum even stronger than ByteDance’s! If they shift to open-source, they’d have to do it as thoroughly as DeepSeek. If they only go halfway, like Llama, others will likely prefer DeepSeek.
Today, the most important thing for DeepSeek is to continue pushing forward, keeping pace with OpenAI. They need to establish their lead and build their open-source ecosystem. That way, even if big companies try to catch up later, it’ll be hard to overtake, right?
I even think we might see a symbolic event this year—could Tongyi Qianwen integrate its ecosystem with DeepSeek? That might be even more meaningful.
Zhang Xiaojun: Tongyi Qianwen and DeepSeek are compatible, so does that mean they will cooperate with Alibaba?
Zhu Xiaohu: Not necessarily. At least, everyone can integrate their ecosystems. If Tongyi Qianwen builds its own open-source ecosystem, it might be better to leverage DeepSeek—this is very possible.
Zhang Xiaojun: If DeepSeek is the Android of the AI era, could anyone else in China become the iOS?
Zhu Xiaohu: Today, globally, whether there’s still an iOS opportunity is questionable.
If a 100,000-card cluster can’t make a significant difference, why would we still need an iOS?
Unless you have a monopolistic hardware platform. Apple was able to dominate with the iPhone, and that gave them the opportunity to develop iOS. If you don’t have monopolistic hardware, then there’s no possibility of an iOS emerging.
Zhang Xiaojun: Does the global LLM industry need to reshape its valuation system?
Zhu Xiaohu: Closed-source models are clearly not worth that much anymore, right?
Especially OpenAI in the U.S., if their 100,000-card cluster hasn’t made any breakthroughs, just optimized reasoning, then Chinese companies will catch up quickly. This valuation definitely can’t hold.