McKinsey's Lilli case offers a useful lens on the enterprise AI market: the potential of edge computing plus small models. This AI assistant, which integrates 100,000 internal documents, has not only reached a 70% adoption rate among employees, but is used an average of 17 times a week — product stickiness that is rare in enterprise tools. Below are my thoughts:

1) Enterprise data security is a real pain point: the core knowledge assets McKinsey has accumulated over the past 100 years, and the sensitive data held by many small and medium-sized enterprises, cannot simply be processed on the public cloud. Finding a balance where "data never leaves the premises, yet AI capability is not compromised" is a genuine market demand, and edge computing is one direction to explore;
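The "data stays local" pattern can be illustrated with a minimal sketch: a toy keyword-retrieval step over an on-premise document store that never calls an external service. The document store, identifiers, and TF-IDF scoring here are illustrative assumptions, not Lilli's actual architecture (a real deployment would likely use a locally hosted embedding model instead).

```python
import math
from collections import Counter

# Toy on-premise document store (hypothetical contents, for illustration).
DOCS = {
    "m&a-playbook": "merger integration checklist and synergy tracking",
    "pricing-study": "b2b saas pricing elasticity survey results",
    "supply-chain": "supplier risk scoring and inventory buffers",
}

def tfidf_rank(query: str, docs: dict) -> list:
    """Rank local documents against a query; no data leaves the machine."""
    tokenized = {doc_id: text.split() for doc_id, text in docs.items()}
    n_docs = len(docs)
    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for tokens in tokenized.values():
        df.update(set(tokens))
    scores = []
    for doc_id, tokens in tokenized.items():
        tf = Counter(tokens)
        score = sum(
            (tf[t] / len(tokens)) * math.log(n_docs / df[t])
            for t in query.lower().split() if t in tf
        )
        scores.append((doc_id, score))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

print(tfidf_rank("supplier risk", DOCS)[0][0])  # → supply-chain
```

The retrieved passages would then be handed to a small model running on the same hardware, so neither the corpus nor the query crosses the network boundary.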

2) Specialized small models will replace general-purpose large models: enterprise users do not need a "billions of parameters, all-round capabilities" general model; they need a professional assistant that can answer questions accurately in a specific domain. There is an inherent tension between a large model's versatility and its professional depth, which is why small models are often valued more in enterprise scenarios;

3) The cost trade-off between self-built AI infrastructure and API calls: although combining edge computing with small models requires a large initial investment, long-term operating costs drop significantly. Imagine an AI model used frequently by 45,000 employees being served entirely through API calls: the resulting vendor dependence, growing usage volume, and product iteration needs make self-built AI infrastructure the rational choice for large and mid-sized enterprises;
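The scale argument can be made concrete with back-of-envelope arithmetic. Only the 45,000 employees and 17 uses per week come from the Lilli case; the per-call API price and the self-hosted annual cost below are hypothetical placeholders chosen purely to show how the comparison works.

```python
EMPLOYEES = 45_000      # from the Lilli case
USES_PER_WEEK = 17      # from the Lilli case
WEEKS_PER_YEAR = 52

# Hypothetical cost assumptions, for illustration only:
API_COST_PER_CALL = 0.10        # $ per long-context document query (assumed)
SELF_HOSTED_ANNUAL = 2_000_000  # $ per year for own infra (assumed)

calls_per_year = EMPLOYEES * USES_PER_WEEK * WEEKS_PER_YEAR
api_annual = calls_per_year * API_COST_PER_CALL

print(f"{calls_per_year:,} calls/year")          # 39,780,000 calls/year
print(f"API route: ${api_annual:,.0f}/year")     # API route: $3,978,000/year
print("self-hosting cheaper:", api_annual > SELF_HOSTED_ANNUAL)
```

Under these assumed numbers the API route costs roughly twice the self-hosted budget per year; the break-even point obviously shifts with real prices, but the usage volume at this scale is what makes the comparison worth running at all.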

4) New opportunities in the edge hardware market: large-model training is inseparable from high-end GPUs, but edge inference has completely different hardware requirements. Processors optimized for edge AI by chipmakers such as Qualcomm and MediaTek are finding their market opening. As every company looks to build its own "Lilli", edge AI chips designed for low power consumption and high efficiency will become infrastructure necessities;
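Why edge inference has a different hardware profile is visible from memory-footprint arithmetic alone: the storage needed for model weights scales with parameter count times bits per parameter. The model sizes and quantization levels below are generic examples for illustration, not claims about any specific product.

```python
def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate memory (GB) needed just to hold the model weights."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# A large general model at 16-bit weights vs a small model quantized to 4-bit:
print(f"{weight_memory_gb(175, 16):.0f} GB")  # 175B @ fp16 → 350 GB
print(f"{weight_memory_gb(7, 4):.1f} GB")     # 7B @ int4  → 3.5 GB
```

A 350 GB weight footprint demands a rack of datacenter GPUs, while 3.5 GB fits in the memory budget of a laptop or phone-class NPU, which is exactly the niche the edge-oriented chipmakers are targeting.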

5) The decentralized web3 AI market will also grow: once enterprise demand for small-model compute, fine-tuning, and algorithms takes off, resource scheduling becomes the bottleneck. Traditional centralized scheduling will struggle to keep up, which directly creates substantial market demand for web3 decentralized small-model fine-tuning networks, decentralized compute service platforms, and the like.

While the market is still debating the general capabilities of AGI, it is all the more encouraging to see many enterprise users already extracting practical value from AI. Compared with past leaps driven by monopolies on compute and algorithms, a market focus on edge computing plus small models promises far greater vitality.