The latest thinking and exploration of AI+Crypto tracks

Author: Ian@Foresight Ventures

TL;DR

After months of digging into the intersection of AI and crypto assets, my understanding of this direction has deepened. This article compares my early views with my current read on the track; readers already familiar with the space can skip to the second section.
Decentralized computing power networks: the track faces a real market-demand challenge, and the ultimate purpose of decentralization is to reduce cost. The community attributes and tokens of Web3 bring value that cannot be ignored, but for the computing power track this is added value rather than a disruptive change. The focus should be on finding a fit with user needs, not on blindly positioning decentralized computing power networks as a supplement to the shortage of centralized computing power.
AI marketplace: discusses the idea of a full-chain financialized AI marketplace and the value brought by community and tokens. Such a market covers not only the underlying computing power and data, but also the models themselves and related applications. Model financialization is the core element of the AI market: on one hand it attracts users to participate directly in the value-creation process of AI models, and on the other it creates demand for the underlying computing power and data.
Onchain AI: ZKML faces challenges on both the demand and supply sides, while OPML offers a more balanced trade-off between cost and efficiency. Although OPML is a technical innovation, it does not necessarily solve the fundamental challenge facing on-chain AI: the lack of demand.
Application layer: most web3 AI application projects are too naive. It is more reasonable for AI to enhance user experience and development efficiency, or to serve as an important component of the AI marketplace.
First, the AI track review
In the past few months I have done in-depth research on AI + crypto. After months of reflection, I am glad to have gained insight into the direction of some tracks relatively early, but I can also see that some of my earlier opinions no longer look accurate.
**This article is about opinions, not an introduction.** It covers several broad directions of AI in web3 and presents my views and analysis of the track, both then and now. Different perspectives may spark different inspirations; take them critically.
Let’s first review the main directions of AI + crypto laid out in the first half of the year:
1.1 Distributed Computing Power
In “A Rational Look at Decentralized Computing Power Networks”, based on the logic that computing power will become the most valuable resource of the future, I analyzed the value that crypto can bring to computing power networks.
Although AI large-model training is where decentralized distributed computing power networks would see the greatest demand, it is also where they face the biggest challenges and technical bottlenecks, including complex data synchronization and network optimization issues; data privacy and security are further constraints. Some existing techniques offer initial solutions, but the enormous computation and communication overhead still makes them impractical for large-scale distributed training. Distributed networks clearly have a better chance of landing in model inference, and the foreseeable incremental space is large enough, though inference still faces challenges such as communication latency, data privacy, and model security. Compared with training, inference has lower computational complexity and data interactivity and is better suited to a distributed environment, as the rough numbers below suggest.
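To make the training-versus-inference gap concrete, here is a back-of-envelope sketch in Python (all numbers are my own illustrative assumptions, not measurements from any project):

```python
def training_sync_bytes(params: int, bytes_per_grad: int = 2) -> int:
    """Per-worker bytes exchanged per optimizer step in a naive data-parallel
    all-reduce: roughly 2x the gradient size (reduce + broadcast)."""
    return 2 * params * bytes_per_grad


def inference_io_bytes(prompt_tokens: int, output_tokens: int,
                       bytes_per_token: int = 4) -> int:
    """Bytes moved per inference request: just the request/response payload,
    since the weights stay resident on the serving node."""
    return (prompt_tokens + output_tokens) * bytes_per_token


params = 7_000_000_000  # a 7B-parameter model, fp16 gradients
print(f"training : {training_sync_bytes(params) / 1e9:.0f} GB per worker per step")
print(f"inference: {inference_io_bytes(512, 256) / 1e3:.1f} KB per request")
# ~28 GB of gradient traffic per training step vs ~3 KB per inference call --
# which is why distributed networks land on inference first.
```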
1.2 Decentralized AI Marketplace
In “The Best Attempt at a Decentralized AI Marketplace”, I argued that a successful decentralized AI marketplace needs to combine the strengths of AI and Web3, using the added value of distribution, asset ownership confirmation, revenue sharing, and decentralized computing power to lower the threshold of AI applications, encourage developers to upload and share models, and protect users’ data rights, thereby building a developer-friendly AI resource trading and sharing platform that meets user needs.
The idea at the time (probably not entirely accurate now) was that data-based AI marketplaces had far more potential. A model-centric marketplace needs a large supply of high-quality models, but an early platform lacks the user base and resources, and weak incentives for good model providers make high-quality models hard to attract. A data-based marketplace, by contrast, can accumulate large amounts of valuable data, especially private-domain data, through decentralized distributed collection, incentive-layer design, and data ownership guarantees.
The success of a decentralized AI marketplace relies on accumulating user resources and strong network effects, where users and developers gain more value inside the marketplace than outside it. In the early days, the focus is on accumulating high-quality models to attract and retain users; after establishing a high-quality model library and a data moat, the focus shifts to attracting and retaining more end users.
1.3 ZKML
Before ZKML became a widely discussed topic, “AI + Web3 = ?” explored the value of on-chain AI.
Without sacrificing decentralization and trustlessness, on-chain AI has the opportunity to take the web3 world to the “next level”. Today’s Web3 is like early Web2: not yet capable of carrying wider adoption or creating greater value. On-chain AI is meant to provide a transparent and trustless solution.
1.4 AI Applications
In “AI + Crypto: Starting from the Web3 Women’s Game HIM”, using the portfolio project HIM, I analyzed the value of large models in web3 applications. Besides the hard-core path from infrastructure to algorithms of developing trustless on-chain LLMs, another direction is to downplay the black-box nature of inference within the product and find suitable scenarios in which to deploy the powerful inference capability of large models.
Second, the current AI track analysis
2.1 Computing Power Networks: Big Imagination Space but a High Threshold
The big logic of computing power networks remains the same, but the market-demand challenge remains: who needs a solution with lower efficiency and stability? I think the following points need to be figured out:
What is Decentralization for?
If you ask the founder of a decentralized computing power network today, they will tell you that their network enhances security and attack resistance, increases transparency and trust, optimizes resource usage, offers better data privacy and user control, and resists censorship and interference…
These are boilerplate: any web3 project can claim censorship resistance, trustlessness, privacy, and so on. My point is that none of this matters. Decentralized computing power networks do not fundamentally solve the privacy problem, and on dimensions like security they introduce as many contradictions as they resolve. Therefore, the ultimate purpose of decentralizing a computing power network must be lower cost: the higher the degree of decentralization, the lower the cost of using computing power.
So, fundamentally, “using idle computing power” is more of a long-term narrative, and whether a decentralized computing power network can work depends largely on whether the team has figured out the following points:
Value provided by Web3
A clever token design and the incentive/penalty mechanisms it enables are clearly a powerful value-add provided by a decentralized community. Compared with the traditional internet, tokens serve not only as a medium of exchange but also, together with smart contracts, let protocols implement more complex incentive and governance mechanisms. Meanwhile, the openness and transparency of transactions, lower costs, and improved efficiency all derive from the value crypto brings. This unique value provides more flexibility and room for innovation in motivating contributors.
At the same time, I hope this seemingly reasonable “fit” can be viewed rationally: for a decentralized computing power network, the value brought by Web3 and blockchain technology is, from another perspective, only “added value” rather than a fundamental disruption. It cannot change the basic working mode of the network or break through the current technical bottlenecks.
In short, web3’s value enhances the attractiveness of a decentralized network but does not change its core structure or operating model, and if decentralized networks are to truly claim a place in the AI wave, web3’s value alone is far from enough. As discussed later, the right technology should solve the right problem: the play for decentralized computing power networks is by no means simply to patch the shortage of AI computing power, but to give this long-dormant track a new way of playing and thinking.
It may look like PoW mining or storage mining: monetizing computing power as an asset. In this model, computing power providers earn tokens by contributing their own computing resources, which is appealing because it directly converts computing resources into economic gains and incentivizes more participants to join the network (a minimal sketch of such a reward split follows below). It may also mean using web3 to create a market that consumes computing power, opening up demand that can tolerate unstable and slower computing power by financializing what sits upstream of computing power (such as models).
Teams need to figure out how to connect with the actual needs of users; after all, what users and participants need is not necessarily the most efficient computing power, and “being able to make money” is always one of the most convincing motivations.
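As a thought experiment, here is a minimal Python sketch of that compute-as-asset reward model. Every name and parameter is hypothetical; it only illustrates the pro-rata, uptime-weighted emission idea:

```python
from dataclasses import dataclass


@dataclass
class WorkReceipt:
    provider: str
    gpu_seconds: float   # verified compute delivered this epoch
    uptime_ratio: float  # 0..1, discounts unstable nodes


def distribute_epoch_rewards(receipts: list[WorkReceipt],
                             epoch_emission: float) -> dict[str, float]:
    """Split a fixed per-epoch token emission pro rata by effective work,
    where effective work = gpu_seconds weighted by uptime."""
    effective = {r.provider: r.gpu_seconds * r.uptime_ratio for r in receipts}
    total = sum(effective.values())
    if total == 0:
        return {}
    return {p: epoch_emission * w / total for p, w in effective.items()}


rewards = distribute_epoch_rewards(
    [WorkReceipt("alice", 3600, 0.99), WorkReceipt("bob", 7200, 0.60)],
    epoch_emission=1000.0,
)
print(rewards)  # alice earns more per gpu-second because stability is rewarded
```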
The core competitiveness of decentralized computing power networks is price
If we must discuss decentralized computing power in terms of actual value, the biggest imagination space web3 brings is the opportunity to compress computing power costs further.
The more decentralized the computing power nodes, the lower the price per unit of computing power. This can be deduced along the following directions (a toy illustration follows the list):
The introduction of tokens: paying node computing power providers in the protocol’s native token instead of cash fundamentally reduces operating costs;
Permissionless access plus web3’s strong community effect directly drives market-based cost optimization: more individual users and small businesses can join the network with existing hardware, the supply of computing power increases, and under a model of autonomy and community management the market price of computing power falls;
The open computing power market created by the protocol pushes providers into price competition, further reducing costs.
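The toy illustration promised above: a uniform-price auction with randomly drawn provider costs, showing how a deeper permissionless supply curve clears at a lower unit price (all numbers are invented for illustration):

```python
import random


def clearing_price(n_providers: int, demand_gpu_hours: float) -> float:
    """Uniform-price auction: sort provider asks (their marginal cost) and
    price at the last unit accepted to fill demand."""
    random.seed(42)  # deterministic toy draw
    asks = sorted(random.uniform(0.5, 3.0) for _ in range(n_providers))
    capacity_each = 100.0  # GPU-hours offered per provider, for simplicity
    units_needed = int(demand_gpu_hours / capacity_each)
    return asks[min(units_needed, n_providers) - 1]


for n in (10, 100, 1000):
    price = clearing_price(n, demand_gpu_hours=800)
    print(f"{n:>4} providers -> {price:.2f} $/GPU-hour")
# With demand fixed, a deeper permissionless supply curve clears lower.
```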
Case: ChainML
In short: ChainML is a decentralized platform that provides computing power for inference and fine-tuning. In the short term, ChainML is building Council, an open-source AI agent framework (a chatbot that can be integrated into different applications), whose adoption should bring demand growth to the decentralized computing network. In the long run, ChainML aims to be a complete AI + web3 platform (analyzed in detail later), including both a model market and a computing power market.
I think ChainML’s technical roadmap is very reasonable, and they think clearly about the problems raised above: the purpose of decentralized computing power cannot be to supply the AI industry with computing power on a par with centralized providers, but to gradually reduce cost until the right kind of demander accepts this lower-quality computing power source. So from a product-path perspective, it should start centralized: get the product loop running early, accumulate customers through strong BD capability, expand and consolidate the market, then gradually disperse provisioning from centralized computing power toward smaller, higher-cost providers, and finally roll out computing power nodes at scale. This is ChainML’s divide-and-conquer idea.
On the demand side, ChainML has built an MVP of the protocol on centralized infrastructure, with portability as a design principle. The team reports running the system with customers since February of this year and in production since April. It currently runs on Google Cloud, but because it is built on Kubernetes and other open-source technologies, it is easy to port to other environments (AWS, Azure, CoreWeave, etc.). Decentralization of the protocol will follow: first to niche clouds, and finally to miners who provide computing power.
2.2 AI Market: More Room for Imagination
This sector is usually called the AI marketplace, which somewhat limits the imagination space. Strictly speaking, an “AI market” with real imagination space should be an intermediary platform that financializes the entire model pipeline, covering everything from the underlying computing power and data to the models themselves and related applications. As mentioned earlier, the main early contradiction of decentralized computing power is how to create demand, and a closed-loop market that financializes the whole AI chain has the opportunity to give birth to exactly that demand.
Something like this:
A web3-powered AI market is grounded in computing power and data: more valuable data attracts developers to build or fine-tune models, which in turn become the basis for applications, and developing and using these applications and models creates demand for computing power. Under token and community incentives, real-time data-collection tasks based on bounties or ongoing rewards for contributed data can expand the unique advantages of the market’s data layer, while popular applications feed more valuable data back into it.
Community
Beyond the token value mentioned earlier, the community is undoubtedly one of the biggest gains web3 brings and is the core driving force of such a platform. Data diversity, for example, is an advantage of these platforms: it is essential for building accurate, unbiased AI models, and it is also the bottleneck in the current data direction.
I think the core of the whole platform is the model. We realized early on that an AI marketplace’s success depends on the existence of high-quality models, so what incentive do developers have to provide models on a decentralized platform? But we also seem to have forgotten a problem: on infrastructure, web3 cannot out-build traditional platforms; on developer communities, it is less mature; on reputation, it lacks the first-mover advantage. Compared with the huge user bases and mature infrastructure of traditional AI platforms, web3 projects can only overtake on the curve.
The answer may lie in AI model financialization
Models can be treated as a commodity, and treating AI models as investable assets could be an interesting innovation for Web3 and decentralized markets. Such a marketplace allows users to participate directly in, and benefit from, the value-creation process of AI models. The mechanism also encourages the pursuit of higher-quality models and contributions to the community, since users’ returns are tied directly to a model’s performance and adoption;
Users can invest by staking on a model, and a revenue-sharing mechanism motivates users to pick and back promising models while giving developers an economic incentive to build better ones. Moreover, the most intuitive way for stakers to judge a model (especially an image-generation model) is to run it many times, which itself creates demand for the platform’s decentralized computing power; this may be one answer to the earlier question of “who would want less efficient, less stable computing power?” A minimal sketch of such a staking and revenue-sharing vault follows.
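The sketch promised above: a hypothetical staking-and-revenue-sharing vault (not any live protocol), showing how staker returns can be tied to a model’s paid usage:

```python
from collections import defaultdict


class ModelVault:
    """Toy vault: stakers back one model and split the fees it earns."""

    def __init__(self, developer_cut: float = 0.2):
        self.developer_cut = developer_cut          # dev's share of each fee
        self.stakes: dict[str, float] = defaultdict(float)
        self.rewards: dict[str, float] = defaultdict(float)
        self.dev_rewards = 0.0

    def stake(self, user: str, amount: float) -> None:
        self.stakes[user] += amount

    def record_inference_fee(self, fee: float) -> None:
        """Each paid inference splits its fee between the developer and the
        stakers pro rata -- stakers only profit if the model gets used."""
        self.dev_rewards += fee * self.developer_cut
        pool = fee * (1 - self.developer_cut)
        total = sum(self.stakes.values())
        if total == 0:
            self.dev_rewards += pool  # no stakers yet: dev keeps the rest
            return
        for user, s in self.stakes.items():
            self.rewards[user] += pool * s / total


vault = ModelVault()
vault.stake("alice", 300)
vault.stake("bob", 100)
for _ in range(50):                   # 50 paid inference calls at 1 token each
    vault.record_inference_fee(1.0)
print(vault.dev_rewards, dict(vault.rewards))  # 10.0 {'alice': 30.0, 'bob': 10.0}
```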
2.3 Onchain AI: OPML overtaking on the curve?
ZKML: trouble at both the demand and supply ends
What is certain is that on-chain AI is a direction full of imagination and worth in-depth research; breakthroughs here could bring unprecedented value to web3. At the same time, ZKML’s extremely high academic threshold and infrastructure requirements are simply not suitable for most startups, and most projects do not need trustless LLMs to achieve a breakthrough in their own value.
Moreover, not every AI model needs to be moved on-chain and made trustless with ZK. Just as most people don’t care how a chatbot reasons about a query before answering, they don’t care which version of the Stable Diffusion architecture or which parameter settings are being used. In most scenarios, users focus on whether the model gives a satisfactory output, not on whether the inference process is trustless or transparent.
If proving did not add a hundredfold (or worse) overhead to inference, perhaps ZKML could still put up a fight; but facing already high on-chain inference costs plus even higher proving costs, any demander has reason to question the necessity of on-chain AI.
From the demand side
What the user cares about is whether the model’s output makes sense; as long as the result is reasonable, the trustlessness ZKML brings is close to worthless.
If a neural-network trading bot delivers a hundredfold return every cycle, who would question whether its algorithm is centralized or verifiable?
Likewise, if the trading bot starts losing users money, the team should think about improving the model’s capabilities rather than spending energy and capital on making it verifiable. This is the contradiction in ZKML’s demand story: in many scenarios, the verifiability of a model does not fundamentally resolve people’s doubts about AI.
From the supply side
There is a long way to go before proving systems are sufficient to support large language models, and judging from the current attempts of the leading projects, the day a large model goes on-chain is almost nowhere in sight.
Referring to our previous article on ZKML: technically, the goal of ZKML is to convert neural networks into ZK circuits, and the difficulties are (a quantization sketch follows the list):
ZK circuits do not support floating-point numbers;
Large-scale neural networks are difficult to convert.
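The first difficulty is concrete enough to sketch. ZK circuits operate over finite-field integers, so a float model must be quantized to fixed point before circuit conversion; below is a minimal illustration (the scale factor and tiny linear layer are my own toy choices, not tied to any specific ZKML library):

```python
import numpy as np

SCALE = 2**16  # fixed-point scale: ~4-5 decimal digits of precision


def to_fixed(x: np.ndarray) -> np.ndarray:
    """Quantize floats to scaled int64 values a circuit can handle."""
    return np.round(x * SCALE).astype(np.int64)


def fixed_linear(w_f: np.ndarray, x_f: np.ndarray, b_f: np.ndarray) -> np.ndarray:
    """y = Wx + b entirely in integers; the product carries SCALE^2, so we
    rescale once -- the kind of op a circuit can express, unlike float math."""
    return (w_f @ x_f) // SCALE + b_f


w = np.array([[0.5, -1.25], [2.0, 0.75]])
x = np.array([1.5, -0.5])
b = np.array([0.1, -0.2])
y_float = w @ x + b
y_fixed = fixed_linear(to_fixed(w), to_fixed(x), to_fixed(b)) / SCALE
print(y_float, y_fixed)  # nearly identical; the gap is the quantization error
```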
From the current progress:
The latest ZKML libraries support ZK for some simple neural networks and are reportedly able to put basic linear regression models on-chain, but very few demos actually exist.
Theoretically, the maximum supported parameter count is ~100M, but that exists only on paper.
ZKML’s progress has not met expectations. Judging from the leading project Modulus Labs and the proofs of concept released by EZKL, some simple models can be converted into ZK circuits for on-chain deployment or on-chain inference proofs. But this is not even close to realizing ZKML’s promised value, and there seems to be no core motivation to break through the technical bottleneck: a track with a serious lack of demand cannot attract academic attention, which makes it even harder to produce PoCs good enough to attract or satisfy what demand remains. This may be the death spiral that kills ZKML.
OPML: Transition or Endgame?
The difference between OPML and ZKML: ZKML proves the complete inference process, while OPML re-executes part of the inference only when a result is challenged. Clearly the biggest problem OPML solves is high cost/overhead, which is a very pragmatic optimization; the dispute game sketched below shows the basic idea.
As the pioneer of OPML, the HyperOracle team laid out the architecture and the progression from one-phase to multi-phase opML in “opML is All You Need: Run a 13B ML Model in Ethereum”:
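To illustrate the challenge mechanism, here is a generic optimistic bisection game in the style of rollup fraud proofs (a toy, not HyperOracle’s actual protocol): submitter and challenger narrow their disagreement over the execution trace to a single step, and only that step is re-executed by the on-chain arbiter.

```python
def bisect_dispute(honest_trace: list, claimed_trace: list, step_fn) -> bool:
    """Return True if the claimed result survives the challenge.
    Simplification: both full traces are available here; in a real protocol
    the parties reveal commitments interactively, round by round."""
    lo, hi = 0, len(honest_trace) - 1
    while hi - lo > 1:                       # narrow to a single transition
        mid = (lo + hi) // 2
        if claimed_trace[mid] == honest_trace[mid]:
            lo = mid                         # agree up to mid: dispute is later
        else:
            hi = mid                         # disagree at mid: dispute is earlier
    # The on-chain arbiter re-executes just one step, not the whole model:
    return step_fn(claimed_trace[lo]) == claimed_trace[hi]


# States stand in for VM state roots; step_fn for one VM transition.
step = lambda s: s + 1
honest = list(range(10))                     # 0, 1, ..., 9
cheated = list(range(10)); cheated[7] = 99   # a faulty claim at step 7
print(bisect_dispute(honest, honest, step))   # True  - claim upheld
print(bisect_dispute(honest, cheated, step))  # False - fraud caught in one step
```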
Build a virtual machine for off-chain execution and on-chain validation, ensuring equivalence between the offline VM and the VM implemented in the on-chain smart contract.
To ensure efficient AI model inference inside the VM, implement a specially designed lightweight DNN library (not dependent on popular ML frameworks like TensorFlow or PyTorch), plus a script that converts TensorFlow and PyTorch models into this lightweight library.
Compile the AI model inference code into VM program instructions via cross-compilation.
Manage the VM image with a Merkle tree; only the Merkle root representing the VM state is uploaded to the on-chain smart contract (sketched below).
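A minimal sketch of that state-commitment step, assuming a power-of-two number of VM pages (simplified; real implementations handle odd page counts and hash-domain separation):

```python
import hashlib


def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def merkle_root(pages: list[bytes]) -> bytes:
    """Hash the VM memory pages pairwise up to a single 32-byte root."""
    level = [h(p) for p in pages]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]


vm_pages = [b"page0:registers", b"page1:heap", b"page2:model-weights", b"page3:io"]
root = merkle_root(vm_pages)
print(root.hex())  # one 32-byte on-chain commit to the full off-chain VM image
```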
However, a key flaw in this design is clear: all computation must be performed inside the virtual machine, which rules out GPU/TPU acceleration and parallel processing and limits efficiency. Hence the introduction of multi-phase opML:
Only in the final phase is the computation performed in the VM.
In the other phases, state-transition computation runs in a native environment that can exploit CPUs, GPUs, and TPUs and supports parallel processing, reducing reliance on the VM and raising execution performance to a level comparable to native environments.
LET’S BE REAL
One view holds that OPML is a transition on the way to full ZKML, but it is more realistic to regard it as a form of on-chain AI built on a trade-off between cost structure and realistic expectations of landing. Perhaps the day of full ZKML will never come (I am pessimistic about this, at least), and the hype around on-chain AI will eventually have to face the most realistic questions of deployment and cost. OPML may then be on-chain AI’s best practice, just as the OP and ZK ecosystems have never really been a substitute relationship.
That said, don’t forget that the earlier demand-side shortcoming still exists: OPML’s cost- and efficiency-based optimization does not fundamentally answer “if users care more about the reasonableness of results, why move AI on-chain for trustlessness at all?” Transparency, ownership, trustlessness: these buffs sound impressive, but do users really care? By contrast, the real embodiment of value should be the model’s reasoning ability.
I think this kind of cost optimization is an innovative and solid attempt technically, but in terms of value it still argues in a circle;
Perhaps the on-chain AI track itself is a hammer looking for a nail, but that is also fine: the development of an early industry is a matter of continuing to explore innovative combinations of cross-domain technologies and finding the best fit through continuous iteration.
2.4 Application Layer: 99% Frankenstein Mashups
I have to admit that AI’s attempts at the web3 application layer are moving forward, with seemingly everyone in FOMO, but 99% of the “integration” is still just plugging in GPT, and a project’s own value cannot be measured by GPT’s reasoning ability.
From the application layer, there are roughly two ways out:
Improve user experience and development efficiency with AI capabilities. Here AI is not the core highlight but a behind-the-scenes worker, possibly invisible to users. The combination with crypto needs to hit the point of highest fit and greatest value: on one hand, use AI as a productivity tool to improve efficiency and quality; on the other, use AI’s reasoning ability to improve the user’s game experience. AI and crypto do bring important value here, but fundamentally the technology is instrumental, and the project’s real advantage and core is still the team’s ability to build games.
Integrate with the AI marketplace and become an important, user-facing part of that whole ecosystem.
Third, finally…
If anything needs to be emphasized or summarized: AI remains one of the most noteworthy and most promising tracks in web3, and this overall logic will not change;
But I think the most noteworthy play is the AI marketplace. Fundamentally, this platform or infra design aligns with value creation and the interests of all parties; macroscopically, beyond the models or computing power themselves, a uniquely web3 way of capturing value is attractive enough, and it also lets users participate directly in the AI wave in a unique way.
Maybe in three months I’ll overturn my current idea again, so:
The above is just my candid opinion on this track, and it truly does not constitute any investment advice!
Reference
“opML is All You Need: Run a 13B ML Model in Ethereum”: __Ui5I9gFOy7-da_jI1lgEqtnzSIKcwuBIrk-6YM0Y