Source: IOBC Capital
Web3, as a decentralized, open, and transparent new paradigm of the internet, has a natural opportunity to integrate with AI. Under the traditional centralized architecture, AI computing and data resources are tightly controlled, creating challenges such as computational bottlenecks, privacy breaches, and algorithmic black boxes. Web3, built on distributed technology, can inject new vitality into AI through shared computing networks, open data markets, and privacy-preserving computation. In turn, AI can bring capabilities such as smart contract optimization and anti-cheating algorithms to Web3, supporting the growth of its ecosystem. Exploring the combination of Web3 and AI is therefore crucial for building the next generation of internet infrastructure and unlocking the value of data and computing power.
Data-Driven: A Solid Foundation for AI and Web3
Data is the core driving force behind the development of AI, just like fuel is to an engine. AI models require a large amount of high-quality data to gain deep understanding and powerful reasoning capabilities. Data not only provides the training foundation for machine learning models but also determines the accuracy and reliability of the models.
In the traditional centralized AI data acquisition and utilization model, there are several main problems:
1. High cost of data acquisition, which is difficult for small and medium-sized enterprises to afford.
2. Data resources being monopolized by tech giants, forming data silos.
3. Personal data privacy facing the risks of leakage and abuse.
Web3 can address the pain points of the traditional model with a new decentralized data paradigm. Grass allows users to sell idle network resources to AI companies, enabling decentralized acquisition of web data; after cleaning and transformation, this data provides real, high-quality training material for AI models. Public AI adopts a “label to earn” model, incentivizing workers around the world to participate in data labeling and aggregating their expertise to enhance data analysis capabilities. Blockchain data trading platforms such as Ocean Protocol and Streamr provide an open and transparent marketplace for data supply and demand, incentivizing data innovation and sharing.
However, real-world data acquisition still faces issues such as uneven data quality, difficult processing, and a lack of diversity and representativeness. Synthetic data may become the future star of Web3 data: generated with generative AI and simulation techniques, it can mimic the attributes of real data and serve as an effective supplement, improving data utilization efficiency. In fields such as autonomous driving, financial market trading, and game development, synthetic data has already demonstrated practical application value.
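As a minimal illustration of the idea (not any particular project’s pipeline), the sketch below fits a simple generative model to a small “real” dataset and samples new synthetic records from it; the dataset, column meanings, and model choice are all illustrative assumptions.

```python
# Toy sketch: generate synthetic tabular data by fitting a simple
# generative model to a small "real" dataset. Illustrative only --
# real synthetic-data pipelines use far richer models (GANs, diffusion
# models, simulators) and careful validation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)

# Pretend these are real, sensitive records (e.g. trade sizes and latencies).
real_data = np.column_stack([
    rng.lognormal(mean=3.0, sigma=0.5, size=500),   # trade size
    rng.normal(loc=20.0, scale=4.0, size=500),      # latency (ms)
])

# Fit a small Gaussian mixture as the generative model.
model = GaussianMixture(n_components=3, random_state=0).fit(real_data)

# Sample synthetic records that mimic the statistical shape of the originals.
synthetic_data, _ = model.sample(n_samples=2000)

print("real mean:     ", real_data.mean(axis=0))
print("synthetic mean:", synthetic_data.mean(axis=0))
```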
Privacy Protection: The Role of FHE in Web3
In the data-driven era, privacy protection has become a global focus. Regulations such as the General Data Protection Regulation (GDPR) in the European Union reflect strict protection of personal privacy. However, this also brings challenges: some sensitive data cannot be fully utilized due to privacy risks, which undoubtedly limits the potential and reasoning capabilities of AI models.
Fully Homomorphic Encryption (FHE) allows computation to be performed directly on encrypted data without ever decrypting it; when the result is decrypted, it matches the result of the same computation performed on the plaintext.
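As a minimal sketch of this property, the example below uses the open-source TenSEAL library (CKKS scheme) to evaluate a weighted sum on an encrypted vector; the parameters and values are illustrative assumptions, and production deployments require carefully chosen security parameters.

```python
# Minimal sketch of "compute on encrypted data" with the TenSEAL library
# (CKKS scheme). Parameters are illustrative; production settings require
# careful selection for security and precision.
import tenseal as ts

# Client side: create an encryption context and encrypt a feature vector.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

features = [0.5, 1.5, -2.0, 3.0]
weights = [0.1, 0.2, 0.3, 0.4]          # e.g. one layer of a tiny model

enc_features = ts.ckks_vector(context, features)

# Server side: evaluate a weighted sum directly on the ciphertext,
# without ever seeing the plaintext features.
enc_score = enc_features.dot(weights)

# Client side: decrypt and compare with the plaintext computation.
print("encrypted result:", enc_score.decrypt())                 # ~[0.95]
print("plaintext result:", sum(f * w for f, w in zip(features, weights)))
```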
FHE provides strong protection for privacy-preserving AI computation, enabling model training and inference to run on GPUs without ever touching the original plaintext data. This brings significant advantages to AI companies: they can securely open up API services while protecting their trade secrets.
FHEML supports encryption processing of data and models throughout the machine learning cycle, ensuring the security of sensitive information and preventing the risk of data leakage. In this way, FHEML strengthens data privacy and provides a secure computing framework for AI applications.
FHEML complements ZKML: ZKML proves that machine learning was executed correctly, while FHEML emphasizes computing on encrypted data to preserve data privacy.
Computational Revolution: AI Computing in Decentralized Networks
The compute used by state-of-the-art AI systems has been doubling roughly every three to four months, and the resulting demand far exceeds the supply of existing computing resources. Training OpenAI’s GPT-3, for example, would require the equivalent of roughly 355 years of computation on a single GPU. Such a shortage of computing power not only limits the advancement of AI technology but also puts advanced AI models out of reach for most researchers and developers.
Moreover, global GPU utilization is below 40%, while slowing microprocessor performance gains and chip shortages driven by supply-chain and geopolitical factors further constrain the supply of computing power. AI practitioners are thus caught in a dilemma: buy expensive hardware or rent cloud resources. They urgently need on-demand, cost-effective computing services.
IO.net is a decentralized AI computing network built on Solana. It aggregates idle GPU resources worldwide to provide an economical and accessible computing marketplace for AI companies. Buyers of computing power publish computational tasks on the network, smart contracts allocate the tasks to contributing GPU nodes, and the nodes execute the tasks, submit the results, and receive rewards after verification. IO.net’s approach improves resource utilization and helps relieve the computational bottleneck in AI and other fields.
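The task lifecycle described above (publish, allocate, execute, verify, reward) can be sketched as a toy in-memory simulation; this is a conceptual illustration, not IO.net’s actual contracts or SDK, and all names and values are hypothetical.

```python
# Toy simulation of a decentralized compute market's task lifecycle:
# publish -> allocate to a GPU node -> execute -> verify -> reward.
# Conceptual sketch only; not IO.net's actual contracts or SDK.
from dataclasses import dataclass, field

@dataclass
class Node:
    address: str
    free_gpus: int
    earned: float = 0.0

@dataclass
class Task:
    task_id: int
    gpus_needed: int
    reward: float
    expected_checksum: str        # known-answer check used for verification
    assigned_to: str | None = None
    done: bool = False

@dataclass
class ComputeMarket:
    nodes: list[Node] = field(default_factory=list)
    tasks: list[Task] = field(default_factory=list)

    def publish(self, task: Task) -> None:
        self.tasks.append(task)

    def allocate(self) -> None:
        # Greedy matching: give each pending task to the first node with capacity.
        for task in self.tasks:
            if task.assigned_to is None:
                for node in self.nodes:
                    if node.free_gpus >= task.gpus_needed:
                        node.free_gpus -= task.gpus_needed
                        task.assigned_to = node.address
                        break

    def submit_result(self, task_id: int, checksum: str) -> bool:
        task = next(t for t in self.tasks if t.task_id == task_id)
        node = next(n for n in self.nodes if n.address == task.assigned_to)
        node.free_gpus += task.gpus_needed
        if checksum == task.expected_checksum:      # naive verification step
            node.earned += task.reward
            task.done = True
            return True
        return False

market = ComputeMarket(nodes=[Node("node-A", free_gpus=4), Node("node-B", free_gpus=8)])
market.publish(Task(task_id=1, gpus_needed=6, reward=12.5, expected_checksum="abc123"))
market.allocate()
print(market.submit_result(1, "abc123"), market.nodes[1].earned)   # True 12.5
```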
In addition to general decentralized computing networks, there are platforms like Gensyn and Flock.io that focus on AI training, as well as dedicated computing networks like Ritual and Fetch.ai that focus on AI inference.
Decentralized computing networks provide a fair and transparent computing market, breaking monopolies, reducing application barriers, and improving computing utilization efficiency. In the Web3 ecosystem, decentralized computing networks will play a critical role in attracting more innovative DApps to promote the development and application of AI technology.
DePIN: Web3 Empowering Edge AI
Imagine that your phone, smartwatch, and even smart devices in your home have the ability to run AI – this is the charm of Edge AI. It enables computation to occur at the source of data generation, achieving low-latency, real-time processing, while protecting user privacy. Edge AI technology has already been applied in critical areas such as autonomous driving.
In the Web3 field, we have a more familiar name – DePIN. Web3 emphasizes decentralization and user data sovereignty, and DePIN enhances user privacy protection by processing data locally, reducing the risk of data leakage. Web3’s native token economy mechanism can incentivize DePIN nodes to provide computing resources, creating a sustainable ecosystem.
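A toy sketch of that design might look like the following: raw data stays on the device, only a compact summary leaves it, and the node is credited with a hypothetical token reward. Every name and value here is illustrative, not any specific DePIN project’s protocol.

```python
# Conceptual sketch of an edge/DePIN node: raw sensor data stays on the
# device, only a compact inference summary leaves it, and the node is
# credited with (hypothetical) tokens for the work. Illustrative only.
import hashlib
import statistics

def local_inference(readings: list[float]) -> dict:
    """Run a trivial 'model' on-device; raw readings are never uploaded."""
    return {
        "anomaly": max(readings) > 3 * statistics.median(readings),
        "summary_hash": hashlib.sha256(str(readings).encode()).hexdigest()[:16],
    }

def report_and_reward(node_id: str, result: dict, reward_per_report: float = 0.01) -> dict:
    """Send only the summary upstream and credit the node (mock token ledger)."""
    return {"node": node_id, "report": result, "reward": reward_per_report}

raw_readings = [1.1, 0.9, 1.3, 9.7, 1.0]          # stays local to the device
print(report_and_reward("edge-node-7", local_inference(raw_readings)))
```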
DePIN is currently developing rapidly within the Solana ecosystem, and Solana has become one of the preferred blockchain platforms for deploying such projects. Its high TPS, low transaction fees, and technical innovations provide strong support for DePIN projects. The market value of DePIN projects on Solana now exceeds $10 billion, and well-known projects such as Render Network and Helium Network have made significant progress.
IMO: A New Paradigm for AI Model Deployment
The concept of IMO (Initial Model Offering), which tokenizes AI models, was first proposed by Ora Protocol.
In the traditional model, due to the lack of profit-sharing mechanisms, developers often struggle to obtain continuous revenue from the subsequent use of AI models, especially when the models are integrated into other products and services. It is difficult for the original creators to track usage and gain profits. Additionally, the performance and effectiveness of AI models often lack transparency, making it difficult for potential investors and users to evaluate their true value, limiting the market recognition and commercial potential of the models.
IMO provides a new way of funding support and value sharing for open-source AI models. Investors can purchase IMO tokens to share the subsequent revenue generated by the models. Ora Protocol uses the ERC-7641 and ERC-7007 standards, combined with Onchain AI Oracle and OPML technology, to ensure the authenticity of AI models and enable token holders to share revenue.
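To make the revenue-sharing idea concrete, here is a toy pro-rata distribution in Python; it mirrors the intent of ERC-7641 (token holders share in a model’s revenue in proportion to their balances) but is only a conceptual sketch, not the standard’s on-chain implementation.

```python
# Toy pro-rata revenue sharing in the spirit of ERC-7641: holders of a
# model's tokens claim a share of revenue proportional to their balance.
# Conceptual Python sketch only; the actual standard is an Ethereum
# token contract, not off-chain code.
def distribute_revenue(balances: dict[str, int], revenue: float) -> dict[str, float]:
    total_supply = sum(balances.values())
    return {holder: revenue * bal / total_supply for holder, bal in balances.items()}

holders = {"model_creator": 500_000, "investor_a": 300_000, "investor_b": 200_000}
api_revenue = 10_000.0   # e.g. fees earned by the tokenized model this period

for holder, share in distribute_revenue(holders, api_revenue).items():
    print(f"{holder}: {share:.2f}")
# model_creator: 5000.00, investor_a: 3000.00, investor_b: 2000.00
```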
The IMO model enhances transparency and trust, encourages open collaboration, adapts to the trends of the crypto market, and injects momentum into the sustainable development of AI technology. IMO is still in the early stages of experimentation, but as market acceptance and participation expand, its innovation and potential value are worth looking forward to.
AI Agent: A New Era of Interactive Experience
AI Agents have the ability to perceive the environment, engage in independent thinking, and take appropriate actions to achieve predefined goals. With the support of large language models, AI Agents can not only understand natural language but also plan decisions and execute complex tasks. They can serve as virtual assistants, learn user preferences through interaction, and provide personalized solutions. In the absence of explicit instructions, AI Agents can also autonomously solve problems, improve efficiency, and create new value.
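The structure can be illustrated with a minimal perceive-plan-act loop; the “planner” below is a stub standing in for a large language model, and all names are illustrative.

```python
# Minimal perceive -> plan -> act loop illustrating how an AI Agent is
# structured. The "planner" is a stub standing in for a large language
# model; in a real agent it would be an LLM call that selects a tool.
from dataclasses import dataclass, field

@dataclass
class Agent:
    goal: str
    memory: list[str] = field(default_factory=list)

    def perceive(self, observation: str) -> None:
        self.memory.append(observation)

    def plan(self) -> str:
        # Stub planner: a real agent would prompt an LLM with goal + memory.
        if any("meeting at" in m for m in self.memory):
            return "add_calendar_event"
        return "ask_user_for_details"

    def act(self, action: str) -> str:
        tools = {
            "add_calendar_event": lambda: "Event added to calendar.",
            "ask_user_for_details": lambda: "Could you share the time and place?",
        }
        return tools[action]()

agent = Agent(goal="manage the user's schedule")
agent.perceive("User: I have a meeting at 3pm with the design team.")
print(agent.act(agent.plan()))   # -> "Event added to calendar."
```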
Myshell is an open, AI-native application platform that provides a comprehensive and user-friendly set of creation tools. It lets users configure a bot’s capabilities, appearance, and voice, and connect external knowledge bases. Myshell aims to build a fair and open AI content ecosystem, using generative AI to empower individuals to become super creators. It has trained specialized large language models to make role-playing more human-like, and its voice cloning technology accelerates personalized interaction with AI products, cutting the cost of voice synthesis by 99% and cloning a voice in as little as one minute. AI Agents customized on Myshell are already applied in fields such as video chat, language learning, and image generation.
In the fusion of Web3 and AI, the current focus is more on the exploration of the infrastructure layer. Key questions include how to obtain high-quality data, protect data privacy, host models on the blockchain, improve the efficient use of decentralized computing power, and verify large language models. With the gradual improvement of these infrastructures, we have reason to believe that the fusion of Web3 and AI will give birth to a series of innovative business models and services.