As you may have seen on Twitter, we are very interested in the current AI/LLM space. While the technology still has plenty of room for improvement, we see real potential for it to accelerate research.
The advent of large language models (LLMs) in the cryptocurrency space is revolutionizing the way non-technical players interact with, understand, and contribute to the industry.
Until recently, if you didn’t know how to code, you could feel completely lost. Large language models like ChatGPT are now bridging the gap between complex programming languages and everyday language. This matters because the cryptocurrency space is dominated by people with specialized technical expertise.
If you come across something you don’t understand, or suspect that a project is deliberately obscuring how its underlying system really works, you can ask ChatGPT and get a quick, almost free answer.
DeFi is democratizing access to finance, and large language models are democratizing access to DeFi.
In today’s article, we’ll walk through some of the ways we think large language models may affect DeFi.
1. DeFi security
As we’ve noted, DeFi is transforming financial services by reducing friction and overhead costs, as well as replacing large teams with efficient code.
We’ve detailed where DeFi is headed. DeFi:
Reduces friction costs – gas fees will eventually come down
Reduces overhead costs – there is no physical location, only code
Reduces labor costs – thousands of bankers are replaced by a hundred programmers
Allows anyone to provide financial services (such as lending and market making)
DeFi is a leaner operating model that doesn’t rely on a middleman for execution.
In DeFi, “counterparty risk” is replaced by software security risk. The code and mechanisms that protect your assets and facilitate your transactions are constantly at risk from external threats that try to steal and exploit funds.
AI, and LLMs in particular, can play a key role in automating the development and auditing of smart contracts. By analyzing a codebase and identifying patterns, AI can (over time) find vulnerabilities and optimize the performance of smart contracts, reducing human error and improving the reliability of DeFi protocols. By comparing contracts against databases of known vulnerabilities and attack vectors, LLMs can highlight areas of risk.
One area where LLMs are already a viable and accepted answer to software security problems is helping write test suites. Writing unit tests can be tedious, but it’s an important part of software quality assurance and is often neglected in the rush to get to market.
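To make this concrete, here is a minimal Python sketch of both ideas: grading a contract against a checklist of known vulnerability classes, and drafting unit tests. The ask_llm helper is a placeholder for whichever LLM API you actually use, and the checklist is only a sample, not an audit standard.

```python
# Minimal sketch of LLM-assisted contract review. ask_llm() is a placeholder
# for whatever LLM client you actually use (hosted API or local model).

KNOWN_ISSUE_CLASSES = [
    "reentrancy on external calls",
    "integer overflow / underflow",
    "unchecked return values from low-level calls",
    "missing access control on privileged functions",
    "oracle / spot-price manipulation",
]


def ask_llm(prompt: str) -> str:
    """Placeholder: send the prompt to your LLM provider and return its reply."""
    raise NotImplementedError("wire this up to your own LLM client")


def flag_known_risks(contract_source: str) -> str:
    """Ask the model to grade a contract against a fixed vulnerability checklist."""
    checklist = "\n".join(f"- {issue}" for issue in KNOWN_ISSUE_CLASSES)
    prompt = (
        "Review the following Solidity contract against these known "
        f"vulnerability classes:\n{checklist}\n\n"
        "For each class answer: not present / possibly present / present, "
        "and quote the relevant lines.\n\n"
        f"Contract source:\n{contract_source}"
    )
    return ask_llm(prompt)


def draft_unit_tests(function_source: str) -> str:
    """Ask the model to draft tests; a human still reviews and runs them."""
    prompt = (
        "Write unit tests (e.g. Foundry or Hardhat style) for the following "
        "contract function. Cover normal inputs, edge cases, and failure "
        f"modes:\n\n{function_source}"
    )
    return ask_llm(prompt)
```

Treat the output as a triage aid rather than an audit.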
However, there is a “dark side” to this: if LLMs can help you audit your code, they can also help hackers find ways to exploit it in the open-source world of crypto.
Fortunately, the crypto community is full of white hats and has bug bounty programs that help mitigate some of the risk.
Cybersecurity professionals do not advocate “security through obscurity.” Instead, they assume the attacker already knows the system’s code and its vulnerabilities. AI and LLMs can help detect insecure code automatically and at scale, which is especially valuable for non-programmers: more smart contracts are deployed every day than humans can audit, and capturing an economic opportunity (such as liquidity mining) sometimes means interacting with a new, popular contract without waiting for it to be battle-tested.
That’s where a platform like Rug.AI comes in, providing you with an automated assessment of new projects against known code vulnerabilities.
Perhaps the most revolutionary aspect is the ability of LLMs to help write code. As long as the user has a basic understanding of their needs, they can describe what they want in natural language, and LLMs can translate those descriptions into functional code.
This lowers the barrier to entry for creating blockchain-based applications, allowing a wider range of innovators to contribute to the ecosystem.
And that’s just the beginning. In our own experience, LLMs are better suited to refactoring code, or explaining what code does to a beginner, than to building brand-new projects. It’s important to give the model context and clear specifications; otherwise it’s “garbage in, garbage out.”
LLMs can also help those who don’t know how to code by translating smart contract code into natural language. Maybe you don’t want to learn programming, but you do want to make sure that the code of the protocol you’re using matches what the protocol promises.
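As a rough sketch of that “code to plain language” direction (again assuming a generic ask_llm placeholder for your LLM client), you could hand the model the contract together with the project’s own claims and ask where they diverge:

```python
# Sketch: have an LLM explain a contract in plain language and compare it with
# what the project claims. ask_llm() is a placeholder for your LLM client.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your own LLM client")


def explain_and_compare(contract_source: str, project_claims: str) -> str:
    prompt = (
        "Explain in plain English what this smart contract actually does: "
        "who can move funds, what fees exist, and what the owner can change. "
        "Then compare that behavior with the project's stated claims and list "
        "any mismatches.\n\n"
        f"Project claims:\n{project_claims}\n\n"
        f"Contract source:\n{contract_source}"
    )
    return ask_llm(prompt)
```

The same “confident mistakes” caveat applies: treat the answer as a starting point for questions, not a verdict.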
Although we doubt that LLMs will replace high-quality developers in the short term, developers can use them as an extra sanity check on their own work.
Conclusion: crypto has become much simpler and safer for all of us. Just be careful not to over-rely on these LLMs: they sometimes make mistakes with great confidence, and their ability to fully understand and predict the behavior of code is still developing.
2. Data analysis and insights
When collecting data in the cryptocurrency space, you’ll come across Dune Analytics sooner or later. If you haven’t heard of it, Dune Analytics is a platform that lets users create and publish data analytics visualizations, with a primary focus on Ethereum and related blockchains. It is a useful and user-friendly tool for tracking DeFi metrics.
Dune Analytics already offers GPT-4-powered features that can interpret queries in natural language.
If you’re confused by a query, or want to create or edit one, you can turn to ChatGPT. Note that it will perform better if you provide some example queries in the same conversation, and you’ll still want to learn the basics yourself so you can validate ChatGPT’s work. It’s a great way to learn, though: you can question ChatGPT the way you would a tutor.
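For example, here is a rough sketch of the “provide example queries” tip: include a query you already trust in the prompt and ask for a variation. The table and column names below (dex.trades, block_time, amount_usd) are illustrative only; check Dune’s current schema before relying on them.

```python
# Sketch: prompt an LLM to adapt an existing Dune query. The example table and
# columns are illustrative only -- verify them against Dune's current schema.

EXAMPLE_QUERY = """
SELECT date_trunc('day', block_time) AS day,
       SUM(amount_usd)               AS volume_usd
FROM dex.trades
WHERE project = 'uniswap'
GROUP BY 1
ORDER BY 1
"""


def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your own LLM client")


def adapt_query(change_request: str) -> str:
    prompt = (
        "Here is a Dune Analytics SQL query I already use:\n"
        f"{EXAMPLE_QUERY}\n"
        f"Modify it as follows, and explain every change you make: {change_request}"
    )
    return ask_llm(prompt)


# e.g. adapt_query("show weekly volume per DEX instead of daily Uniswap volume")
```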
LLMs significantly lower the barrier to entry for non-technical cryptocurrency participants.
When it comes to generating unique insights, though, LLMs are disappointing. In complex, rational financial markets, don’t expect an LLM to hand you the right answers; if you trade on instinct and intuition, you’ll find it falls far short of your expectations.
However, we’ve found one effective use: checking whether you’ve missed something obvious. You’re far less likely to get non-obvious or contrarian insights that actually pay off. This is not surprising (anyone who builds an AI that delivers outsized market returns isn’t going to release it to the wider public).
3. “Will the Discord moderators disappear?”
In the cryptocurrency space, moderating a community of users who are passionate about a hot project, and whose needs keep shifting, is one of the most thankless and painful jobs. The same common questions get asked over and over, sometimes back to back. This looks like exactly the kind of pain point LLMs should be able to solve.
LLMs have also shown some accuracy in detecting whether messages are self-promotional (spam). We expect this to extend to detecting malicious links (and other attacks) as well. It’s genuinely hard to manage a busy Discord server with thousands of active members and a constant stream of posts, so we’re looking forward to LLM-powered Discord bots that help.
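A minimal sketch of what such a bot might look like, using the discord.py library for the wiring; the classification prompt and the ask_llm helper are assumptions you would adapt to your own setup and moderation policy.

```python
# Sketch of an LLM-assisted moderation bot built on discord.py. The prompt and
# ask_llm() helper are placeholders; a real bot needs rate limiting and logging.
import discord


def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your own LLM client")


def looks_suspicious(text: str) -> bool:
    prompt = (
        "You moderate a crypto project's Discord. Answer only YES or NO: is "
        "this message likely spam, undisclosed self-promotion, a phishing "
        f"link, or a scam?\n\nMessage:\n{text}"
    )
    return ask_llm(prompt).strip().upper().startswith("YES")


intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)


@client.event
async def on_message(message: discord.Message) -> None:
    if message.author.bot:
        return
    # A blocking LLM call inside an async handler is fine for a sketch; a real
    # bot would offload it to a thread or task queue.
    if looks_suspicious(message.content):
        await message.channel.send("Flagged for moderator review.")


client.run("YOUR_BOT_TOKEN")  # placeholder token
```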
4. “Whimsical things”
A recurring meme in the crypto space is launching tokens based on popular memes. These range from memes with staying power, like DOGE, SHIB, and PEPE, to random tokens based on the day’s trending search terms that disappear within an hour (mostly scams, which we avoid engaging with).
If you had access to the Twitter firehose API, you could track cryptocurrency sentiment in real time, train an LLM to flag emerging trends, and then have humans interpret the nuances. A simple example application: when a viral moment hits, launch a meme token based on the sentiment analysis.
Perhaps there is a way to build a poor man’s version of a sentiment tracker that monitors a subset of popular crypto influencers across multiple social media channels, without the cost and bandwidth of a firehose-style API feed.
LLMs are a good fit here because they can read context (parsing online sarcasm and in-jokes to extract real signal). Such an LLM companion would evolve and learn alongside the crypto industry, where most of the action is discussed on crypto Twitter. With its open debate forums and open-source technology, crypto provides a unique environment for LLMs to capture market opportunities.
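One way the poor man’s sentiment tracker described above could be sketched, assuming you already collect posts from the influencers you follow (gathering that feed is its own problem and is not shown here); the scoring prompt and ask_llm helper are, as before, placeholders.

```python
# Sketch: score a batch of influencer posts for crypto sentiment and surface
# an average per ticker for a human to interpret. Assumes you already collect
# the posts elsewhere; ask_llm() is a placeholder for your LLM client.
from collections import defaultdict
from statistics import mean


def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your own LLM client")


def score_post(text: str) -> float:
    prompt = (
        "Rate the sentiment of this crypto-related post from -1 (very bearish) "
        "to 1 (very bullish). Account for sarcasm and in-jokes. Reply with a "
        f"single number only.\n\nPost:\n{text}"
    )
    try:
        return float(ask_llm(prompt).strip())
    except ValueError:
        return 0.0  # unparseable reply -> treat as neutral


def sentiment_by_ticker(posts: list[dict]) -> dict[str, float]:
    """posts: [{'ticker': 'PEPE', 'text': '...'}, ...] from your own collector."""
    scores = defaultdict(list)
    for post in posts:
        scores[post["ticker"]].append(score_post(post["text"]))
    return {ticker: mean(values) for ticker, values in scores.items()}
```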
But to avoid being fooled by deliberate social media manipulation (astroturfed grassroots campaigns, undisclosed sponsorships, and online trolls), the technology will need to be more sophisticated. In another article, we covered an interesting third-party research report suggesting that some entities may have deliberately manipulated social media to inflate the value of crypto projects connected to FTX/Alameda.
NCRI’s analysis shows that bot-like accounts made up a significant share (around 20%) of the online discussion mentioning FTX-listed coins.
This bot-like activity foreshadowed the price movements of many FTX-listed coins in the data sample.
After FTX promoted these coins, the discussion around them became increasingly inauthentic over time: the share of inauthentic, bot-like comments steadily rose to roughly 50% of the total.