The Future of Privacy in AI and Cryptocurrency
In yesterday’s article, I shared the risks to personal privacy that I encountered with AI applications.
This risk stems from the way today's AI applications operate: when an end user calls on AI, their data is uploaded directly to a foundation model in the cloud, which performs inference on that data and ultimately derives the user's behavioral traits.
The longer this data accumulates, the more comprehensively the foundation model's algorithms can understand the user's behavioral habits.
For individuals, this risk becomes a privacy leak; for companies, it turns into a disclosure of trade secrets.
In a video interview about AI last year, Jensen Huang also mentioned this risk; he said he strictly requires his company's employees, when using AI tools, to distinguish which data must remain on the local device and which data may be uploaded to the cloud.
At the time, his warning registered as mere information. It wasn't until I personally experienced this risk that it came back to mind.
This problem is only just beginning to surface now, but it will quickly escalate.
So I believe that the deeper we move into an era of AI adoption, the more we will need a device (a phone, glasses, or some form we can't yet imagine) that runs a streamlined foundation model locally. Such a device would process most sensitive data on-device, perform inference for simple requests, and upload only the heavy tasks, with their data "filtered" first, to the cloud for the complex foundation model to handle. This prevents the cloud foundation model from directly extracting a user's personal behavioral traits.
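The split described above can be sketched as a simple routing layer. This is only an illustration under assumed behavior: the redaction patterns, the "heavy task" heuristic, and the `handle` function are all hypothetical stand-ins for an on-device model and a cloud foundation model.

```python
import re

# Hypothetical sketch of the local/cloud split: simple requests stay
# on-device, heavy requests go to the cloud only after filtering.

SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # ID-like numbers (illustrative)
    re.compile(r"\b[\w.]+@[\w.]+\.\w+\b"),  # email addresses (illustrative)
]

def redact(text: str) -> str:
    """Filter sensitive fields before anything leaves the device."""
    for pat in SENSITIVE_PATTERNS:
        text = pat.sub("[REDACTED]", text)
    return text

def is_heavy(request: str) -> bool:
    """Toy heuristic: long requests need the cloud model."""
    return len(request.split()) > 50

def handle(request: str) -> str:
    if not is_heavy(request):
        # Simple inference stays on-device; raw data never leaves.
        return f"[local] {request}"
    # Heavy tasks go to the cloud, but only after filtering.
    return f"[cloud] {redact(request)}"

print(handle("Summarize my note: contact me at alice@example.com"))
# -> "[local] Summarize my note: contact me at alice@example.com"
```

In a real device, `redact` would be far more sophisticated (likely model-driven itself), but the principle is the same: the cloud model only ever sees filtered data, so it cannot profile the user directly.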
In the crypto ecosystem, meanwhile, privacy protection has long been on the agenda.
Earlier, Vitalik also noted that the transparency and openness of data on public chains like Ethereum can hinder their widespread use in commercial settings: in the commercial world, transaction counterparties often don't want to disclose transaction information, in order to protect trade secrets.
Not long ago, some commercial users pointed out that they are quite cautious about using stablecoins at scale, because stablecoin accounts on public chains are all public: anyone can see which accounts hold how many stablecoins. Once an account's identity information leaks, how much money (in stablecoins) a company holds becomes public knowledge.
So whether in AI or in the crypto ecosystem, privacy is a problem that must be addressed next.
However, if we compare how AI and the crypto ecosystem apply privacy, crypto applications look to be ahead, at least for now: the crypto ecosystem has long had privacy coins (such as Monero and Zcash) and mixers.
But these privacy applications have been doing everything they can to evade regulation, so they’ve inevitably picked up a number of negative labels to varying degrees.
The privacy applications that can truly be accepted by the public and regulators—at least based on what currently seems most mature and feasible—might still be solutions based on zero-knowledge proofs. For example, a method that has been experimented with:
Have an institution that has obtained a regulatory license act as an intermediary. Both transaction parties conduct the transaction through this intermediary, while concealing their identities and transaction information, and only place the final generated zero-knowledge proof on the public chain for verification.
This way, transaction information is kept private, and both transaction parties can also avoid being treated as suspects in criminal activities.
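The flow above can be illustrated with a toy commitment scheme. To be clear, a hash commitment is not a zero-knowledge proof; it merely stands in here for the proof artifact that would actually go on-chain. The `commit` and `verify` functions and the transaction format are all hypothetical.

```python
import hashlib
import secrets

# Toy illustration: the licensed intermediary sees the full transaction
# off-chain, while the public chain stores only a commitment. A real
# system would post a zero-knowledge proof instead of a bare hash.

def commit(tx: dict, nonce: bytes) -> str:
    """Intermediary commits to the transaction without revealing it."""
    payload = repr(sorted(tx.items())).encode() + nonce
    return hashlib.sha256(payload).hexdigest()

def verify(commitment: str, tx: dict, nonce: bytes) -> bool:
    """Anyone given the opening (tx + nonce) can check the commitment."""
    return commit(tx, nonce) == commitment

# Both parties transact through the intermediary off-chain...
tx = {"from": "Alice", "to": "Bob", "amount": 100}
nonce = secrets.token_bytes(16)

# ...and only the commitment lands on the public chain.
on_chain = commit(tx, nonce)

# Identities and amounts stay off-chain; the commitment alone reveals nothing.
assert verify(on_chain, tx, nonce)
```

The key difference in the real scheme is that a zero-knowledge proof lets the chain verify a statement about the transaction (e.g. that balances are conserved) without anyone ever opening the commitment, whereas this toy version requires revealing the transaction to whoever verifies it.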
I hope that in privacy protection and transaction compliance, the crypto ecosystem can leverage its own advantages to explore and apply solutions first, opening a new path for AI applications.