Decentralized storage has drawn plenty of attention in recent years, but the old problem remains: cost. When mainstream storage networks often charge thousands of dollars a year for 1TB, the supposed cost advantage is little more than a castle in the air.
Recently, one project has been breaking through with a different approach. Its core is erasure coding: instead of storing full copies of the data, it splits the data into fragments and adds parity, bringing the effective replication factor down to 4-5x and cutting costs by a factor of 80-100. Put another way, where the traditional solution costs 100 units, this one costs 1. How significant is that? Storing 1TB of data drops to around $50 a year.
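To make the arithmetic concrete, here is a minimal sketch in Python. The (k, n) code parameters and the $5,000 baseline price are assumptions picked to match the figures quoted above, not disclosed numbers from any specific project:

```python
# Illustrative arithmetic only: the (k, n) parameters and the baseline
# price are assumptions chosen to match the figures in the post,
# not published numbers from any specific network.

def expansion_factor(k: int, n: int) -> float:
    """A (k, n) erasure code splits data into k fragments, adds parity,
    and stores n fragments in total, so the on-disk footprint is n/k
    times the original size (the "replication factor" above)."""
    return n / k

k, n = 10, 45                        # hypothetical parameters, ~4.5x
baseline_cost_per_tb_year = 5_000.0  # "thousands of dollars", per the post
reduction = 100                      # the post cites an 80-100x decrease

print(f"erasure-coded footprint: {expansion_factor(k, n):.1f}x raw size")
print(f"cost after a {reduction}x cut: "
      f"${baseline_cost_per_tb_year / reduction:.0f}/TB/year")
# -> about $50/TB/year, the figure quoted above
```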
And this isn't reckless cost-slashing; it's done while keeping the data available. Even if two-thirds of the nodes go offline simultaneously, the data can still be recovered. For scenarios that genuinely need large-capacity storage, such as AI model training, game asset storage, and NFT projects, this change is more than just numbers on a ledger.
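Why can two-thirds of the nodes vanish without losing data? With a k-of-n erasure code, any k of the n fragments are enough to rebuild the original, so the data survives as long as at least k fragments stay reachable. Tolerating a 2/3 outage therefore needs k ≤ n/3, i.e. at least a 3x expansion, so the 4-5x factor above leaves headroom. A minimal sketch, reusing the same hypothetical (k=10, n=45) parameters and assuming one fragment per node:

```python
# Availability arithmetic for a hypothetical (k, n) erasure code,
# assuming one fragment per node. Any k fragments rebuild the data.

k, n = 10, 45

max_losses = n - k        # fragments the code can lose outright
offline = n * 2 // 3      # the post's worst case: 2/3 of nodes down
surviving = n - offline

print(f"tolerates up to {max_losses}/{n} lost fragments "
      f"({max_losses / n:.0%} of nodes)")
print(f"2/3 offline leaves {surviving} fragments (need {k}): "
      f"{'recoverable' if surviving >= k else 'lost'}")
```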
With costs this low, the question becomes how to activate the ecosystem. Projects of this kind are already drawing interest from developers and project teams, and could genuinely open up new room for Web3 infrastructure.
governance_lurker
· 8h ago
$50 a year? If it can really stay stable, the storage sector is going to explode.
VibesOverCharts
· 8h ago
$50 for 1TB? Now this is starting to get interesting. Those previous projects just kept shouting slogans, and it was really annoying.
MidsommarWallet
· 8h ago
$50 a year for 1TB? If that's true, centralized storage deserves a big slap.