With Ğ1 v1, the need for an off-chain storage mechanism became clear for data like:
- transaction comments
- profile data (contact info, profile picture, …)
- marketplaces (ğchange, airbnjune…)
The v1 approach was mixed and unsatisfying:
- transaction comments went into the main blockchain
- profile data and messaging went to Cesium+ pods, which had documentation and performance issues
- Ğchange ads went to Ğchange pods (a copy of the Cesium+ pods), with moderation issues on top
- airbnjune and others went to centralized services, for lack of an easy-to-use decentralized platform
All of this came with a degraded user experience that we do not want in the v2 ecosystem. The options for v2 datapods depend on the features we want, for example:
- cost (in Ğ1?) for data storage depending on the size of the data
- free storage without limits (spamming issues)
- moderation (ability to whitelist / blacklist ads, conversations…)
- speed (realtime / blockchain-like time…)
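To make the cost option more concrete, here is a minimal sketch of what a size-based storage fee could look like. Everything here (the free allowance, the price, the fee curve) is invented for illustration, not a proposal for actual values:

```javascript
// Hypothetical sketch of a size-based storage fee: a free allowance per
// document, then a flat price per started kilobyte. All numbers are made up.
const FREE_BYTES = 1024;   // free allowance per document (invented)
const CENTS_PER_KB = 10;   // hundredths of Ğ1 per extra KiB (invented)

function storageFee(sizeInBytes) {
  const billable = Math.max(0, sizeInBytes - FREE_BYTES);
  // Round up to whole kilobytes so small overruns still cost something.
  return Math.ceil(billable / 1024) * CENTS_PER_KB;
}

console.log(storageFee(512));   // within the free allowance → 0
console.log(storageFee(5120));  // 4 KiB over the allowance → 40 (0.40 Ğ1)
```

A fee like this makes spam expensive while keeping small profiles and comments essentially free.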
Note: this can be done after the migration to v2, but it’s nice to think about it early.
Note: this is linked to the topic “Proposition d'un système de stockage libre intégré à la blockchain pour toutes les données des utilisateurs (DHT)” [proposal for a free storage system integrated into the blockchain for all user data], but we can address it with a simpler approach.
Once we finish typing and pluginizing the indexer with @ManUtopiK, we will add a full Cesium+ profile dump, plus a mechanism to check user signatures and handle profile updates.
The same can be done for transaction comments (the field already exists in the indexer, and the comments are already imported by py-g1-migrator).
I think it could be a good option for the Ğ1 v2 migration: free, no quotas, centralized.
Then we could think of a better approach later?
The v2s indexer will then be organized into optional plugins.
You can run it for blockchain indexing only (v1, v2, or both), for the datapod only, or for both.
I also think that indexers can do many things easily with rock-solid and well-known technologies (PostgreSQL, Node.js), which means more devs and contributors.
Indexers have tremendous power to deliver machine- or human-exploitable blockchain information.
So an indexer is a good candidate to store more information, as a powerful complement to on-chain data.
It is centralized only for the off-chain information, but this could perhaps be solved with a federation mechanism as a plugin (pub/sub or another protocol).
For client software, it is simpler to connect to only two APIs (the RPC node and the GraphQL indexer), sometimes on the same domain, as a tightly coupled pair of synchronized, trustworthy information sources.
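For illustration, a client talking to the indexer could fetch on-chain and off-chain data about an account in a single round trip. The schema below is purely hypothetical (invented field names, not the actual indexer API):

```graphql
# Hypothetical schema, for illustration only.
query ProfileWithComments($pubkey: String!) {
  profile(pubkey: $pubkey) {        # off-chain, stored by the datapod plugin
    title
    description
    avatarUrl
  }
  transactions(issuer: $pubkey, limit: 10) {
    hash                            # on-chain, from blockchain indexing
    comment                         # off-chain comment attached by the datapod
  }
}
```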
To give an example of what can be done on-chain, Polkadot (and Kusama) implement the following:
On the main chain you can submit an extrinsic
identity.setIdentity(info) with data like name, website, riot, email…
The user can then ask a registrar to confirm a piece of information.
The data is not kept in chain storage, but only emitted as an event. You can browse these events with the Polkascan indexer, for instance: https://explorer.polkascan.io/polkadot/extrinsic?pallet=Identity&callName=set_identity. Here is an example of an identity declared that way: https://explorer.polkascan.io/polkadot/extrinsic/13915801-2.
It’s still necessary to protect against storage spam, at least by applying a quota to all accounts, a more restrictive quota for non-member accounts, and requiring that the account exists (i.e. satisfies the existential deposit).
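A minimal sketch of such a quota check, with invented numbers (the real budgets would have to be chosen carefully):

```javascript
// Hypothetical anti-spam quotas: member accounts get a larger byte budget
// than plain non-member accounts, and non-existing accounts get nothing.
const QUOTAS = { member: 100 * 1024, account: 10 * 1024 }; // bytes, invented

function canStore(account, usedBytes, newDocBytes) {
  if (!account.exists) return false; // must satisfy the existential deposit
  const quota = account.isMember ? QUOTAS.member : QUOTAS.account;
  return usedBytes + newDocBytes <= quota;
}

console.log(canStore({ exists: true, isMember: true }, 90 * 1024, 8 * 1024));  // true
console.log(canStore({ exists: true, isMember: false }, 9 * 1024, 2 * 1024)); // false
console.log(canStore({ exists: false, isMember: false }, 0, 1));              // false
```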
Or else Macron will probably crash your server