Supporting tokenized assets at scale

How we list & maintain standardized metrics for thousands of RWAs

Token Terminal has launched an asset-first data model for tokenized assets, or real-world assets (RWAs). We now track onchain metrics at both the project (issuer) level and the asset level. The new asset-first data model takes a token’s address and chain as inputs, then automatically lists the asset with our core metrics across all supported networks.

This means that Token Terminal is equipped to stay at the frontier of the tokenized assets market, with industry-leading coverage of stablecoins, tokenized funds, tokenized stocks, and more.

Asset-first dimensions

These dimensions allow us to break down the results by chain, combine the same asset across chains, or group assets by issuer or underlying exposure.

Standardized metrics per listed asset:

  • asset_market_cap_circulating

  • asset_price

  • asset_holders

  • asset_transfer_volume

  • asset_transfer_count

  • asset_mints

  • asset_redemptions
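To make the interplay between dimensions and metrics concrete, here is a minimal sketch of how asset-level rows could be rolled up along any dimension. The metric name `asset_market_cap_circulating` comes from the list above; the record shape, function names, and all figures are illustrative assumptions, not Token Terminal's actual schema or data.

```python
from collections import defaultdict

# Hypothetical asset-level records; the field names beyond the documented
# metric (asset_market_cap_circulating) and all numbers are placeholders.
records = [
    {"asset": "USDC", "chain": "ethereum", "issuer": "Circle",
     "asset_market_cap_circulating": 25_000_000_000},
    {"asset": "USDC", "chain": "base", "issuer": "Circle",
     "asset_market_cap_circulating": 3_000_000_000},
    {"asset": "USDT", "chain": "ethereum", "issuer": "Tether",
     "asset_market_cap_circulating": 60_000_000_000},
]

def rollup(rows, dimension):
    """Sum circulating market cap grouped by an arbitrary dimension."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dimension]] += row["asset_market_cap_circulating"]
    return dict(totals)

# The same records yield three views: per chain, per asset
# (combined across chains), and per issuer.
by_chain = rollup(records, "chain")
by_asset = rollup(records, "asset")
by_issuer = rollup(records, "issuer")
```

The point of the asset-first keying is exactly this: one set of rows, any grouping, with no per-asset special-casing.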

Why now

The number of tokenized assets is growing quickly. New stablecoins continue to be issued across chains. Tokenized funds and stocks are increasingly used to bring traditional assets onchain. DEXs and lending markets continue to create more LP position tokens, and as more DAOs form, they keep issuing new governance and ownership tokens.

We see this shift directly in customer requests: institutional investors are increasingly asking for a consistent set of core metrics that works the same way for all kinds of tokenized assets, across chains and asset categories.

Our new asset-first data model lets us cater to that demand at scale.

Investor impact

With an asset-first approach to data modeling, we’re able to massively expand coverage in a standardized manner. Examples of supported tokenized assets:

  • Stablecoins

  • Tokenized funds

  • Tokenized stocks

  • DEX LP tokens

  • Lending LP tokens

  • DAO governance tokens

The listings are automated through our in-house listing agent. In practice, onboarding a new asset means identifying its token address and chain, labeling it, and adding it to the supported asset universe.
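The onboarding steps described above can be sketched as a small data model. This is a hypothetical illustration of the flow, identify the (address, chain) pair, label it, add it to the supported universe; the class and function names are assumptions, not Token Terminal's actual listing agent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssetListing:
    """One listed asset, keyed by its token address and chain."""
    token_address: str
    chain: str
    label: str  # e.g. "stablecoin", "tokenized-fund"

# The supported asset universe; a set keeps (address, chain, label) unique.
SUPPORTED_ASSETS: set[AssetListing] = set()

def onboard(token_address: str, chain: str, label: str) -> AssetListing:
    """List a new asset: normalize the address, label it, add it to the universe."""
    listing = AssetListing(token_address.lower(), chain, label)
    SUPPORTED_ASSETS.add(listing)
    return listing

# Illustrative placeholder address, not a real token contract.
onboard("0xExampleTokenAddress", "ethereum", "stablecoin")
```

Because each listing is fully described by its address and chain, adding an asset is a data change rather than a code change, which is what makes the automated listing agent scale to thousands of assets.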

Upgrades like this to the underlying data models are only possible because Token Terminal runs an end-to-end, in-house data pipeline. By controlling each step, we can iterate on the schemas directly, something that wouldn’t be possible if we relied on third-party APIs.