Tokenization Is Not Just Putting Assets on a Blockchain
Joseph Wang · 12 min read
A few years ago, tokenization was mostly pitched as a way to cut expensive assets into smaller pieces: a skyscraper divided into ten thousand tokens, a Picasso anyone could own a sliver of. That framing was intuitive. It was also mostly a promise.
The relevant shift is not a new token standard. It is who is using tokenization now, and what problem they are trying to solve.
BlackRock, J.P. Morgan, and DTCC are not chasing the art-token story. They are building around money-market funds, Treasuries, regulated securities, and the rails that move them. DTCC is now targeting initial limited production trades of tokenized securities in July 2026, with a full service launch planned for October. The relevant question is not whether tokenization will exist, but which parts of the financial system it can actually rewire.
Three things made the topic more concrete than it was a few years ago. Stablecoins proved that tokenized money could operate at scale. Higher-yielding short-term instruments made tokenized Treasury and money-market products economically useful, not just technically interesting. And regulated institutions are now testing tokenization in issuance, collateral, and settlement workflows rather than treating it as a retail-access story.
Large financial firms are not optimizing for “anyone can buy a slice of a painting.” They are optimizing for settlement, collateral, and operability on shared digital rails, still inside regulation and still anchored to off-chain legal claims. Note that distinction; it drives most of the argument below.
A Token Is a Representation, Not the Asset Itself
The word token predates blockchain by centuries. A gift voucher is a token. An access card is a token. A mobile payment credential is a token. Each is a stand-in: a thing that represents a right, not the right itself.
A blockchain token works the same way, with one important addition: its state can be recorded on a ledger that is cryptographically verifiable and operated through programmable rules, rather than being reconciled only through a chain of separate private databases.
That addition matters for finance. The core structure, however, does not change: the token is not the asset. It is a representation of a claim on the asset.
A stablecoin token represents a claim on a fiat reserve or the issuer’s balance sheet. A tokenized Treasury fund token represents a share in a fund that holds U.S. government debt. A tokenized real estate token represents, at best, some form of legal interest in a property that still exists in the physical world, still requires courts to enforce, and still needs someone to maintain and manage it.
This is not a minor technical footnote. It is where much of the real complexity in tokenization begins.
What Tokenization Actually Means
Tokenization is the process of representing rights to money, securities, or other assets as digital tokens on a blockchain, so that ownership and transfer can be recorded and operated through programmable infrastructure.
The key words are rights and programmable. Tokenization is not about creating a digital copy of an asset. It puts the claim to an asset into a form that a programmable system can read, verify, transfer, and settle.
A rough taxonomy of what exists today:

- Tokenized money: stablecoins and deposit tokens, each a claim on fiat reserves or an issuer's balance sheet.
- Tokenized funds: Treasury and money-market fund shares issued in token form, a claim on a regulated fund.
- Tokenized securities: equities and bonds carried as tokens with the same legal rights as their traditional form.
- Tokenized physical assets: real estate, art, and private credit, where the token represents some legal interest in an off-chain asset.
- Native crypto assets: ETH, BTC, and similar, which are the asset itself rather than a claim on one.

Each move down the list tends to add more off-chain dependencies: custody, valuation, legal enforcement, and dispute resolution. These have to work correctly for the token to mean anything.

Native crypto assets like ETH and BTC are different because they do not point to an external claim. Every other category above eventually runs into the question of who guarantees the asset underneath.
Why Token Standards Matter
On smart-contract platforms such as Ethereum, a token is usually not just a database entry. It is a contract interface that other software needs to understand. If every issuer implemented balances, transfers, and approvals differently, wallets, exchanges, and analytics tools would need a custom integration for every token.
ERC-20 became the basic interface for fungible tokens on Ethereum by standardizing a small set of operations: how to read total supply and balances, how to transfer tokens, how to approve another address to spend on your behalf, and which events are emitted when balances move. A stablecoin, a governance token, and a tokenized fund share can represent very different claims while still exposing a common interface to wallets, exchanges, and other contracts.
Regulated assets often need more than basic ERC-20 behavior. Standards such as ERC-1404 add transfer restrictions, allowing an issuer to reject transfers to wallets that are not whitelisted or do not satisfy a rule. Tokenization is not the removal of financial rules. It is often the relocation of some of those rules into programmable interfaces.
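To make the interface idea concrete, here is a minimal in-memory Python sketch, not Solidity and not any real implementation: a token exposing the ERC-20-style read, transfer, and approve operations, and a subclass adding an ERC-1404-style whitelist restriction. All names and rules are illustrative.

```python
# Minimal in-memory sketch of the ERC-20 interface ideas (illustrative only).
class Token:
    def __init__(self, name, initial_holders):
        self.name = name
        self.balances = dict(initial_holders)  # address -> balance
        self.allowances = {}                   # (owner, spender) -> amount
        self.events = []                       # emitted Transfer events

    def total_supply(self):
        return sum(self.balances.values())

    def balance_of(self, addr):
        return self.balances.get(addr, 0)

    def transfer(self, sender, to, amount):
        if self.balance_of(sender) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount
        self.events.append(("Transfer", sender, to, amount))

    def approve(self, owner, spender, amount):
        # Allow `spender` to move up to `amount` on behalf of `owner`.
        self.allowances[(owner, spender)] = amount


# ERC-1404-style restriction layer: reject transfers to wallets
# that are not on the issuer's whitelist.
class RestrictedToken(Token):
    def __init__(self, name, initial_holders, whitelist):
        super().__init__(name, initial_holders)
        self.whitelist = set(whitelist)

    def transfer(self, sender, to, amount):
        if to not in self.whitelist:
            raise PermissionError(f"{to} is not an eligible holder")
        super().transfer(sender, to, amount)
```

The point of the sketch is the shape, not the mechanics: any wallet or exchange that understands the base interface can handle both tokens, while the restricted subclass relocates an eligibility rule into the transfer path itself.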
Why Tokenize at All?
The traditional pitch for tokenization tends to lead with fractional ownership and democratized access: anyone can own a sliver of a building, a piece of a private equity fund, a corner of an otherwise illiquid market. These are real benefits, at least in theory. But for financial institutions, they are rarely the primary motivation.
The more serious case for tokenization is about market infrastructure, not ticket size.
The institutional pitch is different: faster settlement, programmable transfer rules, shared ownership records, collateral movement, and fewer reconciliation layers. Fractional ownership is the easiest benefit to explain, but the more durable value is cleaner infrastructure between issuance, transfer, and settlement.
BlackRock describes tokenization as updating the plumbing of the financial system. DTCC’s interest is in extending existing regulated market infrastructure: improving collateral workflows, back-office operations, and interoperability, not putting art on a blockchain. The institutional language is deliberately operational.
What Is Actually Being Tokenized Right Now?
Three trends describe where tokenization has actually landed between 2023 and 2026. This is not a prediction; it is an observable pattern in which assets moved from concept to production.
Trend 1: Tokenized money came first
Stablecoins are by far the largest tokenized asset category: roughly $302 billion in aggregate market value as of early May 2026, compared to about $10 billion in tokenized Treasuries and just over $1 billion in tokenized public equities. The reason is structural. Every tokenized market eventually needs a money leg, something on-chain that does not fluctuate in value. Stablecoins filled that gap first. Headline transfer volumes should not be read as payment volume, since a large share reflects trading, arbitrage, and internal movements. The narrower point is still important: stablecoins proved that tokenized money can operate at scale before more complex tokenized assets did.
On the institutional side, J.P. Morgan’s JPMD deposit token became available to clients on Base in 2025, and BIS research frames tokenized central-bank reserves and commercial bank deposits as the long-run foundation of unified settlement infrastructure, not stablecoins as a permanent substitute.
Trend 2: Tokenized funds are scaling as institutional rails
Among tokenized non-money assets, tokenized Treasuries and money-market funds are where institutional adoption has actually concentrated. The reasons are not surprising: these instruments are standardized, heavily regulated, have well-understood valuations, and sit at the center of institutional cash-management workflows. Tokenizing them does not require reinventing their legal structure. It means issuing fund shares as tokens that can be transferred, used as collateral, or redeemed programmatically on top of an existing legal foundation.
Two products illustrate the institutional pattern clearly. BlackRock’s BUIDL fund stood at roughly $1.88 billion in AUM as of early May 2026 and had expanded to additional chains, with its shares beginning to serve as eligible collateral in certain trading contexts. Franklin Templeton’s BENJI suite reached $1.98 billion in AUM by late April 2026, with its FOBXX fund reporting net assets of $824.6 million. FOBXX is the first U.S.-registered mutual fund to use a public blockchain as its official system of record for ownership. In other words, the on-chain state is the legal record, not a mirror of a separate register. The BENJI platform has added intraday yield accrual, permissioned wallet-to-wallet transfers, and stablecoin-based subscription and redemption as built-in capabilities. This is what “fund operations on programmable rails” means in practice: not just issuance, but the full operational cycle integrated into the same system.
A small on-chain check points in the same direction. For this check, I reconstructed weekly supply and visible holder counts from ERC-20 `Transfer` events on Ethereum. Mints appear as transfers from the zero address, burns as transfers to the zero address, and visible holders are addresses with a positive reconstructed balance at each weekly snapshot.
In the Ethereum deployments examined above, BUIDL and USYC both reached large on-chain supply while their visible holder bases remained in the dozens of addresses. Address counts are not investor counts, since custodial or omnibus wallets can sit behind a single address. The conclusion is also not driven by dust balances: the counts remain in the same order of magnitude under stricter balance thresholds. The pattern fits the broader picture. The first large tokenized-fund products look more like institutional instruments moving onto programmable rails than mass-retail fractional-ownership networks.
A second check tells the same story from the balance side. At the latest snapshot, visible on-chain balances were concentrated in a small number of addresses: the top 10 addresses held 84% of reconstructed BUIDL balances and 99% of reconstructed USYC balances on Ethereum. This is not a measure of beneficial ownership; custodial and omnibus wallets can aggregate many end users. It reinforces the same point: these deployments look more like institutional rails than broad retail ownership networks.
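The reconstruction described above can be sketched in a few lines of Python. The event list here is invented for illustration; it is not BUIDL or USYC data, and a real check would read `Transfer` logs from an Ethereum node.

```python
# Reconstruct supply, visible holders, and top-N concentration from
# ERC-20 Transfer events (from_addr, to_addr, value). Mints appear as
# transfers from the zero address; burns as transfers to it.
ZERO = "0x0"

def reconstruct(events, dust_threshold=0):
    balances = {}
    for frm, to, value in events:
        if frm != ZERO:                      # outgoing leg (absent for mints)
            balances[frm] = balances.get(frm, 0) - value
        if to != ZERO:                       # incoming leg (absent for burns)
            balances[to] = balances.get(to, 0) + value
    # Visible holders: addresses with a positive reconstructed balance.
    holders = {a: b for a, b in balances.items() if b > dust_threshold}
    supply = sum(holders.values())
    return supply, holders

def top_share(holders, n=10):
    """Fraction of reconstructed supply held by the n largest addresses."""
    ranked = sorted(holders.values(), reverse=True)
    total = sum(ranked)
    return sum(ranked[:n]) / total if total else 0.0

# Illustrative events: two mints, one secondary transfer, one redemption.
events = [
    (ZERO, "custodian_a", 900),   # mint
    (ZERO, "custodian_b", 90),    # mint
    ("custodian_a", "desk_c", 10),
    ("desk_c", ZERO, 5),          # burn (redemption)
]
supply, holders = reconstruct(events)
```

Running the same loop over real `Transfer` logs, weekly snapshots fall out by replaying events up to each cutoff block, and the dust threshold makes it easy to confirm that holder counts are not driven by near-zero balances.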
Physical assets such as real estate, art, and private credit still dominate many RWA explainers, but they are usually harder to execute: custody, valuation, legal treatment, and redemption are less standardized. For that reason, they are a weaker starting point for understanding where tokenization is working today.
Trend 3: The frontier moved from issuance to infrastructure
The most significant institutional activity in 2025–2026 is not about minting new token types. It is about connecting tokenized assets to post-trade workflows: investor registers, capital calls, distributions, settlement, collateral management, and reconciliation.
Kinexys Fund Flow, which completed its first transaction in 2025, illustrates the shift. Its focus is tokenizing the investor register and transactional data of an alternative fund, so that capital calls, distributions, and fund transfers can settle near-instantly through blockchain deposit accounts rather than through the current batch-processing cycle. The bottleneck being addressed is post-trade operations, not token format.
DTCC’s approach to the same problem is to extend existing regulated market infrastructure rather than replace it. Its DTC Tokenization Service, backed by an SEC no-action letter, targets initial limited production trades in July 2026 and a full launch in October, beginning with high-liquidity securities such as Russell 1000 names, index ETFs, and U.S. Treasuries. The explicit design constraint is that tokenized assets must carry the same investor protections, entitlements, and ownership rights as their traditional equivalents.
BIS Project Agorá points to the same endpoint: tokenized commercial bank deposits and central bank money interacting on shared programmable infrastructure. That is the infrastructure version of tokenization, not the retail-asset version.
U.S. market-structure legislation such as the CLARITY Act is part of the same backdrop. The bill is not mainly about tokenized fund mechanics, but it reflects the broader institutional demand for clearer boundaries around digital assets, intermediaries, and trading venues. For tokenization, that matters less as a short-term catalyst and more as a signal: regulated products need regulated rails.
How Tokenization Works at a High Level
Stripping away implementation details, tokenization involves a layered process. Each layer matters, and each can fail independently.
1. Define the underlying asset. Is this a claim on cash, a fund share, a debt instrument, a commodity, or a real-estate interest? The answer determines the entire legal and operational architecture downstream.
2. Create the legal claim. What does a token holder actually hold? This is the most important question in the entire stack, and it is answered off-chain by lawyers, regulators, and contract structures, not by the blockchain.
3. Place or verify the asset. Who holds the underlying? A custodian bank, a fund administrator, a reserve auditor? The on-chain token is only as good as the off-chain answer to this question.
4. Issue the token. The on-chain step. A smart contract records token balances, enforces transfer rules, and handles minting and burning.
5. Apply transfer rules. Many tokenized assets are restricted: only KYC-verified investors, only certain jurisdictions, only accredited investors. These rules can be embedded in the token contract or enforced at the platform layer.
6. Support redemption and settlement. What happens when someone wants out? Who processes the redemption? How does the on-chain state reconcile with the off-chain register of legal owners? Who handles corporate actions, dividends, or interest payments?
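The layered process can be sketched as a toy model, assuming a simple KYC whitelist and an off-chain register updated alongside each on-chain move; every name and rule here is an illustrative assumption, not any real product's design.

```python
# Toy lifecycle of a tokenized fund share: issue (mint), restricted transfer,
# and redemption (burn), with an off-chain legal register kept in lockstep
# with the on-chain token state. Illustrative assumptions throughout.
class TokenizedFund:
    def __init__(self, eligible_investors):
        self.eligible = set(eligible_investors)  # KYC-verified wallets
        self.balances = {}                       # on-chain token state
        self.register = {}                       # off-chain legal register

    def _check(self, investor):
        if investor not in self.eligible:
            raise PermissionError(f"{investor} has not passed KYC")

    def subscribe(self, investor, shares):
        """Mint tokens against a cash subscription."""
        self._check(investor)
        self.balances[investor] = self.balances.get(investor, 0) + shares
        self.register[investor] = self.balances[investor]  # reconcile

    def transfer(self, sender, to, shares):
        self._check(to)
        if self.balances.get(sender, 0) < shares:
            raise ValueError("insufficient shares")
        self.balances[sender] -= shares
        self.balances[to] = self.balances.get(to, 0) + shares
        # Settlement is complete only once the legal register matches.
        self.register[sender] = self.balances[sender]
        self.register[to] = self.balances[to]

    def redeem(self, investor, shares):
        """Burn tokens and pay out; the register must reflect the burn."""
        if self.balances.get(investor, 0) < shares:
            raise ValueError("insufficient shares")
        self.balances[investor] -= shares
        self.register[investor] = self.balances[investor]

    def reconciled(self):
        return self.balances == self.register
```

The interesting failure modes live in the gap the `reconciled` check papers over: in production, the register is maintained by a transfer agent or administrator, and keeping it continuously consistent with on-chain state across corporate actions is exactly the hard part described next.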
The hard part of tokenization is rarely the token issuance. The hard part is maintaining a correct and legally enforceable mapping between the token state on-chain and the real-world claim it represents, continuously, across transfers, corporate actions, and redemptions.
What Tokenization Does Not Magically Solve
There is a version of the tokenization story where every inefficiency in financial markets dissolves once assets are on-chain. That story is wrong in at least four specific ways.
It does not automatically create liquidity.
A token is a format, not a market. Dividing a building into 10,000 tokens does not produce 10,000 buyers. Liquidity comes from market structure, matching mechanisms, price discovery, investor base, and regulatory treatment. A tokenized asset with no secondary market can be just as illiquid as its untokenized equivalent. The history of real-estate tokenization is littered with this mistake.
It does not remove off-chain dependence or settlement work.
For any asset with an off-chain analog, such as a fund, a bond, a property, or a commodity, the token’s value depends on off-chain infrastructure working correctly: custodians, valuations, legal claims, and dispute resolution. A token transfer can execute in seconds, but settlement in the commercial sense also requires confirming the legal change of ownership, updating the investor register, and reconciling cash. For regulated assets, these steps still exist after the on-chain transaction confirms. Many tokenized asset failures have not been about the token. They have been about the operational and legal layer around it.
It does not eliminate regulation.
A tokenized security is almost always still a security. A tokenized fund share is still a regulated instrument. Security token offerings (STOs), the regulated counterpart to early-era ICOs, were explicitly designed to treat tokens as securities subject to existing laws, governed by frameworks like MiFID II in the EU. Institutions are not moving toward tokenization because it bypasses regulation. They are moving toward it within regulated frameworks, because the operational benefits are real and the legal treatment is becoming clearer. DTCC’s explicit goal is to extend existing investor protections into the tokenized format, not to remove them.
It does not make every asset equally suitable.
The assets that tokenize well first are those that are standardized, have clear valuations, have existing financial intermediaries with high operational friction, and have a defined legal framework for ownership and transfer. Cash and Treasuries fit this profile almost perfectly. A one-of-a-kind commercial property in a jurisdiction with ambiguous title law does not.
A simple way to filter tokenization claims is to ask two questions: is the legal claim clear, and is the existing workflow painful enough that programmable rails are worth the integration cost?
The main mental model is simple: tokenization does not change what an asset fundamentally is or how its risks work. It changes how the asset’s ownership and transfer are recorded and operated. That can be valuable, sometimes very valuable, but it is a different claim than “putting it on-chain makes it better.”
The useful question is no longer whether tokenization will happen. It is which workflows have both the legal clarity and the operational friction to make tokenization worth the integration cost. That question has a different answer for a treasury operations team than for a real-estate fund. The assets and institutions moving fastest are the ones where the answer is unambiguous.
Further Reading
- Blueprint for the Future Monetary System: BIS Annual Economic Report, 2023. Foundational framing for tokenization, unified ledgers, and future settlement infrastructure.
- DTCC Advances Tokenization Service: May 2026 update on the July pilot and October launch timeline.
- Larry Fink’s 2026 Annual Chairman’s Letter: Discusses tokenization as updating the plumbing of the financial system.
- Franklin Templeton FOBXX: First U.S.-registered mutual fund to use a public blockchain as official system of record.
- Kinexys Fund Flow: J.P. Morgan’s near-instant tokenized fund settlement product.
- RWA.xyz: On-chain market data for tokenized Treasuries, stablecoins, and other real-world assets.