
CME Group Signals Plan for a Proprietary Token as Tokenization Agenda Expands
Recommended for you

CME Group opens 24/7 trading for cryptocurrency derivatives
CME will enable continuous trading for selected bitcoin and ether futures and options starting May 29 to align with 24/7 spot markets and reduce hedging gaps; the move comes as the exchange reports record crypto derivatives engagement and signals broader strategic experiments with tokenization and a Google Cloud partnership.

Institutions Drive Tokenized Asset Wave as Retail Readies to Follow
Senior executives at a Hong Kong conference said tokenized representations of traditional assets are moving from pilots toward production use among large financial firms, anchored by cash‑like instruments, treasuries and stablecoin settlement. Panelists warned that technical limits (throughput, latency, finality and transaction ordering) and emerging concentration among middleware and custody providers must be addressed, through atomic delivery‑versus‑payment, programmable compliance and interoperable custody, before meaningful retail uptake follows.
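
The panel invoked atomic delivery‑versus‑payment without elaborating. As a minimal sketch of the idea, using toy in‑memory ledgers with hypothetical names rather than any panelist's implementation, settlement either moves the asset leg and the cash leg together or rolls both back:

```python
from dataclasses import dataclass

@dataclass
class Ledger:
    """Toy single-asset ledger mapping account -> balance."""
    balances: dict

    def transfer(self, src: str, dst: str, amount: int) -> None:
        if self.balances.get(src, 0) < amount:
            raise ValueError(f"insufficient balance for {src}")
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount

def atomic_dvp(asset: Ledger, cash: Ledger, seller: str, buyer: str,
               asset_qty: int, cash_amt: int) -> None:
    """Settle delivery against payment atomically: either both legs
    complete or both ledgers are restored to their prior state."""
    asset_before = dict(asset.balances)
    cash_before = dict(cash.balances)
    try:
        asset.transfer(seller, buyer, asset_qty)  # delivery leg
        cash.transfer(buyer, seller, cash_amt)    # payment leg
    except Exception:
        asset.balances = asset_before             # roll back both legs
        cash.balances = cash_before
        raise

# Example: 10 bond tokens against 500 cash units, all-or-nothing.
bonds = Ledger({"seller": 100})
cash = Ledger({"buyer": 1_000})
atomic_dvp(bonds, cash, "seller", "buyer", 10, 500)
```

On a real settlement network the rollback is enforced by the ledger's transaction semantics rather than a manual snapshot, but the all‑or‑nothing property is the same.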

Galaxy Digital leads $7M seed round for Tenbin to tokenize gold and emerging‑market FX on CME-linked rails
Tenbin raised $7 million in a seed round led by Galaxy Ventures to launch tokenized gold and FX products that use CME futures hedging to keep on‑chain prices aligned with off‑chain markets. The startup aims to deliver faster settlement, improved liquidity and yield capture for token holders while integrating with DeFi protocols and prime brokers.
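
The summary doesn't detail Tenbin's hedging mechanics. As a rough, hypothetical back‑of‑envelope (not the startup's actual method), keeping a tokenized gold product aligned with futures prices amounts to shorting enough contract notional to offset the gold the tokens represent:

```python
def gold_contracts_to_short(tokens_outstanding: float,
                            ounces_per_token: float,
                            contract_size_oz: float = 100.0) -> float:
    """Size a short futures hedge so issuer exposure to spot gold
    nets to roughly zero. The 100 oz default matches a standard
    COMEX gold future; treat it as an assumption here."""
    exposure_oz = tokens_outstanding * ounces_per_token
    return exposure_oz / contract_size_oz

# Example: 50,000 tokens, each representing 0.01 oz of gold,
# is 500 oz of exposure -> 5 contracts short.
print(gold_contracts_to_short(50_000, 0.01))
```

As tokens are minted or redeemed the hedge is re‑sized so the issuer stays flat, and arbitrage can then hold on‑chain prices near the futures‑implied level.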

Zeta Network Signals Strategic Move Toward Tokenized Real-World Assets to Bolster Institutional Treasury
Zeta Network Group said it is evaluating the tokenization of real-world assets to complement its bitcoin-centric treasury and mining operations. The company frames this as a way to add yield stability and duration management while staying aligned with public-company governance and regulatory requirements.

Tokenization’s Second Act: Making Real‑World Assets Composable
The first wave of tokenization largely digitized existing processes; the next phase must rebuild issuance, settlement and compliance as native, programmable layers so asset tokens can act as interoperable building blocks in digital‑money rails. That transition depends on solving throughput, latency/finality and transaction‑ordering limits, while regulatory choices and middleware concentration will shape whether markets centralize on platform‑led rails or remain open and composable.
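
To make "compliance as a native, programmable layer" concrete, here is a minimal hypothetical sketch (not any named platform's API) of a token whose transfer path runs pluggable policy rules, letting other programs compose on it without re‑implementing policy:

```python
from typing import Callable

# A rule inspects (sender, receiver, amount) and approves or rejects.
ComplianceRule = Callable[[str, str, int], bool]

class ComposableToken:
    """Token with compliance embedded in the transfer path itself,
    so any protocol built on top inherits the same policy checks."""

    def __init__(self, rules: list[ComplianceRule]):
        self.rules = rules
        self.balances: dict[str, int] = {}

    def mint(self, account: str, amount: int) -> None:
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        if not all(rule(sender, receiver, amount) for rule in self.rules):
            raise PermissionError("transfer blocked by compliance rule")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

# Example rule: only pre-approved (KYC'd) wallets may receive tokens.
APPROVED = {"alice", "bob"}
token = ComposableToken(rules=[lambda s, r, amt: r in APPROVED])
token.mint("alice", 100)
token.transfer("alice", "bob", 40)   # allowed
```

Because the rule set travels with the token, a lending protocol or exchange built on top inherits the issuer's policy instead of re‑checking it off‑chain.
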
Deutsche Börse doubles down on tokenization, integrates tokenized equities via 360T
Deutsche Börse’s 360T platform onboarded a Kraken‑backed tokenized equity product on Feb. 9, 2026, a concrete step toward folding ledgered shares into regulated trading rails. Broader market and regulatory signals, including on‑chain tokenized equities nearing $1bn, sharp year‑over‑year growth and evolving EU/US guidance, are accelerating hybrid, custody‑integrated approaches even as technical and custody questions persist.

Franklin Templeton and Binance launch off-exchange tokenized fund collateral for institutional trading
Franklin Templeton and Binance unveiled a program that lets institutional traders pledge tokenized money-market fund units as collateral while custodians keep the assets outside the exchange. The arrangement aims to lower counterparty exposure and improve capital efficiency by letting pledged holdings keep earning yield while mirrored within Binance’s trading environment.
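
Neither firm has published the plumbing, but the mirrored‑collateral pattern can be sketched roughly as follows (all names and parameters hypothetical): the custodian locks fund units and attests the amount, and the exchange extends trading credit against that attestation instead of taking custody itself:

```python
from dataclasses import dataclass

@dataclass
class CustodianAccount:
    """Fund units held at a third-party custodian; the exchange never
    takes possession, which is the point of the off-exchange design."""
    units: float
    locked: float = 0.0

    def pledge(self, amount: float) -> float:
        """Lock free units as collateral and return the attested total."""
        if amount > self.units - self.locked:
            raise ValueError("not enough free units to pledge")
        self.locked += amount
        return self.locked

@dataclass
class ExchangeMirror:
    """Exchange-side view: trading credit is sized off the custodian's
    attestation, with a haircut for price and liquidity risk."""
    nav_per_unit: float
    haircut: float = 0.02   # illustrative, not a published parameter

    def credit_line(self, attested_units: float) -> float:
        return attested_units * self.nav_per_unit * (1 - self.haircut)

# Example: pledging 1,000 fund units at $1.00 NAV with a 2% haircut
# yields $980 of trading credit, while the units keep accruing fund
# yield at the custodian because they never move.
acct = CustodianAccount(units=1_500.0)
mirror = ExchangeMirror(nav_per_unit=1.00)
print(mirror.credit_line(acct.pledge(1_000.0)))
```

In practice the attestation would be a signed message or tri‑party agreement rather than a return value, but separating custody from credit is the core of the design.
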
OpenAI’s compute financing gap makes a crypto token plausible
Large, multi‑year GPU and cloud commitments are creating a capital‑timing mismatch for OpenAI that conventional equity and debt struggle to resolve. A market‑traded token, whether issued by OpenAI or by distributed compute protocols, could convert future compute or revenue into liquid claims, but deployment requires robust metering, verifiable auditing and regulatory clarity to avoid destabilizing core AI infrastructure.
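
As a loose illustration of why metering and auditing are prerequisites (entirely hypothetical; the article proposes no specific design), a compute‑backed token only works if redemptions can never outrun the GPU‑hours that audited meters have actually recorded:

```python
class ComputeClaimToken:
    """Toy compute-backed claim: tokens redeem for GPU-hours, but only
    against capacity that audited meters have recorded, so outstanding
    claims cannot exceed delivered compute."""

    def __init__(self, hours_per_token: float):
        self.hours_per_token = hours_per_token
        self.metered_hours = 0.0   # audited, delivered GPU-hours
        self.redeemed_hours = 0.0

    def record_meter(self, hours: float) -> None:
        """Called by the (assumed trustworthy) metering/audit pipeline."""
        self.metered_hours += hours

    def redeem(self, tokens: float) -> float:
        """Convert tokens to GPU-hours if delivered capacity covers it."""
        hours = tokens * self.hours_per_token
        if self.redeemed_hours + hours > self.metered_hours:
            raise RuntimeError("claim exceeds audited delivered compute")
        self.redeemed_hours += hours
        return hours
```

Everything hard about the proposal lives in that `record_meter` call: without verifiable metering, the redemption check is only as good as the numbers fed into it.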