Downtown Manhattan, home of the DTCC. Photo: Shutterstock
The DTCC just published a white paper on the potential of distributed financial technology, titled “Embracing Disruption – Tapping the Potential of Distributed Ledgers to Improve the Post-Trade Landscape.”
The DTCC—or Depository Trust & Clearing Corporation—is the predominant financial institution that manages the centralized clearing and settling of securities.
Talk about timing.
Last week, the two firms best known for going after the securities settlement use case took turns sharing the limelight.
First, R3 announced that they were experimenting with an iteration of Ethereum through Microsoft’s Blockchain as a Service platform with nearly a dozen banks. Later, Digital Asset Holdings announced a plump Series A, raising $52 million from predominantly strategic investors, valuing the company at $100 million.
Big wins for two firms confidently surfing the current wave of blockchain-fueled hype.
It’s no wonder that R3 and DAH are jumping headfirst into the space in a huge way. As DTCC President and CEO Michael Bodson noted in a press release accompanying the latest white paper, “The industry has a once-in-a-generation opportunity to reimagine and modernize its infrastructure to resolve long-standing operational challenges.”
But he also added a caveat—that the potential is there, but only if we approach these problems in the right way. As Bodson explained, “To realize the potential of distributed ledger technology in a responsible manner and to avoid a disconnected maze of siloed solutions, the industry must work together in a coordinated fashion.”
If that last bit sounds like a bit of a party crasher, it’s because maybe it is.
Now… “you could argue the DTCC has a vested interest in protecting the ‘old way’ of doing things,” FT Alphaville’s Izabella Kaminska prudently admits. But “it’s also worth thinking of the DTCC as the ‘master sorcerer’ in this equation. They’ve been doing this stuff for much longer than any of the start-up fintech crowd.”
Which is a point well taken here at Ripple Insights. After all, we’ve been citing the DTCC for quite some time in trying to better understand what use cases truly make sense for blockchain technology.
A favorite is their 2012 paper, which analyzes the benefits and costs of reducing the securities settlement cycle.
One of the primary conclusions of that report is that database functionality was not the bottleneck in reducing settlement time and cost. More important were industry support, risk management, regulation, and compatibility with and consideration of legacy systems.
Nearly four years later, the DTCC’s latest paper builds on those themes more explicitly.
New platforms will only be relevant if they can integrate with existing systems, which already work quite well. That’s a lot of work:
In assessing the applicability of distributed ledgers to post-trade processing, it is important to understand that the distributed ledger platforms in use today are simply a ledger of transactions that is essentially replicated to all of the cooperating servers. The technology does not have built-in integration with existing systems and supporting infrastructure. It does not simply integrate with user identity management systems or have any master data about legal entities or securities. It does not include supporting workflows, exception processing or any of the extensive preprocessing logic that often accompanies complex matching, allocation and other processes that precede the point at which a transaction is considered complete.
Decentralized systems are inherently less efficient—implying that we better have a good reason to use them in the first place:
Decentralized processing is, by definition, a shared computing function among members of a community (trusted or not), which requires synchronization and coordination. Some implementations of distributed ledger, such as Bitcoin, use a consensus mechanism to manage coordination, while others use variations such as a lead node mechanism. Regardless, all such designs include steps that add latency to transaction processing. A decentralized design requires significant computing and storage resources because all nodes perform the computations and store the ledger data, which can also result in significantly increased network bandwidth requirements depending on the number of network nodes and the size of each transaction.
And then what about regulations?
Global regulatory requirements for data privacy that are different based on geography raise additional challenges for decentralized systems that distribute every transaction to every node. In certain regulatory jurisdictions, the laws protecting an individual’s data privacy restrict the ability to store certain data outside of the regulated region. Several vendors have recently proposed alternative “partitioned” ledgers to address these challenges, but given that all of the current work on distributed ledger technology has been done without regulatory oversight or endorsement, it is still unclear as to the level of regional data containment that will be required.
And so the DTCC concludes:
…a mature, supported, integrated distributed ledger technology has the potential to help improve a number of existing financial market infrastructure limitations. However, it may not be the solution to every problem because there may be alternative opportunities to lower the costs and risks of current infrastructure by standardizing industry workflows and expanding the use of cloud technologies.
Which, incidentally, is the same conclusion that the Vermont state government came to last week, as we reported.
It’s also one of the key themes that we’ve been harping on all year—that 2016 will be the year that you realize you don’t need the blockchain for your particular use case.
That isn’t to say that the DTCC—or Ripple Insights for that matter—doesn’t believe that we’re experiencing a once-in-a-lifetime shift in how we approach these kinds of financial problems. Which is also why we’re incredibly excited to see what the likes of R3 and DAH have in store for the industry over the next few years:
This is the opportunity to create an industry-wide initiative to develop the right architecture, prioritize the infrastructure building blocks and support focused and collaborative experiments to help the technology mature.
For a second there, it almost sounds like they’re talking about the Internet of Value.