Author: Ediphy – ediphy.io
Following the publication of the EU’s MiFIR Review Proposal in November last year, much has been written about the consolidated tape (CT), as the draft proposals finally open up the prospect of a CT coming into being. Most of this commentary and debate has focused on two topics – (1) technology solutions and standardised data formats; and (2) poor data quality. Having worked with non-equity MiFID transparency data for over four years now, including as part of our own CT initiative (see Press Release), we share some thoughts on what is missing from the current discourse.
Many potential providers focus exclusively on their technical solution and how it could be adapted to deliver a CT. It is true that multiple technology providers can ingest standardised FIX messages and present them as an aggregated feed, yet this framing of the problem misses a few key issues. First, the way data is currently published by the operators of the circa 300 trading venues and APAs is not standardised. Either a protocol needs to be agreed and adopted, which will take a long time, or the consolidated tape provider will need to handle a variety of data transmission methods and formats. Whilst we are advocates of standard industry protocols, we believe this is an optimisation that can happen over time to reduce operational costs and complexity, but it should not be a gating issue for the instantiation of the CT. In the example below, we capture data from 10 different sources using a variety of methods, including CSV slice files, websockets and web GUIs.
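To illustrate what consolidating heterogeneous publications involves, the sketch below normalises two assumed source formats into one common record. The field names (`isin`, `price`, `qty`, `px`, `size`) and the `TradeRecord` shape are hypothetical, for illustration only – they are not the actual schemas used by any venue, APA or by our own feed handlers.

```python
import csv
import io
import json
from dataclasses import dataclass


@dataclass
class TradeRecord:
    """Hypothetical common record; not the actual CT schema."""
    isin: str
    price: float
    quantity: float
    venue: str


def from_csv_slice(text: str, venue: str) -> list[TradeRecord]:
    """Parse a CSV slice file with assumed columns: isin, price, qty."""
    rows = csv.DictReader(io.StringIO(text))
    return [TradeRecord(r["isin"], float(r["price"]), float(r["qty"]), venue)
            for r in rows]


def from_websocket_message(msg: str, venue: str) -> TradeRecord:
    """Parse a JSON websocket payload with assumed keys: isin, px, size."""
    d = json.loads(msg)
    return TradeRecord(d["isin"], float(d["px"]), float(d["size"]), venue)
```

In practice each of the circa 300 publishers needs its own such adapter, which is why a later move to a standard protocol would reduce operational cost without being a precondition for the tape itself.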
The second aspect of the tape most people miss is that it will, by its nature, have to deal with transactions being reported out of temporal order. This happens both when transaction information is deferred under the transparency regime and when trades are, inevitably, reported late, cancelled or otherwise amended. Maintenance of a historical tape of record therefore becomes a key requirement for a consolidated tape provider and, given this history will potentially keep changing after the period in question, it needs to be bitemporal (a history of histories, so to speak) so that users can understand how the history has changed over time.
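A minimal sketch of the bitemporal idea, assuming a simple in-memory store (the class and method names here are illustrative, not any particular implementation): each record carries both the time the trade occurred and the time the tape learned of it, corrections are appended rather than overwritten, and any past view of the tape can be reproduced by filtering on the second axis.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass(frozen=True)
class Version:
    trade_time: datetime     # when the trade occurred ("valid time")
    recorded_time: datetime  # when the tape learned of it ("transaction time")
    price: Optional[float]   # None could mark a cancellation


class BitemporalTape:
    """Append-only store of trade versions, keyed by trade id."""

    def __init__(self) -> None:
        self._versions: dict[str, list[Version]] = {}

    def record(self, trade_id: str, version: Version) -> None:
        # Never overwrite: late reports and amendments become new versions.
        self._versions.setdefault(trade_id, []).append(version)

    def as_known_at(self, trade_id: str, knowledge_time: datetime) -> Optional[Version]:
        """Latest version of the trade that had been recorded by knowledge_time."""
        known = [v for v in self._versions.get(trade_id, [])
                 if v.recorded_time <= knowledge_time]
        return max(known, key=lambda v: v.recorded_time, default=None)
```

Querying `as_known_at` with different knowledge times is exactly what lets a user see how the history of a given trading day changed as deferred and amended reports arrived.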
Moving onto data quality, we wholeheartedly agree that improving the quality of the data being published is essential to ensuring a complete and correct tape. Indeed, we calculate a number of checks on the data to monitor the accuracy of the data over time and to highlight inconsistencies. As an example, we look at the delays between the transaction and publication timestamps on the transaction records and see a number of oddities. The graph below shows the average publication delay broken down per day.
One can immediately see a pattern of a negative publication delay (i.e. transactions apparently published before they have happened) every Tuesday, which is driven by an operator publishing aggregated trades with a transaction timestamp (which should be blank) and an incorrect publication timestamp. Our analysis shows many such problems with the data being published, which is why we believe it is critical to incorporate a high level of regulatory involvement in the CT framework. A consolidated tape provider should be in a position to provide the analytics and identify ongoing, systematic issues with the data, but it will be up to National Competent Authorities and ESMA to ensure these issues are addressed in an appropriate time frame.
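The delay metric itself is straightforward to compute; a sketch follows, assuming records arrive as (transaction timestamp, publication timestamp) pairs. This is an illustrative reimplementation of the check, not our production analytics.

```python
from collections import defaultdict
from datetime import date, datetime


def avg_publication_delay_by_day(records):
    """records: iterable of (transaction_ts, publication_ts) datetime pairs.

    Returns {publication date: mean delay in seconds}. A negative average
    flags records 'published before they traded' -- the kind of timestamp
    inconsistency a tape operator should surface to regulators.
    """
    totals: dict[date, float] = defaultdict(float)
    counts: dict[date, int] = defaultdict(int)
    for tx_ts, pub_ts in records:
        day = pub_ts.date()
        totals[day] += (pub_ts - tx_ts).total_seconds()
        counts[day] += 1
    return {day: totals[day] / counts[day] for day in totals}
```

Run daily, a series like this is enough to make the weekly pattern of negative delays jump out without any manual inspection.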
However, our main point when it comes to data quality issues is that they shouldn’t be made an impediment to commencing the publication of a CT. As the old adage says, “Don’t let perfect be the enemy of the good.” The proportion of data being published which is of poor quality is relatively small, and there is much value to be gained from it, even taking all the issues into account. With some filtering and outlier detection techniques we already produce things like liquidity scores and transaction cost analysis, as shown in the examples below.
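One simple, robust filter of the kind alluded to above is a median-absolute-deviation screen on prices. This is a generic illustration of outlier detection, not the specific filters behind our liquidity scores or transaction cost analysis; the threshold `k` and the 1.4826 normal-consistency constant are conventional choices.

```python
import statistics


def filter_price_outliers(prices, k=5.0):
    """Keep prices within k scaled MADs of the median.

    A robust alternative to mean/standard-deviation screens, since the
    median and MAD are barely moved by the bad prints being filtered out.
    """
    med = statistics.median(prices)
    mad = statistics.median(abs(p - med) for p in prices)
    if mad == 0:
        return list(prices)  # (almost) all identical: nothing to drop
    # 1.4826 scales the MAD to estimate a standard deviation for normal data.
    return [p for p in prices if abs(p - med) <= k * 1.4826 * mad]
```

Even a screen this simple removes the grossest reporting errors, which is why poor data quality, while real, need not block useful analytics from day one.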
These use cases should be available to market participants as quickly as possible, and we need to acknowledge that this data is never going to be “perfect” – so let’s not stall getting a solution out there by waiting to make it so. Furthermore, the increased set of eyeballs and analysis on the consolidated data will naturally surface issues more quickly and shorten the iteration cycle needed to get them resolved.
Finally, all the talk of technology, data standards and data quality misses the bigger point – what type of entity do we want the provider of the consolidated tape to be? The goal of the CT should not only be to consolidate the data but also to make it public in the truest sense of the word. The purpose of the tape cannot be fully realised if it can only be consumed by a minority due to excessive cost. The CT needs to be accessible by all to ensure it can become a key component of the Capital Markets Union, levelling the playing field and making the European markets truly transparent to all participants. This is why we are committed, under our initiative, to providing the CT on a cost recovery basis, as this aligns with our principles of fairness and transparency. We are now working with multiple large capital markets firms, representing data consumers from both the buy side and sell side, who share our vision of delivering a CT under a utility model and democratising access to this critical data set. Please get in touch and join us on this mission.