Data service providers: Why big doesn’t always mean best

Article highlights

  • Rapid growth in exchange-traded futures and options volumes, especially in equity options on Indian exchanges, has resulted in a significant increase in transaction messages and reference data points to manage.
  • Accurate futures and options reference data is crucial for price discovery, making efficient data management essential. Any issues with the data, from source to delivery, can cause significant difficulties.
  • The lack of standardised codes for describing specific traded instruments and actions poses a challenge in efficient reference data management. Different market intermediaries and data aggregation platforms may use different naming conventions, making translations complex and error-prone.

Big isn’t necessarily best when it comes to data services delivery, and it’s important for financial firms to weigh up the relative pros and cons when considering what makes the most sense to support their data management needs.

In FOW’s last blog, “The 70/30 Rule for Effective Transaction Data Management”, we outlined the size of the reference data element in a single futures contract relative to market data (it’s around a 70:30 split) and also provided a rough breakdown of the reference data fields and associated data points that must be populated – correctly – to ensure accurate data reporting to internal, client and regulatory end destinations. This rough calculation was based on at least 160 data points in every transaction message, multiplied by the number of futures and options contracts traded.
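As a back-of-the-envelope sketch of that scale, multiplying the blog’s rough 160-data-point estimate by 2023’s global F&O volume (cited below) gives the order of magnitude involved. The figures are illustrative only:

```python
# Rough scale illustration only; both inputs are approximations from the article.
DATA_POINTS_PER_MESSAGE = 160            # rough per-transaction-message estimate
CONTRACTS_TRADED_2023 = 137_000_000_000  # c. 137 billion contracts in 2023

total_points = DATA_POINTS_PER_MESSAGE * CONTRACTS_TRADED_2023
print(f"Reference data points to manage: {total_points:,}")
# → Reference data points to manage: 21,920,000,000,000
```

Even allowing for wide error bars on both inputs, the result lands in the tens of trillions of individual data points per year.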

Challenges in reference data management

In 2023, exchange-traded futures and options volumes increased by more than 60% (to c. 137 billion contracts). A significant contributor to this increase was trading in equity options on Indian exchanges; however, F&O trading activity increased across the board for all asset classes, including equities, interest rates and commodities.

That’s an awful lot of transaction messages and associated reference data points to manage, and as noted in previous blogs, for products based on forward pricing, efficient data management is not solely a post-trade challenge. Since accurate futures and options reference data is ‘mission critical’ in price discovery, it follows that any problem with this data – at source, or at any point from capture to delivery to end-user applications – will be the cause of major headaches.

As we have also touched on previously, another key challenge in efficient reference data management is the lack of industry-standard codes to describe specific traded instruments and actions – different market intermediaries and data aggregation platforms may call things by different names. Translating different ID formats from multiple sources into the formats prescribed by specific applications and end destinations is a very difficult activity to manage at the best of times. Magnify the challenge by the myriad and fast-growing number of instruments, sources and end destinations and the potential for error increases exponentially.
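The translation problem above can be made concrete with a minimal mapping table. This is a sketch only – the source names, instrument codes and internal ID format below are hypothetical, not any vendor’s actual conventions:

```python
# Hypothetical instrument-ID translation table: the same December S&P 500
# future, as it might be named by three different sources, mapped to one
# internal identifier. All codes here are invented for illustration.
VENDOR_TO_INTERNAL = {
    ("EXCHANGE", "ESZ24"):           "ES-202412",
    ("VENDOR_A", "ESZ4 Index"):      "ES-202412",
    ("VENDOR_B", "SP500_FUT_DEC24"): "ES-202412",
}

def translate(source: str, code: str) -> str:
    """Map a source-specific instrument code to the internal ID,
    failing loudly when no mapping exists -- the error-prone step
    the text describes."""
    try:
        return VENDOR_TO_INTERNAL[(source, code)]
    except KeyError:
        raise KeyError(f"No mapping for {code!r} from {source!r}") from None

print(translate("EXCHANGE", "ESZ24"))  # → ES-202412
```

In practice this table must be maintained for every instrument, every source and every downstream format simultaneously, which is why the error surface grows so quickly as instruments and destinations multiply.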

Optimising regulatory reporting

Getting the right transaction data in the right place, in the right way, at the right time is an imperative for all financial markets participants – for internal and client reporting purposes, and to meet increasing, and increasingly stringent, regulatory reporting obligations. With this year’s EMIR Refit, and other regulatory ‘upgrades’ tabled in other reporting jurisdictions, comes a greater expectation of more rigorous regulatory responses to ‘bad’ reporting – and the associated risk of major remedial work, fines and worse.

There are many large-scale, generalist data service providers that source and deliver data to market participants, and offer what may seem like the obvious advantages – and scale economies – of their multi-asset and multi-venue data coverage. Along with the benefits of this multi-asset, multi-user, many-to-many service strategy there are, however, corresponding downsides with respect to service specialism and delivery.

At the other end of the spectrum, niche players and industry specialists are focused expressly on a specific data segment – in our case, futures and options. By definition we have a much broader and deeper knowledge of the specific assets, instruments and associated transaction data within our purview.

Why small can be better

In our 25+ years sourcing, transforming and delivering futures and options reference data to financial markets participants we have continuously refined and enhanced our service offering to respond quickly to changing conditions, while maintaining the flexibility to give our customers the data they need, in the way they want it.

In addition to our unmatched expertise in translating and transforming F&O reference data from different sources and formats into those required to meet particular reporting requirements, we have a ‘hands on’ service model that ensures a very fast and effective response to all customer requests and issues – whether related to the data itself, or its efficient movement through customer applications.

With respect to queries around specific data – for example, when a customer is having trouble locating a specific code – customer support is provided directly by members of our F&O-specialist data team. Issues associated with data workflow – from us to the customer or on the customer side – are handled through our team of technical support experts. We also proactively reach out to customers to advise them of data workflow issues at their end that they may not themselves be aware of, and work with them to resolve them.

FOW are available to provide a rapid, human response to data and technical queries around the clock – and for IT issues, 365 days a year – to quickly resolve delivery and messaging problems for every customer, regardless of their size or geographic location. Our response and resolution times are much faster than customers will typically experience with large-scale data originators; hardly surprising, considering the sheer breadth of their data coverage and scale of their data management operations.
