Author: FINBOURNE – finbourne.com
In part two of this commentary, and following our recent attendance at TSAM London, Stephen Collie, Head of Sales Engineering at FINBOURNE Technology, explores the makeup of the modern data stack and why incremental innovation is critical in the transition to an interoperable future.
Innovation doesn’t have to be big bang
As one of the attending CTOs put it, we've talked about agility forever in the world of tech and ops, but being able to pivot and "change the direction of travel" now needs to be firmly on the business agenda too.
All of this makes it a very exciting time for the world of technology. You only have to look around, at events like TSAM, to see a crop of new fintechs appearing. But it's not just about automation; it's about enablement of your greatest asset: your people. In the morning panel discussion, it was made clear that transformation should be about delivering "flexibility, motivation and wellbeing" for your employees. We couldn't agree more, and are steadfast in our belief that SaaS technology can be an enabler of talent, not a replacement for it.
In my own panel I suggested that firms seldom hire super-skilled data scientists with the intention that they spend their days cleansing and scrubbing data. Yet for many this is a sad but true story. Getting these highly trained and talented individuals back to doing what they love, and what adds the most value to the business, is the kind of innovation everyone should be pursuing.
The confusion is often where to start. What does innovation look like? The simplest way to see it is that you don't innovate by focusing on innovation; it is an incremental pursuit based on your business goals. Innovation is the outcome of all the small changes you make as an organisation, for example addressing multiple IBORs or eliminating data silos, which combine to deliver agility, efficiency and growth.
Where do you start?
Having experienced first-hand the pain that technical debt and inertia can cause, my colleagues and I set out on a path to address the data dilemma across the investment chain, while lowering the risk of operational change. Challenging traditional constructs, we designed a distinct new approach, in the form of a SaaS-powered, cloud-native, interoperable data store.
We met the low tolerance for risk by systematically bridging the gap between existing architecture and the future-state data stack, supporting the very real need to translate between data formats and derive value and meaning from data. By providing an interoperable foundation, asset managers can easily plug their components together and leverage emerging technologies, such as AI and ML, to create meaningful analytics.
Most importantly, the approach we have taken offers a strong and viable alternative to the multi-year, big-bang transformation projects that have occupied the industry for much of the past few decades. It was validating to see this sentiment echoed by a Head of Affairs at a European asset manager, who elegantly stated that the industry had, for too long, focused only on "gathering data".
Pay attention to data-driven, user-led technologies
Successful transformation projects must leverage cloud technologies to analyse and understand data, allowing it to be translated between formats and joined together seamlessly. Utilising deep technology and industry expertise, we have done just this: building a solid data foundation that understands data and enables its interpretation and translation, all while putting in place the right control framework to make it securely accessible across the organisation.
Unlike incumbent providers, who come at the problem from a workflow- or functionality-first perspective (and find they have to address data issues later, within standalone data management offerings), we felt it was high time to flip the model and concentrate on making the data accurate, timely and trusted. Building the data fabric from the bottom up allows accurate data to feed seamlessly, in real time, into the investment capabilities built on top, e.g. portfolio management, order management and accounting.
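To make that "data first, capabilities on top" idea concrete, here is a minimal sketch in Python. It assumes a single holdings store that two different capabilities read from, rather than each keeping its own copy; all class, field and portfolio names are hypothetical illustrations, not FINBOURNE's actual API.

```python
# Minimal sketch of a bottom-up data fabric: one trusted store,
# with investment capabilities built on top of it rather than beside it.
# All names here are hypothetical illustrations.
from dataclasses import dataclass


@dataclass(frozen=True)
class Holding:
    portfolio: str
    instrument: str  # e.g. an ISIN
    units: float
    price: float     # last known price in portfolio currency


class HoldingsStore:
    """Single source of holdings data that every capability reads."""

    def __init__(self, holdings: list[Holding]):
        self._holdings = holdings

    def for_portfolio(self, portfolio: str) -> list[Holding]:
        return [h for h in self._holdings if h.portfolio == portfolio]


# Two "capabilities" consuming the same store instead of separate copies:
def market_value(store: HoldingsStore, portfolio: str) -> float:
    """Portfolio-management view: current market value."""
    return sum(h.units * h.price for h in store.for_portfolio(portfolio))


def position_report(store: HoldingsStore, portfolio: str) -> list[str]:
    """Accounting view: a simple position listing from the same data."""
    return [f"{h.instrument}: {h.units} units" for h in store.for_portfolio(portfolio)]


store = HoldingsStore([
    Holding("GROWTH-FUND", "GB00B03MLX29", 1_000, 24.50),
    Holding("GROWTH-FUND", "US0378331005", 500, 170.00),
])
print(market_value(store, "GROWTH-FUND"))     # 109500.0
print(position_report(store, "GROWTH-FUND"))
```

Because both views read from the same store, a correction lands once and flows everywhere; there is no reconciliation step between the portfolio-management and accounting copies of the data.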
Similarly, while rigid front-to-back systems mandate a data model on their clients, SaaS technology offers composability, meaning you no longer have to change your business model to fit the technology, or translate your existing models into something the system understands.
Taking an API-first approach tackles the lack of timeliness and trust, helping to reproduce data with complete accuracy and confidence, every time. But it is important to understand that APIs are not a complete strategy in themselves; connectivity is only a starting point. I see a lot of firms getting hung up on having an API strategy, which is a good start, but without the right control environment, entitlements protocol and domain knowledge to boot, unlocking the potential of your data across your teams and functions will be difficult.
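As a minimal sketch of that point, the snippet below gates a data request behind an entitlements check before any connectivity happens. The roles, entitlements and endpoint are invented for illustration; a real control environment would sit in front of every API call in a similar way.

```python
# Sketch: an API is only as useful as the entitlements layer in front of it.
# Role names, entitlements and the fake endpoint are invented for illustration.
ENTITLEMENTS = {
    "analyst": {"read:positions"},
    "ops":     {"read:positions", "write:transactions"},
}


def call_api(user_role: str, action: str, resource: str) -> dict:
    """Refuse the call unless the caller's role carries the entitlement."""
    required = f"{action}:{resource}"
    if required not in ENTITLEMENTS.get(user_role, set()):
        raise PermissionError(f"{user_role!r} lacks entitlement {required!r}")
    # ...a real client would only now issue the HTTP request, e.g. GET /positions
    return {"resource": resource, "status": "ok"}


print(call_api("ops", "write", "transactions"))    # allowed
try:
    call_api("analyst", "write", "transactions")   # not entitled
except PermissionError as err:
    print(err)
```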
Equally, I see firms retrofitting APIs to existing implementations, something which is unlikely to deliver the value expected. While this may successfully connect systems, it doesn't necessarily account for, or join together, the many different languages those systems speak. Knowing what the data means, and who is entitled to see it, is arguably the more challenging and value-adding task.
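The sketch below illustrates what that translation step can look like, assuming a hypothetical order management system and accounting system that disagree on field names, identifier schemes and price units; none of the field names are drawn from any real product.

```python
# Two systems can be "connected" by an API and still speak different languages.
# This hypothetical mapper shows the translation step a retro-fitted API skips:
# field names, identifier schemes and price units all differ between systems.
def translate_trade(oms_trade: dict) -> dict:
    """Map a trade from an invented OMS format to an invented accounting format."""
    return {
        "instrument_id": oms_trade["sedol"],    # different identifier field name
        "id_scheme": "SEDOL",                   # make the identifier scheme explicit
        "quantity": oms_trade["qty"],
        # the OMS quotes in GBX (pence); the accounting system expects GBP (pounds)
        "price_gbp": oms_trade["price_gbx"] / 100,
        "trade_date": oms_trade["trd_dt"],      # ISO dates assumed on both sides
    }


oms_trade = {"sedol": "B03MLX2", "qty": 1000, "price_gbx": 2450, "trd_dt": "2024-03-14"}
print(translate_trade(oms_trade))
```

Connecting the two systems is the easy part; agreeing on what "price" means is where the value, and the difficulty, lies.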
Back to the future
It is encouraging to hear from TSAM attendees that the industry recognises the steps needed to manage data chaos and reach the nirvana state required to survive today and thrive tomorrow. The resounding themes around multiple data models, the difficulty of achieving high-quality data, and how to gain value from complex data sets, resonate strongly with the mission we are on: to transition the Buy Side out of the past and make it fit for the future. These same challenges were also mirrored in a recent roundtable discussion we held, attended by over a dozen COOs across boutique and global asset managers, and UK pension funds (you can read more on that here).
If we piece these conversations together, the following priorities will be at the top of the business agenda for the foreseeable future:
- Meeting investor- and regulator-led transparency demands, with greater granularity across public and private assets
- Demonstrating more value to investors through a digitalised client experience
- Achieving agility as a business (and not just operationally), to respond to financial and geopolitical shocks
- Regaining control of operational efficiencies, moving from operating at cost to operating at profit again
- Liberating employees, leveraging available talent and skills to support success
Our view is that an interoperable SaaS investment data platform has the flexibility to enable firms to achieve these priorities most efficiently and in the shortest time. It provides a trusted data fabric that becomes part of a toolkit or hub for investment operations. This modern data stack not only delivers what firms need today, but also offers optionality, leaving them free to choose which services to build, buy, outsource and integrate with, without having to maintain the underpinning infrastructure.
To start the transition and move towards an interoperable future, firms must first move on from a single-source or front-to-back stack, to enable the transparency and flexibility required. Single-vendor dependency aside, one platform can never achieve the level of innovation collectively available in the market, or complete the final piece of the puzzle: translating and gaining true value from investment data.
Doing this not only structurally changes the cost of investing, but also makes it easier to meet the needs and demands of the end investor. In the end, while this meets the cost imperative, it also carries a critical human impact for firms: delivering the means to empower employees and win the trust of clients. You can put a price on cost savings, but trust... well, that is priceless.