Why You Need To “Blow Up” Your Legacy BI Platform and How to Start Again With a Sustainable Layered Approach

Data is the raw material for the information that drives meaningful decisions. Yet most organizations have a data stack that is sprawling, brittle, and unwieldy, and that, when it comes to informing key decisions, is often not useful.

Organizations dealing with such legacy systems may know what data they have, and they may use it to some extent, but by and large their information is siloed across systems and not utilized to its full potential: claims management solutions were built in isolation from client information systems; commercial loans and personal loans were treated as two very different areas; deposit systems and wealth management platforms were often completely separate conversations; and the data models were designed in different eras, with different intentions.

Pulling all of this data together is challenging, expensive, and time-consuming. However, not pulling it together is just as costly, as organizations risk missing opportunities to build on existing relationships and offer more services to their clients. This is an area where new entrants are leveraging newer technologies, using analytics to predict what services their clients may need next and offering solutions at just the right time.

So what can institutions that are struggling to keep up do?

Big Data is important, but it is not an answer in itself; it is one layer of the solution. It is of particular importance to financial service institutions that are dealing with outdated, complex systems and vast amounts of historical and real-time data, while at the same time trying to keep pace with the new velocity of change required in the industry.

What does Big Data mean for the financial services industry, and why is it relevant? For starters: financial service institutions have colossal amounts of data, but this data is scattered across legacy systems and generally not interpretable. In many of the core banking transformation projects I have been involved in, one of the most challenging aspects of the program is the archaeology on the data and finding interpreters for the different fields. This is the landscape from which organizations are trying to derive meaningful insights!

A bespoke approach has been taken to address different areas of banking over the years: a BI platform here, a data mart there, a product for this, a framework for that. The complexity of the estate has grown exponentially with each additional actor, and most organizations are creaking under the strain of keeping all of these systems alive, let alone current and nimble. The cracks are starting to show. That is a very serious threat when you consider that new entrants carry no such legacy albatross: they can take a holistic approach to data, store it all in one place, and pull out what is needed, when it is needed.


The solution is to completely rethink data strategies and remove the silos; instead, build horizontal layers that each hold all of the data and handle a category of needs. One approach is to stream event-based activities through an event hub that reacts to events as they occur, a pattern known as complex event processing (CEP).

Consider the following event:

A large-value transaction arrives at a bank and is placed into a big data layer, at which point the event hub triggers a notification to all parties that have registered interest in this event:

  • Fraud – receives the notification because they want to know where the funds are coming from and what parties are involved. Has know your customer (KYC) validation been completed? Are the parties on black or grey lists? Does a case need to be opened for further investigation?
  • Risk – receives the notification because they want to assess the interest rate and currency exposure and the potential next steps for these funds. Are the funds held in a currency where we carry risk? What is the interest rate on this deposit, and how does it relate to the rest of our portfolio?
  • Sales and marketing for wealth and retail – receives the notification because they want to ensure that this is the ideal location for these funds and, if not, suggest a more suitable product. They also want to know who the receiving party is and whether a relationship management approach is in place for them. If not, this is the perfect opportunity to reach out to the client.

At the base of this event is the same data: customer, counterparty, amount, source, currency, destination, and so on. Yet for most organizations, these simple data elements get pumped all through the building and down into a BI platform before they are reassembled and cascaded into reports for each of the stakeholders mentioned above. A simple publish-and-subscribe event bus would greatly simplify the process.
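To make the fan-out concrete, here is a minimal in-process sketch of that publish-and-subscribe pattern in Python. It is illustrative only: the topic name, event fields, and the one-line departmental handlers are assumptions standing in for real fraud, risk, and marketing systems, and a production bus would be a distributed messaging platform rather than an in-memory dictionary.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, DefaultDict, List

@dataclass(frozen=True)
class TransactionEvent:
    """The shared facts every subscriber cares about."""
    customer: str
    counterparty: str
    amount: float
    currency: str
    source: str
    destination: str

class EventBus:
    """Tiny publish-and-subscribe hub: the publisher emits an event once,
    and every handler registered on that topic is notified independently."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[TransactionEvent], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[TransactionEvent], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: TransactionEvent) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
alerts: List[str] = []

# Each department registers its own reaction to the same underlying event
# (hypothetical handlers; real ones would open cases, update exposure, etc.).
bus.subscribe("large-transaction", lambda e: alerts.append(f"fraud: run KYC checks on {e.counterparty}"))
bus.subscribe("large-transaction", lambda e: alerts.append(f"risk: {e.currency} exposure +{e.amount:,.0f}"))
bus.subscribe("large-transaction", lambda e: alerts.append(f"marketing: review product fit for {e.customer}"))

# One publish fans out to all three stakeholders at once -- no BI round trip.
bus.publish("large-transaction", TransactionEvent(
    customer="C-1001", counterparty="ACME Ltd", amount=2_500_000,
    currency="EUR", source="wire transfer", destination="deposit account"))
```

The point of the sketch is the shape, not the plumbing: the transaction is described once, and each interested party subscribes to the event rather than waiting for the data to be reassembled downstream.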

Data has become an essential asset in the digital age (not just in banking), and the quest for it has spun up an entire industry that feeds on information. Banks have more than enough data sources, yet they struggle to turn them into coherent, efficient information. By adopting a data strategy similar to the one described above, financial institutions can make better use of their data, speed up the process of getting the right data to the right people, and stay relevant in today's changing banking landscape.
