Diving Deeper

Making Big Data Smart Data (Part 3)

Part 3: Now What?

This third and final installment of “Making Big Data Smart Data” provides practical guidance to senior-level banking leaders – the head of a consumer credit card business, for instance – who understand the need to design and execute a data management strategy. We’ll look at how we help clients put their data to work . . . making sure “all the holes line up” as they assemble a solution that lets them leverage data and gain a competitive edge.

What have we covered up to this point? For starters, vision is wonderful, but planning is critical. Both are informed by understanding the competitive landscape. And then there’s the matter of cost: outcomes follow from how wisely you invest your time and money. As the first blog post mentioned, implementing a data management system can be a costly endeavor, but we aim to minimize cost and maximize impact right from the start. Finally, being Agile and building in sprints doesn’t guarantee success. When you construct something incrementally, piece by piece, without thoughtful planning, you just might wind up with a disjointed whole.

The first post in the series “Making Big Data Smart Data” describes data management in today’s financial services environment and encourages organizations to look “out there” across the data management landscape. The second post explains the need to look “in here,” to take careful note of what a financial services organization can do internally to prepare itself to become a data-driven enterprise.

Now what? After looking out there and in here, it’s time to look up ahead. It’s time to implement.

The Review: It Starts with Parts

Any review of data management maturity should include the key aspects of a data management framework. That’s where our review begins. We start by defining why we’re having the conversation – the specific challenges the financial services organization is facing. Then we ask the all-important question: “What are you going to do when we give you the solution?” Organizations have critical business objectives that require smart data management to achieve – and their investment in the space should be laser-focused on achieving them.

From there, we briefly get the lay of the land – about a two- or three-week exercise. We always advise clients to be on guard against anyone who would tell them what they should be doing before they know the terrain. (We’re equally skeptical of an “assessment” that takes months to complete.) We use this time to ground ourselves in where we need to start; we assess gaps and work with our client to determine remediations.

Our review uses a template for each element of the data management framework. We might start with data architecture, move to data development, and so forth, describing what our evaluation revealed, how we assess our findings, and where we think we can add value. We prioritize those findings, taking each facet of data management into consideration, and we recommend how to address the biggest gaps, developing a plan and strategy for doing so.

It’s important to note that implementing our plan adds value on day one, and we enhance that value as we make incremental improvements during the course of executing a data management strategy. We’re always moving the bank forward in terms of functionality. Once we present our plan and make recommendations, it’s not a matter of waiting months to see results. In resolving gaps in data security, for example, we’ll take our client from version 1 to version 1.1 in a few weeks . . . and then to version 1.2 a few weeks later. By identifying and remediating the biggest gaps in performance first, we can make substantial progress in a very short time, and clients can begin to reap the rewards immediately. Because of our staged plan, the organization is better off every single month, and we can roll with the punches as new priorities arise, knowing that we solved the biggest problems first.

Building the Engine

So now that we have a plan in place based on our review, we start with the foundation: the tech stack. All the systems and processes we turn on have to live somewhere. Is the next step in data maturity to migrate to the cloud? Is it simply to make better use of an existing cluster, a SAS grid, or a SQL warehouse? Creating a new storage layer could be as intensive as building the client a data lake in the cloud, or as easy as being clever about the way we help clients use their existing database to do what they need to do.

This stage of building the engine may involve addressing issues such as pipelining/ETL, data governance procedures, and data quality:

  • Pipeline/ETL – Moving data to where it needs to be and equipping those processes with the proper documentation and testing to make them reliable and transparent.
  • Data Governance – How is the quality and integrity of data maintained? How is it protected? Where do the approvals live? Who “owns” the data and how is it monitored? How are controls implemented?
  • Data Quality – Making data reliable by ensuring quality is well-defined and well-managed for things like completeness, consistency, and correctness. (A minimal sketch of such checks follows this list.)
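
To make those checks concrete, here is a minimal sketch of rule-based quality checks in Python with pandas, assuming a table of credit card accounts; the column names, thresholds, and status codes are illustrative assumptions, not a prescription.

```python
# A minimal sketch of completeness, consistency, and correctness checks.
# Column names, the 110% tolerance, and the status codes are illustrative.
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality findings."""
    findings = []

    # Completeness: critical fields should not be null.
    for col in ("account_id", "credit_limit"):
        null_rate = df[col].isna().mean()
        if null_rate > 0:
            findings.append(f"{col}: {null_rate:.1%} null values")

    # Consistency: a balance should not exceed the account's credit
    # limit by more than an agreed tolerance (10% here, purely illustrative).
    over_limit = int((df["balance"] > df["credit_limit"] * 1.10).sum())
    if over_limit:
        findings.append(f"{over_limit} rows with balance above 110% of limit")

    # Correctness: status codes must come from the approved value set.
    valid_statuses = {"OPEN", "CLOSED", "DELINQUENT"}
    bad = ~df["status"].isin(valid_statuses)
    if bad.any():
        findings.append(f"{int(bad.sum())} rows with unrecognized status codes")

    return findings
```

Run as part of the pipeline itself, checks like these surface a quality regression as a logged finding rather than a surprise downstream.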

As we noted during our initial review stage, we start by planning and making recommendations based on a cohesive strategy, so the solution is holistic.

Fuel Up and Fire Up

With the engine built, now it’s time to prioritize. What are the most critical functions that need to happen first? Here’s a look at several functions that take priority:

  • Define Critical Data Sets – We define critical data sets and data elements that would be part of governance activities. While an organization may have 10,000 unique data elements, analysts can’t feasibly manage all of them – nor should they try. We start with the most important ones, usually defined by their materiality and impact on downstream processes.
  • Inventory and Capture Metadata – While this may conjure images of an army of people creating spreadsheets, the process is fast, and the resulting inventory is extremely valuable. It also captures nuances in the data: What is this data supposed to be? What data type is it? What’s an acceptable value? Can it be null? (A minimal sketch of such an inventory follows this list.)
  • Document Data Lineage – Documenting where data comes from and keeping a chain of custody for where it goes is a necessary evil. We take care of this process quickly because we know how to automate it (also sketched after this list). Knowing how data has changed hands and been transformed before it gets used is critical to using it confidently and, probably more importantly for some, defending it later.
  • Execute Data Quality Controls – Once the framework for managing data quality is in place and the critical elements are defined, an organization can begin to implement various methods to ensure that data is well controlled and reliable.
  • Data Quality Management – Having good transparency, as well as effective monitoring and reporting, is important because it enables people who need specific elements of data to access what’s important to them when they need it. Otherwise, they’ll spend all their time waiting and wondering.
  • Data Issue Management – So the new system is turned on. What happens when something goes wrong? Is there a backlog for the data team, a central repository for defects? Have proactive alerts been created for key teams and applications? This looks different from one organization to the next, and we can help build a program that is ready to go when the system is fired up.
  • Data Sharing Agreements – A somewhat tedious but quick and immensely valuable exercise, creating these agreements is smart practice. They give producers and consumers of data clarity on what data is created, what constitutes satisfactory data, how it will be used, and when it needs to be ready each quarter/month/day.
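
Here is a minimal sketch of what such a metadata inventory might look like in code, capturing the questions above about type, nullability, acceptable values, and ownership; the element names, owners, and value sets are hypothetical.

```python
# A minimal sketch of a metadata inventory for critical data elements.
# The elements, owners, and allowed values shown are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataElement:
    """One entry in the metadata inventory for a critical data element."""
    name: str
    dtype: type                           # what kind of data is it?
    nullable: bool = False                # can this be null?
    allowed_values: Optional[set] = None  # what's an acceptable value?
    owner: str = "unassigned"             # who "owns" this data?

CRITICAL_ELEMENTS = [
    DataElement("account_id", str, owner="Card Operations"),
    DataElement("credit_limit", float, owner="Credit Risk"),
    DataElement("status", str, owner="Card Operations",
                allowed_values={"OPEN", "CLOSED", "DELINQUENT"}),
]

def validate(record: dict) -> list[str]:
    """Check one record against the inventory and return any violations."""
    issues = []
    for elem in CRITICAL_ELEMENTS:
        value = record.get(elem.name)
        if value is None:
            if not elem.nullable:
                issues.append(f"{elem.name} is null but not nullable")
            continue
        if not isinstance(value, elem.dtype):
            issues.append(f"{elem.name} is {type(value).__name__}, "
                          f"expected {elem.dtype.__name__}")
        if elem.allowed_values and value not in elem.allowed_values:
            issues.append(f"{elem.name} has unapproved value {value!r}")
    return issues
```

Because the same definitions can drive the data quality controls described above, documentation and enforcement never drift apart.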
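
And here is one way lineage capture can be automated, assuming transformations are plain Python functions; the source and target names are illustrative.

```python
# A minimal sketch of automated lineage capture: each traced call appends
# a record so the chain of custody can be reconstructed later.
import functools
from datetime import datetime, timezone

LINEAGE: list[dict] = []  # in practice this would live in a metadata store

def traced(source: str, target: str):
    """Decorator recording where data came from and where it went."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            LINEAGE.append({
                "step": fn.__name__,
                "source": source,
                "target": target,
                "run_at": datetime.now(timezone.utc).isoformat(),
            })
            return result
        return inner
    return wrap

@traced(source="core_banking.accounts", target="warehouse.card_balances")
def load_card_balances() -> None:
    ...  # extract, transform, and load steps go here
```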

Hit the Open Road

With each step – reviewing needs, building the engine, fuel up/fire up – we work with our clients to ensure their success. This commitment is built into our service model, and it is the essence of the partnership we establish with them. We’re there to help if they need us. Ultimately, though, our goal is to build a system that they can operate without having to rely on us when they hit the open road.

Here are a few of the ways we do that:

  • Operationalize Data Infrastructure Strategy – Now that the system is up and running and a client is heading down the road, we turn on all the ETL processes, work out any initial kinks, and go into BAU (business-as-usual) mode. (A minimal sketch of what that can look like follows this list.)
  • Develop Business Intelligence Toolset – Now that we’ve given an organization a better handle on its data, putting that data right at users’ fingertips, we make sure everyone knows how to access what they need when they need it. The business intelligence tools that are part of our solution, from data warehouses to enterprise reporting, offer critical insight into the data.
  • Prioritize and Develop New Capabilities – Prioritization is an incremental process. Once a system goes live, people are going to start requesting enhancements and modifications to their reports and ETL processes. Here’s where we start iterating for the future.
  • Document Roles and Responsibilities – If it’s everyone’s job, it doesn’t get done. Though it varies by organization, assigning some basic roles to identify who is responsible for providing and ensuring the quality of data, for instance, helps establish an interaction model to keep everything aligned and moving forward.
  • Transition Ownership – This piece is the thrust of what we have to offer. Our service isn’t designed to make clients dependent on us; rather, our goal is to get them up and running. We help them take advantage of all the efficient, productive things they can do now that we’ve helped them build and install their new data management system and trained them to run it.
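
As one illustration of BAU mode, here is a minimal sketch of a run wrapper that logs every scheduled load and routes failures to an alert hook feeding the data issue backlog; the job name and alert function are assumptions, stand-ins for whatever scheduler and ticketing system an organization already uses.

```python
# A minimal sketch of a BAU-mode run wrapper: every scheduled load is
# logged, and failures are routed to an alert hook rather than failing
# silently. The load function and alert hook are hypothetical.
import logging
from collections.abc import Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def alert_data_team(job: str, error: Exception) -> None:
    """Stand-in for a real alert hook (email, pager, or ticketing system)."""
    log.error("ALERT: %s failed: %s", job, error)

def run_job(name: str, load: Callable[[], int]) -> None:
    """Run one scheduled load, record the outcome, and alert on failure."""
    try:
        rows = load()
        log.info("%s loaded %d rows", name, rows)
    except Exception as exc:
        alert_data_team(name, exc)  # feeds the data issue backlog

if __name__ == "__main__":
    run_job("daily_card_balances", lambda: 42)  # placeholder load
```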

Keep Your Eye on the Prize

As you go through the process of building a data management solution with your data engineering partner, remember to keep your eye on the prize. Why are you doing this? What are you going to do when you have a solution in place? What outcomes are you trying to achieve? Intentionality is critical. You don’t want to get sidetracked pushing paper around with consultants on metadata documentation: armies of people with spreadsheets, churning out work while stuck in “Park.”

While you’re focused on developing an effective data management solution, look for opportunities to fix your foundation. If there are too many hops between the source and use of data, reduce them. If data quality controls are insufficient, bolster them. “Lift and shift” is too often code for “shelve it forever.”

There’s no question about it. Your bank’s data can help you enhance profitability, build new products, minimize exposure to risk, and deepen customer relationships. But it has to be expertly leveraged. That begins with clear vision and defined expectations and, of course, thoughtful planning. Yet as we’ve just described, your ultimate success depends on transforming vision and planning into action. For that, you’ll need the expertise of experienced data practitioners. When it comes to execution and implementation, they’re the ones who make sure all the holes line up.