Summary: Moving from EUCs to structured, automated data processes, we walk you through the approach, best practices and change management needed to leverage analysts’ skill sets to the fullest.
Since its introduction in 1985, Excel has been the go-to resource in an analyst’s toolkit due to its versatility and capacity to quickly translate complex formulas into illustrative tables and charts. Over the ensuing years, a typical financial services analyst would build elaborate, interwoven spreadsheets, simple models, assumptions and calculations, all in Excel, creating end-user computing (EUC) applications. And while these EUCs laid the foundation for modern automation, they generally lacked traceability and documentation, making model adjustments challenging and creating serious problems for governance, risk and regulatory compliance functions.
With the passage of time, these EUCs would become more entrenched in business practices and more prone to errors. They often contained numerous dependencies and touch points with data repositories and reporting tools, making even small changes in data or logic cumbersome: a single edit could set off a domino effect, forcing analysts to spend time checking – and double-checking – that changes were reflected throughout the various sheets. This level of complexity and executional nuance often led to key-person dependencies and lost productivity. And as organizations grew, the obstacles multiplied to the point where relying on Excel and manual processes was no longer sustainable.
As times have changed and the amount of available data has increased, the need for data analysis has exploded. Organizations have become adept at translating data into insights that drive decision-making and growth. And unlike their predecessors, high performers are focused on analyzing the data and generating insights instead of cranking through Excel operations to gather the necessary pieces of information. This increased agility is a direct result of process automation.
Key Motivators for Process Automation
While manual systems work to a point, organizations typically turn to automated processes for a few key reasons. Often, there’s a regulatory motivation: tools like Excel create governance challenges because they lack traceability, documentation and safeguards against manual error.
In other instances, it’s a simple resource issue. End-to-end automation of critical processes saves time and eliminates taxing dual-control exercises that require additional resources and often cause team member attrition. These systems are more scalable – and adept at adding new functionality – condensing go-to-market timeframes significantly. And because of their capacity to self-run, high-growth organizations are able to redeploy build teams to automate new areas of the business.
Developing a Deeper Understanding of the Business Intent
In order to design the high-level requirements of an automation project, build teams need to understand existing processes at a granular level, with the goal of determining whether to automate, eliminate or simplify. The best place to begin collecting this information is often through the analysts themselves. Empathy interviews are a useful tool for identifying process pain points. Teams may also shadow business analysts performing various processes, which can lead to documentation refinement as well as new system requirements to better meet the needs and objectives of the organization.
Once the underlying business intent is understood, teams must delve into the processes themselves. Analysts begin by diving into major process step documentation. But the learnings shouldn’t end there. It’s important to invest time in understanding and cataloging each sub-process step, and then rationalizing it within the context of its overall impact on key metrics (remember your mantra: automate, eliminate or simplify).
As a best practice, for each sub-process, try documenting the business logic in easy-to-understand language and pseudocode so that the information can serve as a reference point going forward. Focus on documenting dependencies and the handoff mechanisms between processes. Additionally, attempt to capture failure modes for each of the processes.
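As an illustration, here is one way a hypothetical sub-process might be documented this way – the function, its inputs and the reconciliation rule are invented for the example, but the docstring shows how plain-language logic, dependencies and failure modes can live alongside the code:

```python
def reconcile_positions(trades, custodian_feed):
    """Sub-process: position reconciliation (hypothetical example).

    Business logic (plain language):
        For each account, sum internal trade amounts and compare against
        the custodian feed; report any account where the two disagree,
        along with the size of the break.

    Dependencies / handoffs:
        Upstream: trade-capture extract; custodian file drop.
        Downstream: exceptions report consumed by the ops dashboard.

    Failure modes:
        Missing custodian file; duplicate trade IDs; stale feed date.
    """
    internal = {}
    for account, amount in trades:
        internal[account] = internal.get(account, 0) + amount
    # Return only the breaks: account -> (internal minus custodian)
    return {acct: internal.get(acct, 0) - fed
            for acct, fed in custodian_feed.items()
            if internal.get(acct, 0) != fed}

reconcile_positions([("A", 10), ("A", 5)], {"A": 15, "B": 2})
# Account A matches; account B shows a break of -2.
```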
Design and Build the Solution
Once you have an understanding of where you’ve been and where you’re headed, develop a future-state process flow and data flow diagram. During this process, it’s important to separate data inputs from core functionality logic. Similarly, work to preserve the integrity of the outputs by using the software development principles of reduce, reuse and recycle wherever possible. That means avoiding hardcoding and leveraging parameters to drive process changes. Try employing modularity – small functions with single responsibilities – which encourages reuse.
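A minimal sketch of what this can look like in Python – the `ReportParams` names and values are hypothetical – but note that nothing is hardcoded inside the functions: parameters drive the behavior, and each function has a single responsibility, which keeps it reusable as the process changes:

```python
from dataclasses import dataclass

@dataclass
class ReportParams:
    """All process-specific values live here, not in the logic."""
    region: str
    quarter: str
    fx_rate: float  # hypothetical parameter instead of a hardcoded constant

def convert_to_usd(amount: float, params: ReportParams) -> float:
    """Single responsibility: currency conversion only."""
    return amount * params.fx_rate

def label_report(params: ReportParams) -> str:
    """Single responsibility: naming only."""
    return f"{params.region}-{params.quarter}"

params = ReportParams(region="EMEA", quarter="2024Q1", fx_rate=1.25)
convert_to_usd(100.0, params)  # 125.0
label_report(params)           # "EMEA-2024Q1"
```

Changing the region, quarter or rate now means changing a parameter, not editing logic buried in a formula.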
As you build, attempt to make all functions flexible and extensible, in an effort to accommodate future business process changes. Don’t be afraid to challenge current methodology if it reduces complexity and ultimately creates more value. The goal is to consolidate documentation, scripts, past results and business requirements all in a single repository, while in a modular form that preserves flexibility and transparency for future innovators. Most importantly, as you go through this process, establish an iterative feedback loop with the customer, enabling your team to deliver shippable features for each sprint.
As you work to orchestrate homogeneous components, develop a lightweight wrapper application that enables configuration-driven execution. At its core, the main script should call other scripts based on the configuration file. Once this skeletal program is developed, create a testing framework that includes end-to-end, regression and unit tests. The intelligence from the framework will help drive the development process and build organizational confidence, which can be slow to develop with the introduction of automated insights – particularly in a historically manual environment.
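One possible sketch of such a wrapper in Python – the step names and the registry pattern are illustrative assumptions, not a prescription. Here the “scripts” are registered functions for simplicity; in practice each step might shell out to a separate script instead:

```python
import json

# Hypothetical step registry: each step is a small, single-purpose callable.
REGISTRY = {}

def step(name):
    """Decorator that registers a step under a configuration name."""
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@step("extract")
def extract(ctx):
    ctx["rows"] = [1, 2, 3]  # stand-in for a real data pull

@step("transform")
def transform(ctx):
    ctx["rows"] = [r * 10 for r in ctx["rows"]]

def run_pipeline(config_json: str) -> dict:
    """Main script: call each step named in the config, in order."""
    config = json.loads(config_json)
    ctx = {}
    for name in config["steps"]:
        REGISTRY[name](ctx)
    return ctx

result = run_pipeline('{"steps": ["extract", "transform"]}')
# result["rows"] == [10, 20, 30]
```

Reordering, adding or removing steps is now a configuration change rather than a code change, which is exactly what makes the skeleton easy to test end to end.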
Next, using the existing framework, develop each individual data extraction and transformation script to create the input data abstraction layer. Abstracting the data sources from downstream usages makes changes in data or logic easy to absorb without disruption to critical processes (unlike their EUC ancestors). Work toward simplifying outputs and standardizing output formats, which will streamline reporting setups in Tableau and other tools.
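One way to sketch this input abstraction in Python – the source formats and the `exposure` field are hypothetical – is to normalize every source into the same in-memory shape, so downstream logic never touches the raw format:

```python
import csv
import io

# Hypothetical abstraction layer: whether data arrives as a CSV export or
# database-style records, downstream code sees a uniform list of dicts.

def from_csv(text: str) -> list:
    """Adapt a CSV source into the common row format."""
    return list(csv.DictReader(io.StringIO(text)))

def from_records(records: list, columns: list) -> list:
    """Adapt (tuple, column-name) records into the same common format."""
    return [dict(zip(columns, row)) for row in records]

def total_exposure(rows: list) -> float:
    """Downstream logic: consumes the abstraction, not the source."""
    return sum(float(r["exposure"]) for r in rows)

csv_rows = from_csv("desk,exposure\nrates,100.5\nfx,50.0\n")
db_rows = from_records([("rates", 100.5), ("fx", 50.0)], ["desk", "exposure"])
total_exposure(csv_rows)  # 150.5
total_exposure(db_rows)   # 150.5 – same answer regardless of source
```

Swapping a data source then means writing one new adapter, with no disruption to the critical downstream processes.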
If the individual modules of the overall process are heterogeneous in nature, then you may want to consider an open-source orchestrator like Apache Airflow. Once deployed, system users can set dependencies and triggers to enable end-to-end execution of the entire process, which can be scheduled to kick off automatically or run on demand, depending on business needs.
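As a configuration sketch of what this might look like, here is a minimal Airflow DAG. The DAG name, schedule and script names are placeholders, and the fragment assumes an Airflow 2.x deployment, so it is not runnable standalone:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG wiring three heterogeneous steps together; the bash
# commands stand in for your own extraction, transformation and reporting jobs.
with DAG(
    dag_id="eod_reporting",       # assumed name for illustration
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # kick off automatically each day
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="python extract.py")
    transform = BashOperator(task_id="transform", bash_command="python transform.py")
    report = BashOperator(task_id="report", bash_command="python report.py")

    # Dependencies: extract must succeed before transform, then report.
    extract >> transform >> report
```

The `>>` operator expresses the dependencies, and the scheduler handles triggering, retries and monitoring of the end-to-end run.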
Help Users Adapt to the Solution
Building these systems is a challenge unto itself, but the journey doesn’t end there. You should expect some level of change pain as you transition from an EUC/Excel world to a structured execution system. Try embedding business analysts – ideally the same ones who drove your design and implementation – as product owners to ensure understanding and adoption. Utilizing this team as experts and teachers will also enable you to train other business analysts and communicate the value proposition of automation.
In terms of maintaining and enhancing the system, successful automation will require analysts to develop additional technical skills in SQL and Python, as well as a comfort in open-source tools like Apache Airflow. This is an opportunity to upskill analytics teams, and management should provide the necessary time, space and incentives to encourage the jump toward process improvement through more sustainable technologies.
While the initial setup can be time consuming, simple process automation will free analysts from the cumbersome task of maintaining EUC/Excel systems. If you need help designing, developing and/or delivering a simple platform-agnostic, configuration-driven, Python-based application for your business analysts, Flying Phase can help. Click here to get in touch with our team.