In our new, all-digital world, becoming fully analytics-driven is the baseline survival criterion for banks. But is that an achievable goal for all financial institutions? And, if it is, what is the roadmap to a single analytics-driven system for pricing, personalization, and credit decisioning?

When banks turn to us to launch them on the path to becoming an analytical, data-driven, fully automated organization, we look at several parameters.

The hidden treasure

Globally unprecedented unemployment rates have introduced banks to a new level of uncertainty in financial risk prediction.

Meanwhile, studies show that personality traits and scores derived from psychometric questionnaires, gamification, or analysis of social data can be predictive of risk behavior and can help detect a person's intention to pay – or how likely they are to expose themselves to financial risk.

More than that, all banks have data – and more data than they know they have, in most cases – on their clients’ spending and saving preferences, and increasingly also on their everyday habits and lifestyles. The most advanced banks perform experiments that augment their data pool by adopting and using the right technology, along with a test-and-learn mentality and strategy.

Such experiments can help institutions generate new offers that are perfectly tailored to individual customer needs. They can also be instrumental in assessing the effectiveness of different communication methods or identifying changes in customers’ risk profiles, preferences, and appetite.

Well-designed experiments can be an effective and reliable source for quick learnings, adjustments, and growth. Working with a system that supports randomized experiments natively and seamlessly is a competitive edge that can make a difference.
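To make the idea concrete, here is a minimal sketch of how a randomized experiment might be wired up: customers are assigned to test arms by a stable hash (so repeat visits see the same variant), and acceptance rates are compared once outcomes accrue. The function and variant names are illustrative assumptions, not a product API.

```python
import hashlib

def assign_variant(customer_id: str, variants=("control", "new_offer")):
    # Hashing the customer id gives a stable, pseudo-random arm assignment
    # with no state to store: the same customer always sees the same variant.
    digest = hashlib.md5(customer_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def acceptance_rate(outcomes):
    # outcomes is a list of 0/1 accept flags for one test arm
    return sum(outcomes) / len(outcomes)

# Compare arms once enough outcomes have accrued (toy numbers)
uplift = acceptance_rate([1, 1, 0, 1]) - acceptance_rate([1, 0, 0, 1])
```

Deterministic hashing, rather than a random draw per visit, is what makes the experiment reproducible and auditable after the fact.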

Data collection

The means of data collection change right along with the organization. That said, a systemized analytical process is easily the best enabler of faster and more accurate data collection. Here are some pitfalls you may want to watch out for when gathering your pool of data:

  1. While you should strive for improved data quality, forget about perfect data, and definitely don’t wait for it. Data will never be perfect. Instead, use the data you have and augment it with quick and agile experimentation.
  2. Beware of paralysis by analysis. Too much data is a double-edged sword: it can be a treasure trove or a source of confusion and indecision. Always remember – the key is to ask the right questions.
  3. Enable your stakeholders in the business, pricing, and analytics departments (among others) to generate and access data-driven insights with little to no IT friction.

Flexible modeling framework

To manage the wealth of data banks can accumulate, they need a modeling framework with predictive & machine learning models capable of dealing with data volume and variety. This modeling framework must be backed by the appropriate technology stack. Nurturing an organization into using the right modeling approach is crucial for the health of its analytical framework.

When building a flexible modeling framework, follow these guiding principles:

  1. Address the key questions at hand and do not be bogged down with nitty-gritty details or model perfection.
  2. Every sock has a slipper. Tailor the right model to the right problem, and remember that sometimes simple is best, especially when reality keeps changing by the day or when there is a need for increased transparency.
  3. Stress-test your models and assumptions and check the soundness of the predicted outcomes. Especially when there are too many unknowns, it is important to run and quantify many scenarios to help inform key decisions.
  4. Test and learn. Put together a framework for learning from new data that enables your system to adjust your models quickly.
  5. Make sure you engage the business throughout the process – especially in volatile or uncertain times.
  6. Finally, avoid the last-mile problem. When you develop models, make sure you understand who will use them and how they will be used in production, and build your models accordingly.
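The third principle – running and quantifying many scenarios rather than betting on one point forecast – can be sketched in a few lines. This toy Monte Carlo stress test shocks a baseline probability of default (PD) with an uncertain unemployment scenario; the 15% uplift per point of unemployment, the LGD, and the EAD are illustrative assumptions, not calibrated parameters.

```python
import random
import statistics

random.seed(7)

def expected_loss(pd_base, unemployment_shock):
    # Toy link: each extra point of unemployment lifts PD by 15%, capped at 1.0.
    pd = min(1.0, pd_base * (1 + 0.15 * unemployment_shock))
    lgd, ead = 0.45, 10_000  # loss given default, exposure at default (illustrative)
    return pd * lgd * ead

# Quantify many scenarios: draw the shock severity from a range of unknowns
losses = [expected_loss(0.02, random.uniform(0.0, 5.0)) for _ in range(10_000)]
summary = {
    "mean": statistics.mean(losses),
    "p95": sorted(losses)[int(0.95 * len(losses))],
    "worst": max(losses),
}
```

The point is the shape of the exercise, not the numbers: a distribution of outcomes (mean, tail percentile, worst case) informs a decision far better than a single prediction.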

Mix it up

Machine learning models are excellent tools for uncovering complex relationships hidden in the data, and for providing highly accurate predictions. However, if you need to extrapolate beyond the reach of the currently available data, or draw more specific inferences about the impact of a particular variable, then machine learning models on their own might not be very effective.

Hence the need for a hybrid modeling approach: take advanced machine learning models and combine them with traditional modeling techniques like Generalized Linear Models (GLMs) or Generalized Additive Models (GAMs), which are currently in wide use by banks for pricing.

There are a few advantages to these traditional models that make them appealing these days, both for businesses and regulators: they are more transparent, they allow control over or adjustment of the direction and magnitude of key variables, and they can be easily modified when performing stress tests.
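One minimal way to picture the hybrid idea, in plain Python: a transparent linear baseline stands in for the GLM, and a single decision stump fitted to its residuals stands in for the ML component that picks up structure the straight line misses. The data and function names here are illustrative, not any particular bank's method.

```python
def fit_linear(xs, ys):
    # Ordinary least squares for one feature -- the transparent "GLM" baseline,
    # whose intercept and slope are directly inspectable.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b1 * mx, b1

def fit_stump(xs, residuals):
    # A one-split "tree" on the residuals stands in for the ML correction.
    best = (None, 0.0, 0.0, float("inf"))
    for split in sorted(set(xs))[:-1]:
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - ml) ** 2 for r in left) + sum((r - mr) ** 2 for r in right)
        if sse < best[3]:
            best = (split, ml, mr, sse)
    return best[:3]

def hybrid_predict(x, b0, b1, split, ml, mr):
    # Prediction = transparent baseline + ML residual correction
    return b0 + b1 * x + (ml if x <= split else mr)

# Toy data with a step the straight line cannot capture
xs = [1, 2, 3, 4, 5, 6]
ys = [1, 2, 3, 7, 8, 9]
b0, b1 = fit_linear(xs, ys)
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
split, ml, mr = fit_stump(xs, residuals)
```

The baseline keeps the direction and magnitude of the key variable controllable, while the residual model improves fit – in production one would use a real GLM/GAM and a gradient-boosted ensemble, but the division of labor is the same.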

Agility all the way

Analytical agility has always been important – but it became vital with the volatile market conditions that 2020 brought upon us. Banks need to quickly model a complex new reality on which they can base their business decisions.

While a complex, overarching model cannot be adjusted fast enough to react to rapidly changing circumstances, customer needs, and more, experimentation can help with finding answers to very specific problems. Experimentation can be effectively used to define the right timing for offers or communications, the next product or bundle of products to offer, or to model customer price sensitivity.
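Price sensitivity, for example, can be read straight off a simple logistic model fitted to the outcomes of a randomized price test. A minimal sketch with toy data and illustrative names, fitted by stochastic gradient ascent on the log-likelihood:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_price_response(prices, accepted, lr=0.05, epochs=3000):
    # One-feature logistic model of offer acceptance vs. price.
    # The sign and size of the slope b1 are a direct read-out of
    # price sensitivity -- no black box to interrogate.
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(prices, accepted):
            p = sigmoid(b0 + b1 * x)
            b0 += lr * (y - p)
            b1 += lr * (y - p) * x
    return b0, b1

# Toy experiment: randomized price points and observed accept/decline flags
prices = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
accepted = [1, 1, 1, 0, 0, 0]
b0, b1 = fit_price_response(prices, accepted)
```

Because each customer saw a randomized price, the fitted curve estimates a causal price response rather than a correlation – which is exactly what experimentation buys over observational data.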

Analytical proficiency

When asked in a recent Earnix survey, more than half of respondents rated their analytical proficiency as ‘medium,’ with a further 29% believing that their analytical prowess is ‘low and needs improvement.’

To fully leverage the potential of embedding advanced analytics into the business, financial institutions need a certain level of organizational analytical proficiency. Banks need to invest in educating the entire organization, including key business stakeholders & executives on the importance of data & analytics to drive business performance and operational efficiency.

It is equally important to promote close collaboration across the organization between data science, risk, product, finance, sales, marketing, and IT and to create agile cross-team processes. Along these same lines, we are starting to see banks hire dedicated teams that have the goal of maintaining strong connections between data scientists and the business.

Systemize, automate, and repeat

According to another recent Earnix survey, 37% of banking professionals estimate that it takes their bank three months or more to deploy new prices throughout their organizations.

The secret to a more agile, faster pricing process lies in automation – a goal worth pursuing for every banking organization. It is sophisticated automated processes that propel data through the bank’s system, enabling intelligent pricing decisions that are delivered immediately to the customer. Such a system translates naturally into faster time-to-market and is the key to exceptional governance.

Yet another benefit is the reduction in manual and repetitive tasks. With fewer errors, resources are freed up, and banks can refocus on what matters (like their customers’ needs). In another recent Earnix survey, nearly 30% of those asked complained about an overwhelming number of credit requests they could not handle on time.

A streamlined decision-making process, along with agile analytics, is also the way for legacy institutions to catch up with (and overtake) nimbler competitors such as challenger banks, adopt open banking solutions, and lean into this new era, which requires maximum agility and stringent governance. It will allow them to refocus on their customers instead of attending to cumbersome legacy processes.

With regulators becoming more hands-on following the numerous COVID-induced government lending programs, consumers spoilt for choice, and executives ever more vigilant, the question is not whether to systemize, but why banks haven’t started yet. Your financial institution cannot afford to wait any longer. We are living through banking’s watershed moment – and the only way out is towards systemization, automation, analytics, and data.

Come, join us on the other side.

Dr. Reuven Shnaps

As the company’s Chief Analytics Officer, Reuven Shnaps leads the Earnix Analytics organization. In his role Reuven oversees Earnix product roadmap management and directs the company’s analytics strategy, as well as research and delivery for Earnix customers worldwide. Reuven has spent more than 20 years developing and analyzing advanced statistical and economic models. Over his 10+ years at Earnix, he has gained extensive experience in crafting analytical solutions, and managing and implementing pricing & analytical projects for leading financial institutions around the globe. Prior to joining Earnix, Reuven worked in the economic and statistical consulting group at Deloitte & Touche in the United States, serving corporate clients across multiple industries. He holds a PhD and a master of arts in economics from The University of Pennsylvania and a master of arts in business economics and a bachelor of arts in economics and statistics from Bar Ilan University.