Over the last month or so, I have read numerous blogs covering forward-looking trends for 2017. While each blog has its own angle, one cannot miss the huge role being given to data analysis and analytics. That is why, at the end of last year, I started a three-part blog series on what it takes to choose the right analytical solution: a guide for those seeking a comprehensive analytical platform for making better business decisions around risk, pricing, marketing, and growth metrics.

I mentioned in my first blog, “A Solid Analytical Solution: What Matters Most?”, that there are two imperatives for building a solid analytical solution: Operational Excellence and Analytical Sophistication. In this blog, I want to delve deeper into Operational Excellence and discuss how a solid analytical solution must have Operational Excellence components woven throughout to support better enterprise decision making.

What is Operational Excellence?
Operational Excellence considers how well a solution serves the entire decision-making process it supports inside an organization. For complex decisions about how to price, market, and deliver products and services, that process spans multiple steps and involves analysts and users from different disciplines, departments, and geographies.

How does Operational Excellence support better enterprise decisions?

1. Data as Fuel

We have all heard the analogy that data is the fuel for moving businesses forward. The need to source, store, and structure data appropriately for use by analytical technologies is often understated. Operational Excellence certainly includes the ability to transform data into insight in a simple and structured manner. What does this include?

  • Data Access. Solid analytical platforms often include data management capabilities. Nothing is more frustrating for analysts, when determining the best price or offer to deliver to an end customer, than not having access to all of the risk, cost, loss, or marketing data attributes of a customer segment.
  • Data Structure. The need to search, query, and join data from both modern and legacy systems is still present. Not having a simple way to do this impedes fast-paced analytical environments.
  • Data Distribution. Distributing data for quicker processing and faster results gets analysts the fuel they need to drive their business forward. Can your analytical solution leverage distributed data sources?
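As a toy illustration of the query-and-join work described above, here is a minimal sketch using Python’s built-in sqlite3 module. The table and column names (a “legacy” policy table joined to a “modern” pricing table) are invented purely for illustration; the point is that one simple query can combine attributes from both systems for the analyst.

```python
import sqlite3

# In-memory database standing in for two source systems; all table and
# column names here are hypothetical, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_policies (policy_id TEXT, segment TEXT)")
conn.execute("CREATE TABLE pricing_data (policy_id TEXT, loss_ratio REAL)")
conn.executemany("INSERT INTO legacy_policies VALUES (?, ?)",
                 [("P-001", "auto"), ("P-002", "home")])
conn.executemany("INSERT INTO pricing_data VALUES (?, ?)",
                 [("P-001", 0.62), ("P-002", 0.48)])

# A single query joins attributes from both "systems" for the analyst.
rows = conn.execute("""
    SELECT l.policy_id, l.segment, p.loss_ratio
    FROM legacy_policies AS l
    JOIN pricing_data AS p ON p.policy_id = l.policy_id
    ORDER BY l.policy_id
""").fetchall()
for policy_id, segment, loss_ratio in rows:
    print(policy_id, segment, loss_ratio)
```

A real platform would hide this plumbing behind its data management layer, but the underlying need, joining attributes across old and new stores with a simple query, is the same.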

2. Connectivity and System Reliability

Connectivity and system reliability are commonly underappreciated until something fails in an enterprise software environment. Suddenly, they become very important. The fact of the matter is that professional product and engineering teams consider these two aspects daily. What should be considered?

  • Connectivity to adjacent solutions. From a connectivity perspective, application programming interfaces (APIs) and web services are crucial to leveraging data and information from other solutions. You can’t always push a next best offer or action decision to a front-line system without accessing CRM, BI, and/or Digital Marketing solutions. The ability to use Representational State Transfer (REST)/JavaScript Object Notation (JSON) formats to get and return attribute values from other source systems is crucial when extracting needed information from those systems.
  • Request and Return of Information. In addition to APIs and JSON formats, the SOAP (Simple Object Access Protocol) specification allows for the exchange of structured information via an XML-based messaging framework that introduces extensibility, neutrality, and independence into an analytical solution. It makes retrieving and pushing information simple for users, so other source systems can be accessed rapidly.
  • System Reliability. Uptime is important. From a system reliability perspective, technologies like Amazon Web Services (AWS), for example, can be leveraged by the software to ensure a very high uptime rate (e.g., 99.95%). Cloud environments can also be very secure, often more so than typical on-premises data environments.
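To make the REST/JSON get-and-return pattern above concrete, here is a minimal sketch using only Python’s standard json module. The endpoint is omitted and the field names (customer_id, risk_score, next_best_offer, and so on) are invented for illustration; a real integration would exchange these payloads with the source system’s API over HTTP.

```python
import json

# Hypothetical JSON payload as a CRM system's REST API might return it;
# every field name here is invented for illustration.
crm_response = json.loads("""
{
  "customer_id": "C-1001",
  "segment": "preferred",
  "attributes": {"risk_score": 0.27, "lifetime_value": 18500}
}
""")

# Extract the attribute values the pricing logic needs...
risk_score = crm_response["attributes"]["risk_score"]

# ...and build the JSON body for the decision pushed back to the
# front-line system (the offer rule is a toy example).
decision = json.dumps({
    "customer_id": crm_response["customer_id"],
    "next_best_offer": "bundle_discount" if risk_score < 0.5 else "standard",
})
print(decision)
```

The same shape applies in reverse: the analytical solution exposes its own endpoints so adjacent CRM, BI, or Digital Marketing systems can request decisions from it.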

3. Scale and User Concurrency

It’s certainly no secret that data volumes, varieties, and velocities are growing inside insurance and financial services organizations. To that end, the scale at which data can be managed and processed is a crucial component of a modern software solution. Exactly what is important?

  • Large Scale Data Processing. Leveraging open source technologies such as Apache Spark allows for lightning-fast cluster computing, which addresses the volume restrictions typically seen in non-distributed computing systems. It increases the threading, throughput, and scale of analytical software environments. Technologies like these are easy to use and allow analysts to combine SQL (Structured Query Language), streaming, and complex analytics in a single environment.
  • Machine Learning. Similarly, technologies like H2O can be used to support machine learning automation and algorithms. Models can be created and saved as plain old Java object (POJO) files, then imported back into the software for use. This increases the ability to leverage, explore, and analyze cloud-based analytical data sets.
  • User Concurrency. New data stores, application servers, and emerging technologies must be continually evaluated to ensure that growing numbers of users can work in the system simultaneously without degrading performance.
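Spark itself needs a cluster runtime, so as a single-machine stand-in, here is a minimal sketch of the partition-and-process idea behind distributed computing, using Python’s standard concurrent.futures. The claim amounts and the 1.1 “loading factor” are invented for illustration; a cluster engine like Spark applies the same split/process/combine pattern across many machines rather than one thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

def score_partition(partition):
    """Toy 'analytics' applied to one partition of claim amounts;
    the 1.1 loading factor is a hypothetical example."""
    return sum(amount * 1.1 for amount in partition)

# A dataset split into partitions, as a cluster engine would distribute
# it across nodes; here a local thread pool stands in on one machine.
claims = list(range(1, 101))
partitions = [claims[i:i + 25] for i in range(0, len(claims), 25)]

# Process the partitions concurrently and combine the partial results.
with ThreadPoolExecutor() as pool:
    total = sum(pool.map(score_partition, partitions))
print(round(total, 2))
```

The value of a cluster engine is that the partition count, data locality, and fault tolerance are managed for you, so the same analyst-level code scales from a laptop to terabytes.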

4. Comprehensiveness of the Platform

The end-to-end speed and efficiency (time to market) of an analytical solution are important.

  • All Steps Covered. When every step of an analytical decision-making process is included in one integrated system, there is no need for lengthy integration activities. All relevant parties can easily and quickly retrieve and input information, and automation occurs where possible. Additionally, audit and control processes are easy to account for, and fewer mistakes are made. This can be difficult for platforms that make heavily analytical decisions (e.g., statistics or optimization) in the process.
  • Time to Market. As a result of covering all process steps in a single platform, changes can be executed on much shorter cycles. Timeframes, from data input to decisions out in the field, often shrink from months to days. For simple changes, this time can even be hours!

Hopefully this gives you some insight into how we view Operational Excellence from a software development perspective and how important it is when choosing an analytical system. In our final post of this series, we will discuss Analytical Sophistication – and how organizations can leverage predictive and prescriptive analytical techniques to their fullest potential. We will talk automation, optimization, modelling, and more. If this is of interest to you – please stay tuned! If you’d like to talk prior to that, please contact us at Earnix!