Every day we see the impact of real-time analytics and personalization on modern business. Mobile has been the ultimate catalyst, driving new customer experiences that are tailored to the individual, as data fuels a race to provide more value than competitors do.
The emergence of the open data economy is enabling access to entirely new business opportunities, but there are misconceptions around what it means to truly implement a real-time solution. I wanted to use this post to take a brief look at what it really takes to compete in this landscape where milliseconds matter, and expose some of the myths around the use of the term “real-time.”
Let’s say you are looking at making your operational business processes smarter. Your team has targeted two main battlefields. The first is identifying customer needs on the fly, so you can optimize services and present relevant cross-sell/up-sell offers at the moment of engagement. The second is upgrading your payment systems with more advanced analytics that detects and rejects fraudulent, suspicious, or simply improper payment requests while in flight—without interrupting service unnecessarily.
After researching the required capabilities, you decide to integrate predictive analytics into your core operational systems, with a vision of using these models to score transactions as they occur. What many don’t realize when selecting their analytics applications is that there can be limitations based on the system on which they deploy those applications.
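To make the idea of scoring transactions as they occur concrete, here is a minimal sketch. The model, feature names, and threshold are all hypothetical stand-ins, not anything described in this post; the point is only that the score is computed inline, before the transaction is committed.

```python
# Hypothetical in-flight scoring sketch. The feature names, weights, and
# threshold below are illustrative assumptions, not a real fraud model.

class ThresholdModel:
    """Stand-in for a trained predictive model (e.g., a linear scorer)."""

    def __init__(self, weights, bias=0.0):
        self.weights = weights
        self.bias = bias

    def predict(self, features):
        # Weighted sum of features plus bias = risk score.
        return sum(w * f for w, f in zip(self.weights, features)) + self.bias


def score_transaction(txn, model):
    """Score one transaction dict while it is still in flight."""
    features = [txn["amount"], txn["merchant_risk"], txn["velocity"]]
    return model.predict(features)


model = ThresholdModel(weights=[0.001, 0.5, 0.2])
txn = {"amount": 250.0, "merchant_risk": 0.8, "velocity": 3}

risk = score_transaction(txn, model)           # 0.25 + 0.40 + 0.60 = 1.25
decision = "reject" if risk > 1.0 else "approve"
```

Because the score is produced before the payment completes, the system can reject or approve in the same code path that processes the transaction, rather than flagging it after the fact.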
The Illusion of Real-Time
It’s very fashionable to hang the real-time tag on just about everything these days. But if a solution requires data to be copied and moved so the analytics can be invoked on some other platform, the result will be suboptimal, because the analysis is being performed on stale data. You can have the best strategy and the most advanced tools in place, but if you’re working with data that isn’t live, you’re not getting the best result. And if you can’t integrate analytics at the transaction level while the transaction is in flight, it’s like trying to stop fraud after the transaction has already happened, or suggesting a product to an individual after they have already left your site. These limitations can stifle innovation and leave competitors with an open lane to provide better-quality service on the back of better customer insight.
So Why Does This Happen?
It all comes down to the ability to do in-transaction analytics processing. Many projects with a vision of real-time analytics and processing just can’t process the data fast enough because they don’t take infrastructure into account when the solution is architected. If you are not running your analytics in the same location as your data, you have to move it and process the analysis as a remote call. The extra time that you lose when data has to be moved and processed off-platform has a very real cost, especially on projects that need truly real-time responsiveness at their core.
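The off-platform cost described above can be sketched as a simple latency budget: the same scoring work, invoked locally versus as a remote call that adds network and serialization overhead. All of the numbers below are hypothetical, purely to illustrate where the milliseconds go.

```python
# Illustrative latency budget for one transaction. These figures are
# assumptions for the sake of the sketch, not measured values.

SCORING_MS = 2.0        # time to run the model itself
NETWORK_RTT_MS = 15.0   # round trip to an off-platform analytics service
SERIALIZATION_MS = 3.0  # copying/encoding the transaction for transport


def local_latency_ms():
    """Analytics co-located with the data: just the scoring cost."""
    return SCORING_MS


def remote_latency_ms():
    """Analytics on another platform: scoring plus movement overhead."""
    return SCORING_MS + NETWORK_RTT_MS + SERIALIZATION_MS


overhead_ms = remote_latency_ms() - local_latency_ms()  # per-transaction tax
```

At these (assumed) figures the remote path pays an 18 ms tax on every transaction; at thousands of transactions per second, that overhead, not the model, becomes the bottleneck.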
And lack of speed is not the only cost of moving your data somewhere else in order to process it; your security is also threatened when a system requires you to take data off-platform for analysis. Disaster recovery, business continuity, and data integrity all come under pressure, as you are essentially losing control of the foundations of your project.
These pitfalls stem from an outdated understanding of the options available for matching functional capabilities to the underlying IT infrastructure. We need to have frank discussions about the limitations of executing analytics outside a company’s core systems, and about the reality of what it takes to deliver a competitive real-time solution.
The Right Approach: Move Analytics to the Data, Not the Other Way Around
When it comes down to it, every millisecond matters when injecting intelligence into operational systems. Look for ways to leverage your technology to perform transaction analysis in flight. The key when designing a project is to locate the transactional data at the heart of it, understand the current SLAs for the operations associated with that data, and then ensure the added processing will not violate those SLAs. It’s the clearest way to determine whether your analytics will run truly real-time or only sort-of real-time, on fresh and accurate data or on aged data.
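The SLA check just described can be expressed as a small headroom calculation: given an operation’s SLA ceiling and its current latency, how many milliseconds remain for in-flight analytics? The figures and the safety margin below are hypothetical assumptions, not recommendations.

```python
# Sketch of the SLA headroom check. SLA ceiling, current latency, and
# safety margin are illustrative numbers, not guidance.

def analytics_budget_ms(sla_ms, current_p99_ms, safety_margin_ms=5.0):
    """Milliseconds available for added in-transaction processing."""
    return max(0.0, sla_ms - current_p99_ms - safety_margin_ms)


def fits_sla(sla_ms, current_p99_ms, scoring_ms, safety_margin_ms=5.0):
    """True if the added scoring work stays within the SLA budget."""
    return scoring_ms <= analytics_budget_ms(
        sla_ms, current_p99_ms, safety_margin_ms
    )


# Example: 100 ms SLA, operations currently at 80 ms p99 -> 15 ms budget.
budget = analytics_budget_ms(sla_ms=100.0, current_p99_ms=80.0)
ok = fits_sla(sla_ms=100.0, current_p99_ms=80.0, scoring_ms=12.0)
```

Running this check before committing to an architecture makes the "truly real-time vs. sort-of real-time" question a measurable one: either the scoring fits inside the existing SLA budget, or it doesn’t.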
Infrastructure can make or break any analytics project—look to integrate your analytics processing directly on the same infrastructure as your core operational systems in order to save those milliseconds!
It’s all about knowing your options so you can understand the value you will be able to provide the business today and down the line. For more information, I recommend the “Decision Management Solutions” paper available below, and would love to discuss it further in the comments section provided there.
About the Author: Paul DiMarzio
Paul has 30+ years of experience with IBM focused on bringing new and emerging technologies to the mainframe. He is currently responsible for developing and executing IBM’s worldwide z Systems big data and analytics portfolio marketing strategy.