Real-time data is fast becoming a critical data source for enterprises, but to reap the benefits they must update legacy systems and devices while ensuring data governance and quality.
In today’s data-driven competitive landscape, many enterprises are reaping the benefits of analyzing large stores of information, much of it kept in data lakes or warehouses. As analytical tools become more sophisticated, though, both enterprises and their customers are demanding immediate insights to maximize the value of their data.
“It’s the new standard,” says Juan Vitantonio, Director, Head of Group BI and Visualization at HSBC. “Once you have it you realize that without it, you’re going to be behind the rest. Real-time data availability is the way forward. We’re all moving into the cloud, so we are all competing for more accurate, realistic, and timely data.”
“This is something that everyone wants to move towards,” agrees Gladwin Mendez, former Data and Technology Operations Officer at investment manager Fisher Funds and current Advisory Board Member at Corinium.
“I don’t think you’ll find any organization right now that is thinking of the old batch way of doing things.” The old way typically only allows enterprises to act on insights hours or days after collecting their data, he adds.
“For example, as data came through, you may have done a batch process overnight, then pushed those into your data warehouse, that’s another 24 hours. The data warehouse ran its analysis or predictive models and then sent out an interaction 48 hours later. That way of interacting with your customers is just way too late.”
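The latency gap Mendez describes can be sketched in code. The figures and event data below are illustrative only, not drawn from any real pipeline: a batch flow pays the overnight ingest window plus the warehouse cycle before any insight exists, while a streaming flow acts on each event as it arrives.

```python
# Illustrative contrast between batch and streaming latency.
# All numbers and event data are hypothetical.

events = [("customer_a", 120.0), ("customer_b", 85.5), ("customer_c", 430.0)]

def batch_pipeline(events):
    # Batch: accumulate everything, process once the nightly window closes,
    # then let the warehouse run its models — roughly 24h + 24h before
    # any customer interaction, as described in the quote above.
    total = sum(amount for _, amount in events)
    return {"total_spend": total, "latency_hours": 24 + 24}

def streaming_pipeline(events):
    # Streaming: update the running picture as each event arrives,
    # so an insight is available within moments of the event itself.
    running_total, snapshots = 0.0, []
    for customer, amount in events:
        running_total += amount
        snapshots.append((customer, running_total))  # insight available now
    return snapshots

print(batch_pipeline(events))      # one answer, two days late
print(streaming_pipeline(events))  # an answer after every event
```

The point is not the arithmetic but the shape of the loop: in the streaming version the decision point sits inside event arrival, not after a scheduled job.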
Depending on the business, collecting data that is ready to be processed in real time may require an overhaul of sluggish internal processes, making large-scale investments to equip devices with sensors, improving data quality, or overcoming issues with legacy systems.
A New Way of Doing Business
For any mature financial institution, gathering data is no longer the major hurdle to producing real-time insights, says Vitantonio.
“I think it’s the other way around: we collect too much data,” he says. “Our challenge is to understand the scope and how we want to use that data.”
“We perhaps use 50 to 60% of the data that we collect in a way that we could get insights out of it and make decisions faster,” he adds. “So, it’s about how you process it and how you make it available in a way that can be consumed by users.”
Bell and Howell, a US-based manufacturer of automated machinery that provides remote monitoring and click-and-collect services, also has a firm grasp on collecting and processing data in real time. The company began building an IoT-enabled system for real-time monitoring of machines five years ago.
It collects data from manufacturing equipment that enables it to prevent and repair faults remotely, or else save technicians’ valuable time with up-to-the-minute insights when they do need to fix an issue in person.
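A minimal sketch of the kind of rule such a remote-monitoring pipeline might apply is below. The metric names and thresholds are invented for illustration and are not Bell and Howell's actual telemetry: the idea is simply that each incoming reading is checked against safe operating ranges so a technician can be alerted before a fault develops.

```python
# Hypothetical real-time check on machine telemetry: flag any reading
# outside its safe range. Metric names and limits are illustrative only.

THRESHOLDS = {
    "motor_temp_c": (10.0, 80.0),    # safe operating temperature range
    "vibration_mm_s": (0.0, 7.1),    # safe vibration velocity range
}

def check_reading(reading):
    """Return alert strings for any metric outside its safe range."""
    alerts = []
    for metric, (low, high) in THRESHOLDS.items():
        value = reading.get(metric)
        if value is not None and not (low <= value <= high):
            alerts.append(
                f"{reading['machine_id']}: {metric}={value} outside [{low}, {high}]"
            )
    return alerts

# Simulated stream of readings from two machines
stream = [
    {"machine_id": "M-101", "motor_temp_c": 72.4, "vibration_mm_s": 3.2},
    {"machine_id": "M-102", "motor_temp_c": 91.0, "vibration_mm_s": 2.8},
]

for reading in stream:
    for alert in check_reading(reading):
        print(alert)  # in practice this would page a technician or open a ticket
```

Real deployments would layer predictive models on top of simple thresholds, which is exactly where the trust problem Abbu describes below comes in.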
“It’s a new way of doing business, using digital technology to innovate your service business model,” says Haroon Abbu, who leads Digital, Data and Analytics at the company. “And because we have the cloud infrastructure, we can scale it easily.”
One key challenge that Abbu faced in scaling up real-time analytics was convincing technicians that it was a good idea. “To do things in real time requires a big culture shift,” he says.
“Putting together the technology that is required to do real-time monitoring was manageable but getting technicians to trust the models that we developed was harder.”
For enterprises pursuing an IoT-driven real-time data strategy, another potential challenge is the need to update machinery and devices. Newer equipment comes readily equipped with smart technology and sensors that can connect with existing data architecture, but it is expensive. Older devices can be retrofitted with sensors, but this too is costly and potentially time-consuming.
“If you want to do real-time data on your legacy installed base, you cannot really do it without spending a lot of money to instrument them,” Abbu says. “You have to ask yourself: is it worth retrofitting old equipment with sensors, so we can make it smart enough to get the data we need? Or should those machines be replaced with newer, smarter, connected devices? And if so, what is the ROI?”
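Abbu's retrofit-or-replace question can be framed as a simple break-even comparison. The article gives no real figures, so every number below is a hypothetical placeholder; the sketch only shows the shape of the calculation.

```python
# Back-of-the-envelope retrofit vs replace comparison.
# All costs and benefits are invented for illustration.

def simple_roi(upfront_cost, annual_benefit, years):
    """ROI as (total benefit - upfront cost) / upfront cost over the horizon."""
    return (annual_benefit * years - upfront_cost) / upfront_cost

# Instrument an old machine with sensors: cheaper up front, smaller payoff
retrofit = simple_roi(upfront_cost=15_000, annual_benefit=6_000, years=5)

# Replace with a new connected machine: larger outlay, larger payoff
replace = simple_roi(upfront_cost=60_000, annual_benefit=14_000, years=5)

print(f"retrofit ROI: {retrofit:.2f}")
print(f"replace ROI:  {replace:.2f}")
```

With these placeholder numbers retrofitting wins, but the answer flips as machine lifetimes, downtime costs, and sensor prices change, which is why Abbu frames it as a question each enterprise must work through rather than a rule.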
Startups Have an Advantage Over Established Enterprises
Because existing enterprises are encumbered by legacy systems, entrepreneurs have an opportunity to disrupt markets by starting companies that are primed to exploit real-time data.
“Legacy systems are always a challenge for any long-term organization,” says Mendez. “They weren’t really built for real time back in the day. That is of course unless you’re a fresh startup company, fully SaaS, with systems that were built for real-time APIs and interactions.”
But companies, new or old, that fail to ensure their insights are based on high-quality data will find their challenges compounded as they try to move to real time.
“If you don’t have your foundations right, baking in data quality by design to start with, then you’re going to be in a world of pain trying to retrospectively fix that information later,” says Mendez. “I’ve been involved in data remediations and transformations in the past where we’ve tried to uplift data quality during a data migration, and you’re in a world of pain.”
Another issue enterprises will face as they begin implementing real-time analytics is a lack of direct communication between data executives and senior leadership.
As the recent State of Data and Analytics ANZ report showed again this year, many data leaders still do not report directly to their CEO. “This still frustrates me, and I think we are where the CIO role was 10 years ago,” Mendez says. “I know it hinders effective messaging about data at the board table.”