The vast majority of Business Intelligence and Analytics solutions in place today operate on out-of-date, backwards-looking historical data.

When you are operating on strategic timeframes and asking long-term questions, e.g. about last quarter's or last week's sales data, this is absolutely fine.  The benefits of analysing up-to-the-minute data are minimal.

However, there are many scenarios where businesses are looking to move towards much more real-time solutions, e.g.:

  • Using data for operational purposes, for instance guiding employees' "next best action" continually throughout the day;
  • Real-time monitoring of critical KPIs and indicators in order to identify problems and opportunities early;
  • Optimising or facilitating the customer experience in real time;
  • Security, compliance, regulatory or safety controls where the business is at risk due to slow data processing.

In all of these situations, the value of data decays over time.  The earlier we can get it into the hands of our employees, algorithms and customers, the better - even down to millisecond granularity.

The Challenges Associated With Real-Time Data

Moving from traditional Business Intelligence towards more real-time processing is, however, a challenging technical problem to solve.

Much of the data and analytics world is based on centralised data warehouses, with infrequent batch ETL jobs loading data on an hourly or even daily basis.  Once the data is loaded, the main consumption model is through reports and dashboards, which may be accessed only infrequently by senior leaders in the business.
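To make the batch model concrete, here is a minimal sketch of a scheduled ETL load, using Python's built-in sqlite3 as a stand-in for the warehouse; the table and rows are illustrative assumptions, not a real pipeline.

```python
import sqlite3

def run_batch_load(conn, rows):
    """Load a batch of (order_id, amount) rows in one transaction.

    In the traditional model this runs hourly or nightly, so the
    warehouse is always at least one batch interval behind reality.
    """
    with conn:
        conn.executemany(
            "INSERT INTO orders (order_id, amount) VALUES (?, ?)", rows
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")

# All rows collected since the last scheduled run, loaded in one go.
run_batch_load(conn, [(1, 9.99), (2, 24.50), (3, 5.00)])

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

The data only becomes queryable after the whole batch lands, which is exactly the latency that streaming architectures set out to remove.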

Businesses need to modernise from this situation towards "streaming" events in real time, where they are brought into centralised data lakes and stream processors so they can be analysed and processed proactively and automatically.
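The streaming model can be sketched as follows - a minimal in-process pipeline rather than a real broker such as Kafka, with an illustrative alerting threshold - where each event is handled the moment it arrives instead of waiting for the next batch load:

```python
from typing import Callable, Iterable

def process_stream(events: Iterable[dict], handler: Callable[[dict], None]) -> int:
    """React to each event as it arrives; returns the number processed."""
    count = 0
    for event in events:      # in production, a subscription to a broker topic
        handler(event)        # react immediately: alert, update a live view, etc.
        count += 1
    return count

alerts = []

def handler(event):
    # Illustrative real-time rule: flag unusually large orders straight away.
    if event["amount"] > 100:
        alerts.append(event["order_id"])

events = [
    {"order_id": 1, "amount": 20.0},
    {"order_id": 2, "amount": 150.0},
    {"order_id": 3, "amount": 75.0},
]
processed = process_stream(events, handler)
```

The key difference from the batch sketch is that the reaction happens per event, so the large order is flagged within milliseconds of arriving rather than after the next scheduled load.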

Many scenarios also need to be very accurate, in some instances requiring exactly-once processing such that we never lose or double-process a message.  To achieve this, every part of the technology stack needs to be resilient to failure.
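One common way to approximate exactly-once behaviour is at-least-once delivery combined with idempotent processing: the consumer deduplicates on a message ID, so a redelivered message is never double-processed. A minimal sketch, with illustrative message shapes and an in-memory ID set standing in for durable storage:

```python
class IdempotentConsumer:
    def __init__(self):
        self.seen_ids = set()   # in production, durable (e.g. a database table)
        self.total = 0.0

    def handle(self, message: dict) -> bool:
        """Process a message at most once; return False for duplicates."""
        if message["id"] in self.seen_ids:
            return False        # duplicate delivery, safely ignored
        self.seen_ids.add(message["id"])
        self.total += message["amount"]
        return True

consumer = IdempotentConsumer()

# After a failure, the broker redelivers message "a": at-least-once delivery.
results = [consumer.handle(m) for m in (
    {"id": "a", "amount": 10.0},
    {"id": "b", "amount": 5.0},
    {"id": "a", "amount": 10.0},   # redelivered duplicate
)]
```

For this to survive crashes, the deduplication state and the processing side effect must be committed atomically, which is why every layer of the stack has to be reliable.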

The analytics we need to perform over real-time data can be complex in nature. For instance, we might need to aggregate data and ask questions across data streams, across time windows, and spanning both historical and real-time data. We also need to deal with edge scenarios such as errors, anomalies, and duplicate or late-arriving data.
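A minimal sketch of one such aggregation - a tumbling window over event time that tolerates out-of-order and duplicate events - where the 60-second window size and event fields are illustrative assumptions:

```python
from collections import defaultdict

WINDOW_SECONDS = 60

def aggregate(events):
    """Sum amounts per 60-second window of event time, dropping duplicates."""
    seen = set()
    windows = defaultdict(float)
    for event in events:
        if event["id"] in seen:    # duplicate: count it only once
            continue
        seen.add(event["id"])
        # Bucket by *event* time, so late arrivals land in the right window.
        window_start = event["ts"] - event["ts"] % WINDOW_SECONDS
        windows[window_start] += event["amount"]
    return dict(windows)

events = [
    {"id": "e1", "ts": 10, "amount": 1.0},
    {"id": "e2", "ts": 70, "amount": 2.0},
    {"id": "e3", "ts": 30, "amount": 3.0},   # arrives late, after e2
    {"id": "e2", "ts": 70, "amount": 2.0},   # duplicate delivery
]
result = aggregate(events)
```

Real stream processors also need a policy for how long to keep windows open for late data (often called a watermark), which this sketch sidesteps by holding all windows in memory.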

Individually, all of these problems are solvable, but deploying real-time data processing with good performance and reliability, while providing the type of complex analytics we need, can be a large undertaking.

Why Now?

Companies across industries are investing in becoming more data-driven, using their data more intelligently and in real time to improve their customer experience and business efficiency.  Well-trodden case studies include technology companies such as Google, Amazon, Uber and Netflix, who use their vast quantities of data to offer compelling digital experiences.  Companies that don't make this leap will, over time, likely fall behind in terms of customer experience and operational efficiency.

As this is a complex technology modernisation journey, we believe it is worthwhile starting early.  Look to implement one or two streaming use cases in the business, create a microservice to drive some real-time change, and offer your employees a taste of real-time data in the user experience.  The benefits will become apparent as soon as a new feature or two hits production.


© 2022 Timeflow Academy.