It’s time for Slow Business Intelligence

I have been aware for a few months of the Slow Web Movement, which, like other parts of the slow movement, advocates a cultural shift toward slowing down life’s pace. More specifically, it is about getting more value and less volume from the web: receiving quality information when it is appropriate rather than being overwhelmed by a constant flood of real-time reports. Reading Jack Cheng’s Slow Web and Joshua Gross’ The Future Is Not Real Time blog posts prompted me to add these thoughts.

First, a story: I was delivering a business intelligence solution for a large European telco, and one of the ‘key requirements’ was ‘near real-time’ reporting of the response to adverts being run on television. Specifically, the marketing director wanted to know in real time how many people were prompted to ring a call centre after an advert was screened. The challenge I laid down to him was: “What are you going to do with this information? You are sitting at home at 10pm, the advert shows on TV, and your smartphone buzzes five minutes later to tell you that 50 people have rung the call centre using the promotion code.” The conversation wasn’t a smooth one (“I understand marketing – do you?”), but I pushed on: was he going to ring the TV station and have them switch to an alternative advert if the numbers were higher or lower than he wanted? Or perhaps ring the manager of the call centre and have them bring in more staff, or release some? When I pressed, he saw my point: he was going to get the information and then review it with his staff over the coming weeks to plan future campaigns. What he didn’t admit was that, to some extent, having real-time reports was about prestige among his peers in the organisation.

The cost of delivering his requirement would have been huge – developing business intelligence is still a costly exercise, and the overhead of introducing real-time requirements is massive. There are genuinely useful cases where real-time analysis is vital, but treat them as exceptions; don’t build your entire business intelligence environment to support them. And don’t allow a system to be built for prestige – always ask what the value is.

Of course this concept isn’t new – in March 2006 I presented From Volume to Value: What Next Generation Telco Data Warehouses Must Do to Provide Value to the Business which looked at reducing the volume of information produced in favour of delivering more value with that information.

One of the enduring ideas from this presentation is the use of RSS as a delivery mechanism for Business Intelligence. Building RSS feeds is easy, and that includes creating feeds with attachments. What is more, the ability to read an RSS feed is almost ubiquitous – you’ll find RSS readers on your Windows, Linux or OS X computer, on your Android and iOS handset, and in many other places besides. RSS has a rich metadata set that can optionally include categorisations, publication times, authors and many other elements that allow a user to filter effectively for what they need. A metadata-rich RSS feed that uses PDF attachments creates a widespread distribution mechanism that greatly reduces the need for software licences for managers who will rarely log in to expensive desktop tools. It puts the information directly into the users’ hands and allows them to handle it when they want or need to.
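As an illustration, here is a minimal sketch of such a feed built with Ruby’s standard rss library. The feed address, report title and file details are invented for the example, and the exact category syntax is worth verifying against the library documentation.

```ruby
require "rss"

# Minimal sketch: publish one BI report as an RSS 2.0 item carrying the
# PDF as a standard <enclosure>, plus category metadata that readers can
# filter on. All names and URLs are invented for illustration.
feed = RSS::Maker.make("2.0") do |maker|
  maker.channel.title       = "Weekly Sales Performance"
  maker.channel.link        = "https://bi.example.com/feeds/sales"
  maker.channel.description = "Reports published by the BI team"

  maker.items.new_item do |item|
    item.title       = "Sales performance, week ending 29 June"
    item.link        = "https://bi.example.com/reports/sales-w26"
    item.description = "Regional sales versus target; full report attached"
    item.updated     = Time.now.to_s            # publication time

    # the PDF report travels as the item's enclosure
    item.enclosure.url    = "https://bi.example.com/reports/sales-w26.pdf"
    item.enclosure.length = 48_213              # file size in bytes
    item.enclosure.type   = "application/pdf"

    # optional metadata a reader can use to filter the feed
    item.categories.new_category { |c| c.content = "Sales" }
    item.categories.new_category { |c| c.content = "Weekly" }
  end
end

puts feed  # plain XML – generate on a schedule, serve like any web page
```

Because the result is plain XML over HTTP, a scheduled job can generate it and any web server can serve it; the only client software needed is the reader the user already has.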

Good business intelligence is about supporting business users to make good business decisions – it is not a technological end in itself. Sometimes a tortoise with timely information is going to perform better overall than a hare that shoots off at the start and has to reassess further down the line – just like in Aesop’s fable.

7 thoughts on “It’s time for Slow Business Intelligence”

  1. I think this is why the repeated use of operational spreadsheet applications is still such a critical part of the business information stack. They may be slower, but they can be tailored to deliver exactly the analysis that the user wants rather than some approximation that somebody else thought they would want.

  2. David Birmingham said on LinkedIn:

    There’s extraordinary value in right-timing the delivery of the information products. I cannot count how many folks told me up front that they wanted their data processed in “real time” when they would be using it only for weekly reports, if that. Per the article, the glitter of having a rapid turnaround was good for peer-applause but not really a good idea for business.

    One international distributor, with offices near us in Dallas, spent millions of dollars to real-time-deliver all of their information sources, only to learn that none of them were truly synchronized in real time. Some of them could be synchronized daily, others weekly, others less often. Two of their systems had been previously designed to deliver in “time offsets” to keep pressure off the networks. Now these time-offsets would work against their real-time needs.

    Or did they? “Real time needs” rarely work out to be necessary in real time. If the environment delivers data on an hourly basis, and each delivery is such a small percentage difference that it doesn’t “move the needle” on the dashboards, this is just a waste of time and energy.

    Let’s also not forget that the propensity for injecting error rises with the aggressiveness of the delivery frequency. Once injected, how do we roll back bad data that was delivered three hours ago, when all subsequent data deliveries were computed against it, and now we have no choice but to back out all of it? The frenetic pace of real time might be necessary for the tiniest percentage of cases, but before going down that road, qualify the reporting and delivery needs of the delivery points.

  3. Gary Nuttall said on LinkedIn:

    In simple terms there’s usually little to gain from having a BI capability that operates at a faster pace than the business’s operating model. I’ve certainly seen cases in the retail sector where “near real-time” reporting (data available within 15 minutes of transactions occurring) was mandated for a supply chain project. Sounds great until you realise that the lead time from order placement with a supplier to delivery in store was 4–6 weeks, and their production schedule was refreshed monthly. Merchandisers/marketeers thought it was fabulous to be able to track and respond to trade within 20 minutes – without realising that nobody else in the supply chain could modify their processes to handle such a massive change in velocity (the variability being overcome instead through effective inventory management).

  4. Martijn ten Napel said on LinkedIn:

    David, the title of your piece nearly threw me off, but I am glad I read it. ‘It is time to reflect on the added value of using BI’ is more appropriate. Most BI professionals will recognise the situation you describe.

    I really like your idea of RSS as a distribution mechanism. I think there are some issues around security and targeting the right audience, but those could probably be fixed. Fundamentally, though, very good thinking!

  5. Hi Martijn – I see we both know Mark Vanhommerig. I only briefly touched on RSS in the article. For security we serve the RSS feed behind .htaccess authentication, so there is a username/password in front of it – most RSS readers support this type of authentication. We then use the username to select only the articles the user is approved to see.

    We have also made a full mapping of the RSS spec to a table structure that allows us to distribute reports as well as data quality information, including data quality graphics. This is a relatively small Ruby on Rails application that generates the RSS feed and also allows manual entries into the feed. We can also populate the underlying data structures with batch ETL jobs, which allows mass updates into the feed.

    As an example, on one project I have 10,000 data quality monitoring points that get tested every day. The data quality is categorised on a number of factors, including a RAG (Red/Amber/Green) status, which source system it came from, which functional business area it belongs to, and so on. The user can build up a feed that gives them all the items they are allowed to see (based on username) that are, for example, Red, ERP System and Finance. As a result an individual data steward sees only the 5 or 6 items each morning that they have to deal with; the rest of the noise has been effectively filtered out. The system costs little to implement as there are no product licences, and it delivers only the required actionable information while distributing it widely across the organisation. A minimal sketch of how this selection might look follows.
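    To make this concrete, below is a rough sketch of that selection as a small Rails controller. The class, model and column names (FeedsController, FeedItem, Steward, rag_status, source_system, business_area) are illustrative stand-ins rather than the real application, and the feed XML itself would come from a builder template.

    ```ruby
    # Hypothetical sketch: serve each steward only the feed items their
    # username entitles them to see, optionally narrowed by RAG status,
    # source system and business area. All names are illustrative.
    class FeedsController < ApplicationController
      before_action :authenticate

      # e.g. GET /feed.rss?rag=Red&source=ERP&area=Finance
      def show
        items = FeedItem.where(username: @username)          # entitlement
        items = items.where(rag_status: params[:rag])        if params[:rag]
        items = items.where(source_system: params[:source])  if params[:source]
        items = items.where(business_area: params[:area])    if params[:area]
        @items = items.order(published_at: :desc).limit(100)
        # a feeds/show.rss.builder template renders @items as the feed XML
      end

      private

      # HTTP Basic authentication – the username/password scheme most RSS
      # readers already support, standing in for the .htaccess layer above
      def authenticate
        authenticate_or_request_with_http_basic do |user, password|
          @username = user if Steward.authenticate(user, password)
        end
      end
    end
    ```

    Each steward’s reader then polls the same URL with their own credentials and query string, and sees only their handful of items.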

  6. Chris Barker said on LinkedIn:

    Where and how would you draw the line between SCADA/operations and BI? Is it as simple as actionable events in the context of the business role, or are there additional features that would sway this, such as measurable business metrics?

  7. Chris,

    As we push towards more real-time Business Intelligence (BI) the distinction blurs, but if I had to classify the difference it would be about the degree of processing needed before one can use the data. If an event happens on a machine and is passed to a controlling computer and acted upon, then we are clearly in SCADA/operational territory. If the event is fed into a subsequent system and combined with other data, then it is heading into the area of BI. Often the definition will be as simple as which system an organisation implements the functionality in.
