Why we founded AgileData

09 Apr 2020 | AgileData Journey, Blog

My co-founder Nigel and I have been working in the data and analytics domain for over 30 years (well I have, he is slightly younger).

We have both held multiple roles through these years, Nigel primarily in the engineering space, and I in the pre-sales, then architecture, and finally agile coaching space.

We have worked off and on together as consultants on a myriad of customer data and analytics projects.

After a number of projects together we came to the conclusion that the data and analytics market had a dirty little secret: there is a massive rate of failure for data and analytics projects.

And it is not a particularly well hidden secret, or one that only we have discovered.

In 2011 Gartner stated:

“Between 70% to 80% of corporate business intelligence projects fail, according to research by analyst firm Gartner.”

And the advent of the Big Data hype and the Data Scientist unicorn teams-of-one hasn’t made the failure statistics any better:

“In 2016, Gartner estimated that 60 percent of big data projects failed. A year later, Gartner analyst Nick Heudecker‏ said his company was “too conservative” with its 60 percent estimate and put the failure rate at closer to 85 percent. Today, he says nothing has changed.”

Even when an organisation’s data and analytics program doesn’t fail, the results it delivers are often well below the stakeholders’ expectations.

In our experience we often see a common scenario we call the 3-2-1 pattern:

3 million dollars, 2 years, 1 report

In this scenario a project to deliver a new data and analytics capability in an organisation ends up costing 3 million dollars, takes 2 years to deliver and after those 2 years results in 1 report in production. The reported “success” of the project by those involved is that a “platform” or “capability” has been built that can be used in the future to deliver data and analytics faster.

And according to Gartner, only 20% of these projects will even result in that 1 report being delivered; for the other 80% of projects, even the 1 report will not be achieved.

The 3-2-1 pattern is prevalent whether you buy off-the-shelf data and analytics software and try to implement it, cobble together a bunch of open source solutions with a dedicated data engineering team, or write something from scratch.

Nigel and I have seen projects where it is more akin to 60-4-0.

60 million dollars, 4 years, not one report, dataset or analytical model delivered in production.

So why does this happen? We think it comes down to one core theme:

Friction

Friction between the business stakeholders, the delivery team and the vendors, where each group has a different articulation of how success is defined, what needs to be done and how to do it.

Friction between the various technology components. Integrating multiple technology components is technically challenging, whether it is integrating the latest cool open source technology into your stack, trying to make an end-to-end vendor’s software components work seamlessly together (let’s face it, they have often been acquired and cobbled together on the vendor side), or making on-premises software that has been “cloud washed” actually work on AWS, Azure or GCP.

Friction between methodologies and architectures. In the deep dark past there were arguments over Kimball vs Inmon; now people argue Agile vs Waterfall, Dimensional vs Data Vault, SQL vs NoSQL, Data Lake vs Data Warehouse.

Friction between new and old patterns. The Big Data hype kicked off another round of technology-driven transformation, on the tails of the self-service visualisation wave. As a result a lot of the mature patterns we had in the data and analytics domain were abandoned, for example data governance. I am old enough to remember the days of OLAP innovation, and the disruption to these patterns that occurred then as well.

The examples above result in a massive level of uncertainty and a lack of agreed or known ways of working, which causes this friction.

An analogy would be a set of experienced chefs in a kitchen being given a whole set of new “innovative” cooking equipment and told that they could no longer use the recipes they knew worked. They would have to experiment and find new ways of cooking, while also delivering perfect meals to the customers at a fraction of the original cost.

So how do we fix this problem of Friction?

We don’t believe it is by becoming Luddites and ignoring the new approaches that have appeared; we believe those architectures, technologies, approaches and ways of working can add massive value to the data and analytics domain.

We believe we can reduce the friction by adopting two things:

Software as a Service (SaaS)

In a previous role a long long long time ago, I was a pre-sales consultant working in the finance software space. This was at the time when you had to buy separate accounts payable, accounts receivable and general ledger software and install and configure it on-premises.

In the same decade we used to build custom Customer Relationship Management (CRM) solutions.

Now we use SaaS solutions to help us perform these tasks: we use Xero, Salesforce, Jira, Office 365 and Gmail.

There are a whole lot of benefits with a SaaS solution:

  • Your team can start working straight away, with no waiting 3 to 6 months for an architecture / solution design and then an install;
  • Technology and architecture decisions have already been made;
  • The SaaS platform automagically scales up and down when you need it to;
  • You only pay for what you use;
  • When new technologies or architectures appear, the SaaS platform leverages them without you having to do the work.

Reusable proven patterns

There are a number of proven patterns that have been used in the data and analytics space over the last 20 years.

These might include technical patterns, for example how to manage changing data, or ways of working patterns, for example iteratively delivering value to stakeholders early or defining data requirements based on core business events.
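To make the first kind of pattern a little more concrete, here is a minimal, purely illustrative sketch in Python of one common way to manage changing data: keep dated versions of each record instead of overwriting them, similar in spirit to a Type 2 slowly changing dimension. The CustomerRecord and apply_change names are invented for this example and are not taken from any product or platform.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import List, Optional


@dataclass
class CustomerRecord:
    """One version of a customer, valid from effective_from until effective_to."""
    customer_id: str
    city: str
    effective_from: date
    effective_to: Optional[date] = None  # None means "current version"


def apply_change(history: List[CustomerRecord], incoming: CustomerRecord) -> List[CustomerRecord]:
    """Close off the current version and append the new one, but only if something changed."""
    current = next(
        (r for r in history if r.customer_id == incoming.customer_id and r.effective_to is None),
        None,
    )
    if current is None:
        return history + [incoming]      # first time we have seen this customer
    if current.city == incoming.city:
        return history                   # nothing changed, keep history as-is
    closed = replace(current, effective_to=incoming.effective_from)
    return [r for r in history if r is not current] + [closed, incoming]


history: List[CustomerRecord] = []
history = apply_change(history, CustomerRecord("c-1", "Wellington", date(2020, 1, 1)))
history = apply_change(history, CustomerRecord("c-1", "Auckland", date(2020, 3, 1)))
# history now holds two versions: Wellington (closed off) and Auckland (current)
```

The point of baking a pattern like this in is that nobody on the delivery team has to debate or rebuild it; the history of every record is kept automatically.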

Simply Magical Data

Nigel and I believe the combination of a SaaS data platform and baked-in reusable patterns will remove the complexity and friction involved in collecting, combining and presenting data to your users, enabling them to make better decisions using that data.

And so we founded AgileData to achieve this.