5 Reasons Why Advanced Analytics Projects Are Failing, and Potential Solutions
By Dirko Hay, CEO, StreamBurst (Pty) Ltd

As the Covid-19 pandemic continues, companies are making renewed efforts to invest in advanced analytics technologies and are commissioning projects to buffer and protect themselves against the current economic environment, and to weather potential further storms ahead. But many of these advanced analytics projects still do not deliver the ROI that they promise: many never make it into production, and many fail outright. In 2019, Andrew White from Gartner (1) predicted the following: "Through 2020, 80% of AI projects will remain alchemy, run by wizards whose talents will not scale in the organization. Through 2022, only 20% of analytic insights will deliver business outcomes."

Here are some possible reasons why these initiatives are not succeeding:

1. A Lack of a Comprehensive and Cohesive Analytics Strategy

A comprehensive and cohesive analytics strategy, one that addresses modern data drivers such as business competitiveness, maximising the return on 5G and IoT initiatives in a changing landscape, and extracting value from ongoing digitisation projects, is key to delivering optimal ROI on data assets. Sadly, many companies still pursue fragmented, siloed strategies and do not see or understand the value of an integrated, cohesive strategy that can significantly improve decision making. This is exacerbated by internal politics and a failure to agree on the key strategic analytics initiatives that can drive the company forward. The remedy is to focus on and prioritise initiatives that have executive buy-in, stem from a cohesive business strategy, and can deliver high-impact results rapidly through smaller incremental steps. This fosters a culture in which 'fail fast' is tolerated and ultimately increases the overall success rate of analytics, AI and machine learning projects.

2. Analytic Process Automation

While many vendors today talk about analytic process automation (2), very few offer the capabilities to move a company forward on the road to automated analytical processes. Automation has become a big differentiating factor in the speed of execution of analytics projects and improves a company's overall competitiveness: it increases decision-making capability, streamlines overall processes, and rapidly brings about a data-driven culture. The Pareto 80/20 principle, where 80% of effort is spent on data cleansing, blending and ETL/ELT-related processes, is still in effect in many analytics shops today. Although inroads have been made into changing this modus operandi, the ratio should be turned upside down: 20% of effort spent on cleansing and wrangling work, and the balance of 80% on actually analysing and visualising the data. This can only be achieved through end-to-end analytic process automation with software tools that allow for fast, efficient collecting, blending, wrangling, and pre-processing of data. Several software tools on the market today cater for this (2), and they are not necessarily only in IT's hands; they also empower the business with self-service capabilities.

3. Neglecting DataOps

DataOps these days is more of a buzzword than an embedded practice in the analytics domain, and failure to embrace its principles can cause challenges that a well-executed plan would have avoided. We will define DataOps as the ability to execute and monitor any data-related processes, workflows or infrastructure effectively, with speed, accuracy and automation, with minimal errors and proper test procedures in development, and with the capability to move analytical models and reporting swiftly into production for operationalisation.
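The definition above can be made concrete with a minimal sketch. The pipeline, column names and validation checks here are hypothetical illustrations (not from the article), assuming a pandas-based workflow: a cleansing step is paired with automated test procedures so that bad data fails fast before it reaches production.

```python
# Minimal DataOps-style sketch (hypothetical example): a data-preparation
# step wrapped with automated validation checks, so errors are caught
# before the output moves toward production.
import pandas as pd

def prepare_sales(raw: pd.DataFrame) -> pd.DataFrame:
    """Cleansing step: drop incomplete rows, derive a revenue column."""
    clean = raw.dropna(subset=["units", "unit_price"]).copy()
    clean["revenue"] = clean["units"] * clean["unit_price"]
    return clean

def validate(df: pd.DataFrame) -> None:
    """Automated test procedure: fail fast if the output breaks assumptions."""
    assert not df["revenue"].isna().any(), "revenue must be fully populated"
    assert (df["revenue"] >= 0).all(), "revenue must be non-negative"

raw = pd.DataFrame({
    "units": [10, None, 3],
    "unit_price": [2.5, 4.0, 1.0],
})
result = prepare_sales(raw)
validate(result)    # raises AssertionError if a check fails
print(len(result))  # prints 2: the incomplete row was dropped
```

In a real deployment these checks would run automatically inside a scheduler or CI pipeline, which is what gives DataOps its speed and accuracy guarantees.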
DataOps is becoming more important as companies scale their analytical operations and more employees become analytically minded and require access to the capabilities that advanced analytics offers. Understanding what DataOps contributes, and implementing its key principles, will become a key differentiator for optimising advanced analytics projects and the speed of execution in this domain.

4. Failure to Operationalise Data Science Models Effectively

Many data science teams are excellent at developing data science models but lack the capabilities to operationalise them quickly and effectively. In many cases, developing a model takes a fraction of the time it takes to operationalise it, mainly because teams lack the means to put together all the pieces required for operationalisation and to create real value for the company from the models. Another factor is that by the time these models are production-ready, their business value may have diminished. Software that automates model operationalisation, moving models rapidly into production and embedding them in key business processes, will expedite ROI on advanced analytics initiatives and ensure sustainability.

5. Failure to Adopt Open-Source and Other Advanced Technologies

While some companies have the appetite to move unrelentingly forward with adopting modern technologies such as streaming, real-time, event-driven databases, graph databases, and open-source technologies, others sit on the sidelines waiting to see whether these deliver value in vertical and horizontal markets. The reasons are many, but one is a fear of failure born of past early-adopter experiences. Many of these technologies are, however, not new, and have been extensively tested at companies like Facebook, Uber, Netflix, Google and some of the largest Fortune 500 companies in the world.
They deliver real-time, event-driven insights with sub-second response times, monetisation of 5G and IoT initiatives, and the ability to analyse up to trillions of records. These capabilities will become key drivers of business competitiveness over the next five years as companies look to drive costs down and increase revenue and competitiveness.

The reality today is that up to 85% of data science and machine learning models still do not make it to production, and many advanced analytics projects go up in smoke. Failure to address these causes will lead executives to stop committing the funds required to move analytics forward. There are, however, positive signs that this trend is changing, and Covid-19 has rapidly accelerated improvement and innovation initiatives in this domain.

References
1. Andrew White, "Our Top Data and Analytics Predicts for 2019", Gartner Blog Network, 2019.
2. Alteryx, "What is Analytic Process Automation?"