Why do "Self-Service BI" projects fail?
We are living in the 21st century, where our most important assets are time and information. These are the commodities that outweigh all other variables in value creation. The last decade has seen the emergence of disruptive tools that have made self-service BI possible and proven their value by delivering strong ROI and lower TCO. Self-service BI tools have significantly reduced the time needed to complete a BI project. Now the end users are in the driver's seat, leveraging self-service BI tools like Tableau to create meaningful visualizations and share them with the relevant stakeholders.
Over the last couple of years, we have observed a trend of companies migrating from traditional BI to self-service BI. Self-service BI is intuitive and disruptive to the extent that it has become a game changer. It has changed the way projects are managed and drastically reduced the involvement of IT. But though it has been a boon, it is no fairy tale. Adopting self-service BI has exposed many shortcomings, and it has not been an easy road. According to a report by Gartner, 70 to 80 percent of BI projects fail, and many of these are self-service BI projects.
Let us look at the reasons behind the failure of these self-service BI projects and how they can be avoided:
Data is the Oil
As per an article published in May 2016 on the Northeastern University blog, around 2.5 exabytes of data are produced every day, and the volume doubles roughly every three years. Companies often focus so much on BI and analytics that they miss the most obvious entity: the data itself. Many self-service tools are advertised as if you can start BI immediately and realize ROI right away. The ground reality is very different: data plays a crucial role, and the shape of the data defines the success or failure of a BI project. Tools can only provide meaningful information when the data is of good quality. Many BI projects fail because not enough emphasis is given to data before the project starts. It makes sense to take a step back before starting a project and check data integrity and data quality.

Self-service BI tools do give us the additional advantage of reporting right away, as they do not take weeks or months for installation and setup; we can start creating visualizations and reports in a matter of hours or days. However, I have seen many cases where IT teams connect directly to transactional systems for reporting and then struggle with database downtime and BI performance issues. This approach is not recommended. The recommended way is to de-normalize the data and create data marts or a data warehouse, which gives better performance and better data governance.
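The idea of landing transactional data in a de-normalized mart can be sketched with an in-memory SQLite database. This is a minimal illustration, not a production pattern; the table and column names (customers, products, orders, sales_mart) are hypothetical, not taken from any specific project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Hypothetical normalized transactional schema
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE products  (id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER,
                        product_id INTEGER, qty INTEGER, amount REAL);

INSERT INTO customers VALUES (1, 'Acme Corp', 'East'), (2, 'Globex', 'West');
INSERT INTO products  VALUES (1, 'Widget', 'Hardware'), (2, 'License', 'Software');
INSERT INTO orders    VALUES (1, 1, 1, 10, 500.0), (2, 2, 2, 1, 1200.0);

-- A de-normalized "sales mart": the joins are paid once, at load time,
-- so the BI tool queries one wide table instead of the live transactional schema.
CREATE TABLE sales_mart AS
SELECT o.id AS order_id, c.name AS customer, c.region,
       p.name AS product, p.category, o.qty, o.amount
FROM orders o
JOIN customers c ON c.id = o.customer_id
JOIN products  p ON p.id = o.product_id;
""")

# The dashboard layer now reads flat, pre-joined rows.
rows = cur.execute(
    "SELECT customer, region, product, amount FROM sales_mart"
).fetchall()
print(rows)
```

In a real project the mart would be refreshed on a schedule by an ETL job rather than rebuilt inline, but the principle is the same: reporting queries never touch the transactional system directly.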
User Adoption
With tools like Tableau, Qlikview, Power BI etc., self-service BI has become a reality. End users no longer have to depend on IT for their reporting and analytics needs. Yet the main challenge remains: user adoption. In a conglomerate, one division often does not know what tools are being used in another, and one department is unaware of the tool used by the next. This creates siloed efforts that do not translate into an actual ROI; the business impact gets lost, and stakeholders do not see the real value-add of self-service BI. The key here is to advocate and collaborate with end users, departments and divisions to bring everyone onto the same page and create a data literacy culture. Emphasis should be placed on helping users adopt new self-service BI tools, providing them training and encouraging them to use the tools. By doing this, end users and stakeholders will see the benefits of leaving legacy tools like MS Excel and flat files and moving to disruptive tools like Tableau. According to a survey by Wizdee in 2016, 71% of respondents say self-service BI tools have accelerated their learning and responsiveness.
Agile over Waterfall
Gone are the days when the waterfall model was good enough for implementing BI projects. With new challenges and a versatile IT environment, the need of the hour is an agile approach. Visualization and data management have to go hand in hand. Typically, the BI team sits idle until the data warehouse is finished, and only then creates reports and dashboards. Going this way is tricky: teams often discover that they have missed some data sets or that the requirements have changed. This leads to changes in business KPIs, and the team has to wait until these changes are incorporated into the data model. The whole process can be a challenge and sometimes a never-ending story. As a result, BI projects often take multiple years to complete, with elongated timelines and an inflated TCO, defeating the purpose of the BI initiative. I have seen many companies jump to self-service BI, take the traditional SDLC route and get stuck at a later stage. Companies should adopt an agile framework to realize ROI from the start.
To summarize, self-service BI is here to stay, and it makes a lot of sense to align your organization by implementing self-service BI best practices. The key is to find the right balance between data governance and giving end users the freedom to explore. There has been an upheaval in the BI industry, with many self-service BI tools coming onto the market. It makes sense to be prudent in selecting a tool, as no tool is 100% perfect; the right tool can deliver a good ROI and a great use case.
Self-service BI will not only thrive but will also provide real value if implemented properly.
About the author
Romit is a Senior Consultant at USEReady. With 7+ years of techno-functional experience across the financial, insurance, healthcare, retail, education and energy sectors, Romit is a pro at conducting root cause analysis of business problems to assist critical decision making. Visit Romit on LinkedIn