A Thought On The Disruptive Innovation Trap…

In the search to constantly innovate and try to "disrupt" the market, we often lose focus on the basics, where lots of value may still be untapped, or we provide a product, service or journey that we think will be beneficial simply because it has a bit of technology, AI or analytics behind it. Letting technology lead these ambitions is often the fallacy that proves to be the undoing.

Disruptive innovation stems from the entrepreneurial domain and refers to innovation that breaks down the barriers of complexity and price, making an offering accessible to the broader market. We often loosely use "innovation" to refer to anything new that is attempted, such as enhancing a product or process. In the book "The Innovator's Dilemma", we are presented with two types of innovation that companies deal with. Sustaining innovation allows companies to incrementally improve operations to remain competitive and drive business as usual. In contrast, disruptive innovation uncovers new categories of customers by harnessing technology. While both types may be good for the prospects of a company, the difference between the two is important. Each scenario requires a unique strategic approach, and recognising which one you are in is essential to realising the associated benefits. It is also important in determining how we react to competitors in our market. For example, disruption takes time, which can cause competitors to go unnoticed until their innovations are revealed to the market.

Consider the depiction of the theory of disruptive innovation in Figure 1. Research suggests that companies generally focus on sustaining innovations in a few well-established value areas (1). This creates a gap of unfulfilled needs (2) that develops over time for a subset of customers (and can also evolve into widespread needs). This creates the opportunity for competitors to disrupt (3), generally with business models that are agile enough to seize the opportunity while incumbents may not be able to shift fast enough to counter. The result (4) is the disruption.

Figure 1: Theory of Disruptive Innovation (Source: MIT Sloan Management Review)

While this theory doesn't capture the realities of sustaining innovation completely, it highlights the omnipresent threat of developing products or services devoid of customer needs and expectations in the pursuit of technological advancement or improvement. The Customer Experience (CX) that you intend to drive develops from customer needs – it is a pivotal element in both types of innovation. This provides a steer that will allow you to target unmet customer needs and sustain innovation without overlooking key existing customer needs. Starting with your CX in mind allows you to avoid costly technology decisions that may add little value in the future.

Data has a pivotal role to play in this process: in order to understand your CX, you need to be able to measure it in a way that makes sense. While legacy measures such as Customer Satisfaction Scores (CSAT) and Net Promoter Scores (NPS) can tell a small part of your CX story, the important parts – what your customer doesn't tell you – largely go unnoticed until they become an issue. This is critical, as customers are no longer comparing CX within industries but across them. For example, the ease of signing up for a streaming service is being compared to that of opening a bank account. The mismatch in the unfulfilled expectation opens the door for competitors.
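As a rough illustration of what these legacy measures do (and do not) capture, the sketch below computes NPS and a simple CSAT percentage from raw survey scores. It is a minimal example only, assuming a standard 0–10 NPS question and a 1–5 satisfaction question; the data, thresholds and function names are illustrative, not a prescribed methodology.

```python
from typing import Iterable

def nps(scores: Iterable[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    scores = list(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def csat(ratings: Iterable[int], satisfied_threshold: int = 4) -> float:
    """CSAT as the percentage of respondents rating at or above the threshold on a 1-5 scale."""
    ratings = list(ratings)
    satisfied = sum(1 for r in ratings if r >= satisfied_threshold)
    return 100.0 * satisfied / len(ratings)

# Hypothetical survey responses
nps_responses = [10, 9, 8, 7, 7, 6, 5, 9, 10, 3]
csat_responses = [5, 4, 4, 3, 5, 2, 4, 5]

print(f"NPS: {nps(nps_responses):.0f}")      # one number summarising advocacy
print(f"CSAT: {csat(csat_responses):.0f}%")  # one number summarising satisfaction
```

Both measures collapse the experience into a single number gathered only from customers who choose to respond; nothing in either metric surfaces the silent friction points described above.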
Using the understanding gained from designing your CX journeys, you are able to pinpoint what is important to measure to ensure each handover is successful, without actually asking your customer. Companies that have accomplished this have outperformed in their respective markets and have higher success rates when launching new products, as they are geared toward customer needs rather than chasing an innovation milestone.

About the Author
Dr. Durrel Ramrathan
Senior Analytics & AI Leader, MultiChoice

Adding “A P” to the Marketing Mix

One would think that running a business is easy – just maximise revenue and minimise costs! However, this simple formula is incomplete without the essence of what makes your business unique – it's not just about the product, but also about the price, the place and the positioning. These "4 P's" trace back to Neil Borden's 1964 articulation of the marketing mix, illustrating how companies can use marketing and advertising to generate revenue. We add two more letters to this mix – "A" for Analytics and "P" for People. Holistically, this new mix centres on customer experience, ensuring that every activity of the business is firmly centred around the customer and underpinned by data and analytics.

Product and Customer Experience

Products (or solutions in a B2B context) can be thought of as solutions to a problem. Do you know what problems your customers face? Is your solution the right fit to solve that problem? By solving a problem, you not only meet a basic need but also open up avenues to prevent the problem from recurring in the future – this value-enhancing activity attracts customers to your brand and keeps them there. This is where most data scientists spend their time – building models that predict customer take-up rates and churn rates. A question I will leave you with: if all companies are building these models, what's next?

Price and Customer Experience

Pricing is tricky – it has moved away from the cold mathematics of finding the optimal monetary amount where your revenue exceeds your cost, because customers want to get value from what they purchase. Prices are simply the intersection of supply and demand, and unfortunately can have an extreme impact on perceived value. For example, a bottle of water at either R1 or R100 would raise eyebrows (for different reasons), indicating that people intuitively place a value on the goods and services that they purchase. Value is also driven by quality – one would expect to pay a higher price for a higher-quality product or service. At the very least, you should be able to price your products and services at the point where you have a positive margin, while still maintaining a competitive edge. However, the business that will succeed combines this optimisation with what the customer expects and perceives. Do customers feel that they get value for money after purchasing from you? This value doesn't only have to be in the price – a simple "thank you", confirmation of a good purchase decision or tips on how to use the product or service effectively go a long way in creating value. Data drawn from research generate valuable insights for "segmenting" your customers into smaller, more tangible groups, after which you can apply different treatment strategies to ensure that each group performs optimally (a minimal illustration follows below).
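To make the segmentation idea concrete, here is a minimal sketch of grouping customers with scikit-learn. The features (recency, frequency, monetary value) and the choice of four segments are illustrative assumptions, not a recommendation from the article.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical customer features: recency (days), frequency (purchases), monetary value (Rand)
customers = np.array([
    [10,  25, 5400],
    [200,  2,  150],
    [35,  12, 2300],
    [5,   40, 9100],
    [180,  1,   90],
    [60,   8, 1200],
])

# Scale features so no single unit dominates the distance metric
scaled = StandardScaler().fit_transform(customers)

# Group customers into four segments (the number of segments is an assumption)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
segments = kmeans.fit_predict(scaled)

for customer, segment in zip(customers, segments):
    print(f"recency={customer[0]:>3}  frequency={customer[1]:>2}  value=R{customer[2]:>5}  -> segment {segment}")
```

Each segment can then receive its own treatment strategy, whether that is pricing, messaging or retention offers, rather than a one-size-fits-all approach.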
Place and Customer Experience

As we have learnt with lockdown, businesses limited by geographical location face dire consequences if their clientele cannot reach them. This was actually an opportunity to innovate, by assessing whether your products and services can generate the same appeal virtually. Naturally, this is a trial-and-error experiment for many, as some customers may simply not be digitally inclined, whereas some products can "never" be sold online. This rise in the digitalisation of businesses also generated significantly more data for use. The use of digital and geographic data is currently on the rise, as more businesses see the value of placement in offering enhanced interactions with their customers. Imagine a world where your route in the mall is "laid out" for you based on your previous purchase history, hobbies and interests.

Promotion and Customer Experience

Marketing, in a post-POPIA world, is no longer about spamming potential customers. It's about gaining trust with your existing customers by only communicating what is relevant and timely. For prospective customers, your brand and corresponding advertising also need to be targeted at who you wish to attract. To determine your existing and potential clientele, it helps not only to segment your customers but also to combine their psychographic data (values, needs, wants, aspirations) with their interactions and purchasing history.

People and Customer Experience

It is naturally important to focus on delivering on what the customer wants, but an often overlooked factor is ensuring that you have the right people to execute on that promise. Irrespective of the size of your business, your business is unique – it has a DNA, a culture, values and principles that differ from any other. Finding others who are aligned to this experience is critical to ensure success – it never ends well to hire a salesperson who hates talking to people! How do you know you have the right people? While interviews and personality traits can inform you of who to choose, you as the entrepreneur must know what your business's DNA looks like.

Getting these 5 P's right is by itself not a winning recipe. However, by focusing on putting your customers (and employees) first, it can certainly get you into the competition. How far you progress through it depends on how capable and changeable you are.

About the Author
Professor Yudhvir Seetharam
Head of Analytics: Insights and Research, FNB

Vanity, Sanity, Reality: What’s Really Important for Your Data Strategy

By Jason Foster – Founder & Chief Executive, Cynozure Group (USA)

Revenue = Vanity

One of the biggest mistakes is to focus on your company's top line: revenue. Oh, the allure of revenue! It's easy to focus on the revenue number, as it's the biggest one on the P&L and it hasn't yet been reduced by the various business costs and expenses. It's an impressive-sounding number too: "we've grown our revenue by x% this year" can sound great. It's also relatively easy to understand, as it's money into the business, it's transactional, so there are minimal calculations to do. It is, however, a vanity metric because it doesn't come with any context: how it compares to previous years, how it compares to the market, and most importantly, whether it generated any profit for the business. If you made £1m but it cost you £1.5m to do so, then you've made a loss and have no money to reinvest in the business to drive growth. Revenue growth is of course important, as it's the way you will create more value in the business in the future, assuming subsequent profits.

Profit = Sanity

To that end the saying continues: profit is sanity. Profit is essentially what is left once you've subtracted all the various costs and expenses of the business. Profit is a much better indicator of the health of a business, and is more commonly used to gauge the valuation of a business. It's a much better indicator of financial performance. Generating profits is what keeps you sane. The problem with profit, though, is that it isn't actually real until it's turned into actual money. On paper you might have sold £1m of services and products, and it may have cost you £300k to deliver that revenue, leaving you £700k of profit on paper. However, the actual receipt of that money is what's important. Depending on payment terms and when money actually arrives, that profit is just on paper.

Cash = Reality

That's where this saying ends: cash is reality. Actual physical money in the bank, and more importantly the flow of real cash into the business (not on-paper revenue or profits), is what really makes or breaks a business. The money is there, you can see it, you can spend it, you can pay your staff, you can buy more stock, you can invest in growth. It's the peace of mind required to keep a business grounded in reality. A positive cash flow is a really powerful indicator of a healthy business, as it shows that you can generate revenue, collect the money, pay your costs and expenses and still be left with money. Cash is the lifeblood of the business.
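A minimal sketch of the revenue/profit/cash distinction, using the £1m / £300k example above. The even monthly invoicing and 60-day payment terms are illustrative assumptions; the point is simply that paper profit and cash in the bank diverge.

```python
# Hypothetical year: £1m invoiced evenly over 12 months, £300k of costs paid as you go,
# customers settle invoices on 60-day terms (i.e. two months after invoicing).
revenue_invoiced_per_month = 1_000_000 / 12
costs_paid_per_month = 300_000 / 12
payment_terms_months = 2

cash_balance = 0.0
for month in range(1, 13):
    # Costs leave the bank immediately
    cash_balance -= costs_paid_per_month
    # Revenue only arrives once the payment terms have elapsed
    if month > payment_terms_months:
        cash_balance += revenue_invoiced_per_month

paper_profit = 1_000_000 - 300_000
print(f"Paper profit for the year: £{paper_profit:,.0f}")
print(f"Cash in the bank at year end: £{cash_balance:,.0f}")
```

Under these assumptions, two months of invoiced revenue is still sitting with debtors at year end, so the business holds roughly £167k less cash than the profit line suggests, which is exactly the gap the saying warns about.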
I've applied the same thinking to the world of data, analytics and artificial intelligence to help us better understand what's really important, and what the reality is. So it goes like this…

AI = Vanity

Like revenue, there is an allure surrounding Artificial Intelligence (or any other latest technology solution). It's the big one. It's what people talk about and crave because it sounds impressive: "yes, we applied AI to that problem", "we've bought a platform that's AI driven", "our software is an AI-driven platform for fixing the business". Being at the forefront of innovation can be very important for you or your business, and that innovation in and of itself can support your ambitions. It might attract funding, budgets, great talent. And whilst that is great, in those instances you are essentially using it as a vanity tool. Vanity works in the same way that making revenue works; but like revenue, AI on its own is only part of the picture and needs context.

Data = Sanity

Continuing the analogy, data is therefore sanity. More accurately, good, clean, trusted data is sanity. Without this, your AI, analytics and insights are nothing. The lifeblood of your organisation is data. It's the horizontal that cuts across the end-to-end value chain of your business. It gets created in every transaction and interaction that your customers, employees, partners and stakeholders have with you. Well-captured, consolidated, modelled and activated data is what allows the value and benefits of having that data to be realised, and plugged into artificially intelligent systems to drive benefit. Data, when done right, is the sanity of your business. It allows you to manage processes, operate effectively, interact with customers and assess financial performance (remember revenue, profit, cash?). It has the power to transform the business. Looking after it well ensures you don't break regulatory or other legal obligations; knowing you're on top of data will keep you sane for sure. But here's the thing: data is nothing, really. Like profit, it doesn't give you anything until it's turned into value. You could have the best data, the best data pipelines, platform and people to manage and model it, but without actions being taken or change being made, it's just data sitting in systems.

Business Outcomes = Reality

That's where my saying ends: business outcomes are reality. Actual business improvement through the application of data, analytics and AI. Data products that guide decision making, and people making quicker decisions – more right, more often – to improve revenue, profit, cash, customer service, business operations, environmental outcomes and societal improvements. That's reality. And it's not always positive – I can find plenty of people who are impressed with the AI created and the data held by Facebook, but I can find as many (if not more) people who judge the outcomes of that ecosystem at scale to be detrimental, if not disastrous, for the world.

To summarise: what happens as an outcome of your data strategy is what really matters and what really drives improvement; and, like cash, it really is the lifeblood of the business.

Listen to the podcast here.

BI in the Moment

By Michiel van Staden, Data Analytics Lead, ABSA

Leadership spend most of their day in meetings, making decisions. Operations are mostly busy doing operations, taking decisions. In the spaces between, there is hopefully some time for all to engage the relevant colleagues on communication platforms and make decisions. When running, we run. Every now and then we might pause to catch our breath and engage those around us. Whilst running, there are apps that feed us relevant info on our progress as we go and, on pausing, give us just what we need to decide on the way forward. Coming from 13 years of data analytics experience spanning fraud prevention, credit risk and operations to digital and marketing, I'd like to talk about practical ways to feed decision-making in workplace meetings, operations, and the spaces between, with relevant information.

Reports Coming From Systems

Within our organisation, we have many different systems collecting information. Some of these have been developed to engage potential new customers, others to process applications. There are systems to manage accounts whilst the relationship is in good standing, and still others for when that relationship goes through challenges. Based on the relevant function, each one of these processes generates datasets that are then stored accordingly. Much of this now resides on the same centralised data warehouse, but there are still nuances in terms of dataset-specific access and formatting. Often this dictates the way reports are developed. Datasets relating to applications would, for example, form the basis for a sales report, often taking shape around the pieces of information that happen to have been stored. In developing this report, the temptation is often great to include as much information as is available, some of which might not even be accurately captured or well understood. From this position, getting the business to actively access and use these reports is an uphill battle. Even for those very close to the relevant system process, field names as captured in the data can be totally unfamiliar, whilst in parallel they are still struggling with navigating the reporting tool and trying to make sense of the overwhelming amount of information. Already being pressed for time, this does not bode well for adoption, but with time it is possible to get there. The demand is there to track business performance, and thus leadership is forced, in a sense, to make the reports work.

As stakeholders continue to engage with the reports, questions emerge around the fringes – for example, what role prospecting to potential new customers played in sales, and what happened with the relationship post sale. At this stage there might very well already be prospecting and existing-customer relationship reports. In some cases, the data specialist responsible for sales reporting might also be close to those, but more probably not. Whether to merge these reports or keep them running in parallel, with the increasing likelihood of overlaps as they grow, can be a very complicated problem to solve. Can the different data sources be compiled practically into one report? Are the other reports being used? Will the overlapping numbers be consistent or conflicting? When not handled well, you can easily end up with a myriad of reports containing duplication and inconsistencies, while becoming increasingly too large, complex and unwieldy to be of any practical use.
Add to this the ad hoc requests for specific pieces of information increasingly landing in data specialists' laps because stakeholders are not able to find the information effectively themselves.

The Business Does Not Know What it Needs

When you do ask the business what intelligence they need, they are not able to tell you. Not having a view of what is possible and available in terms of reporting makes it very difficult to devise specific requirements for what the business will need and be able to practically use. As data specialists, our first task is to get to know the data on offer very well. We have to figure out how to access the data coming from different systems, ask the questions needed to understand exactly what each piece of information means, and ultimately become very comfortable in weaving data from these different sources together into an end-to-end picture. Additionally, it is also key to understand the business. What is the business strategy? How does it make its money? What are the key processes? What does the business offer its customers? And lastly, take time to listen to and understand your audience. What does their typical day look like? What challenges are they facing? How do they make decisions? What are their thoughts around data? How comfortable are they in working with data? Only once you've got a very clear understanding of all of these components can you engage your stakeholders very practically, giving them the business intelligence they need to do their jobs better and make informed decisions.

BI in the Moment

Knowing exactly what data is available and which processes are core to the specific business area, sit with your key audience to understand what the key meetings are in their schedules, and unpack what information would better enable them to make the critical decisions in those sessions. Give them what they need. Also spend time in the various operational functions. Unpack what systems they are using and how they make decisions within those processes on a day-to-day basis. Give them what they need.

Understand your data.
Understand your business.
Understand your audience.
Give them what they need.

In the spaces between, there might still be additional, maybe slightly more generic information that could give your audience the latest on business progress or performance, informing their decision-making more holistically. Actively work with your stakeholders towards narrowing down a handful of key metrics, giving them what they need at a glance, in an email or messaging body (a minimal sketch follows at the end of this piece). Clicking through to reports on a daily basis quickly becomes very tedious and discouraging. Before adding any new metrics or views to a dashboard, review all the existing ones. Are they still relevant and value adding? Or is the report starting to slide into becoming less valuable? Then something has got to give. Nobody wants to keep generating reports nobody uses, and nobody wants to be inundated with reports they cannot use. Give them what they need. BI in the moment.
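As a minimal sketch of the "handful of key metrics in an email" idea above, the snippet below pulls a few numbers and sends them as a short daily summary. The database file, table, column names, SMTP server and recipients are all hypothetical placeholders; the point is the shape of the habit, not a specific stack.

```python
import smtplib
import sqlite3
from email.message import EmailMessage

# Hypothetical metrics query against a reporting database (table and column names are placeholders)
conn = sqlite3.connect("reporting.db")
sales_today, applications_today, complaints_open = conn.execute(
    """
    SELECT
        SUM(CASE WHEN event = 'sale' THEN 1 ELSE 0 END),
        SUM(CASE WHEN event = 'application' THEN 1 ELSE 0 END),
        SUM(CASE WHEN event = 'complaint_open' THEN 1 ELSE 0 END)
    FROM daily_events
    WHERE event_date = DATE('now')
    """
).fetchone()
conn.close()

# A handful of metrics, at a glance, in the body of an email
msg = EmailMessage()
msg["Subject"] = "Daily snapshot"
msg["From"] = "bi@example.com"        # placeholder sender
msg["To"] = "leadership@example.com"  # placeholder recipients
msg.set_content(
    f"Sales today: {sales_today}\n"
    f"Applications today: {applications_today}\n"
    f"Open complaints: {complaints_open}\n"
)

with smtplib.SMTP("mail.example.com") as smtp:  # placeholder SMTP server
    smtp.send_message(msg)
```

The same three numbers could just as easily land in a messaging channel; what matters is that stakeholders get the few metrics they actually use without having to open a report.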

5 Reasons Why Advanced Analytics Projects are Failing, and Potential Solutions

By Dirko Hay, CEO, StreamBurst (Pty) Ltd

As the Covid-19 pandemic continues, companies are making renewed efforts to invest huge amounts in advanced analytics technologies and are commissioning projects to buffer and protect themselves against the current economic environment, and to weather potential further storms ahead. But many of these advanced analytics projects are still not delivering the ROI that they promise. Many do not end up in production, and many fail outright. In 2019, Andrew White from Gartner (1) predicted the following: "Through 2020, 80% of AI projects will remain alchemy, run by wizards whose talents will not scale in the organization. Through 2022, only 20% of analytic insights will deliver business outcomes." Here are some possible reasons why some of these initiatives are not successful:

1. A Lack of a Comprehensive and Cohesive Analytics Strategy

Having a comprehensive and cohesive analytics strategy to address modern-day data drivers like business competitiveness, maximizing return on 5G and IoT initiatives in a changing landscape, and getting value from continuous digitization projects is key to delivering optimal ROI on data assets. Sadly, a large number of companies still have fragmented, siloed strategies to address this challenge and do not see or understand the value of an integrated, cohesive strategy that can significantly improve decision making. This is exacerbated by internal politics and failure to agree on key strategic analytical initiatives that can drive the company forward. The key here is to focus on and prioritize initiatives that have executive buy-in and stem from a cohesive business strategy, and that can deliver high-impact, rapid results through smaller incremental steps. This will allow a culture where 'fail fast' can be tolerated and will ultimately increase the overall success rate of analytics, AI and machine learning projects.

2. Analytic Process Automation

While many vendors today are talking about analytic process automation (2), very few offer the capabilities to move a company forward on the road to automated analytical processes. This has become a big differentiating factor in the speed of execution of analytics projects and improves the overall competitiveness of a company, as it increases decision-making capability, streamlines overall processes, and rapidly brings about a data-driven culture. The Pareto 80/20 principle, where 80% of effort is still spent on data cleansing, blending and ETL/ELT-related processes, is still in effect in many analytics shops today. Although inroads have been made into changing this modus operandi, the key here is that this should be turned upside down: 20% of effort should be spent on cleansing and wrangling work and the balance of 80% on actually analysing and visualising the data. This can only be achieved through end-to-end analytic automation processes, with software tools that allow for fast, efficient blending, wrangling, collecting and pre-processing of data. There are several software tools available on the market today that cater for this (2), and they are not necessarily only in IT hands, but also empower the business with self-service capabilities. (A minimal sketch of such a reusable data-preparation step follows below.)
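A minimal sketch of what a reusable, automated data-preparation step might look like in pandas: blending two sources, standardising fields and flagging quality issues in one repeatable function rather than in ad hoc manual wrangling. The file names and columns are hypothetical, and this is illustrative only; the tools referenced in the article offer far richer, end-to-end automation.

```python
import pandas as pd

def prepare_sales_data(transactions_path: str, customers_path: str) -> pd.DataFrame:
    """One repeatable cleansing/blending step: the kind of work to automate, not redo by hand."""
    transactions = pd.read_csv(transactions_path, parse_dates=["transaction_date"])
    customers = pd.read_csv(customers_path)

    # Standardise keys and text fields before blending
    transactions["customer_id"] = transactions["customer_id"].astype(str).str.strip()
    customers["customer_id"] = customers["customer_id"].astype(str).str.strip()
    customers["region"] = customers["region"].str.title()

    # Blend the two sources and drop obviously bad records
    blended = transactions.merge(customers, on="customer_id", how="left")
    blended = blended.dropna(subset=["amount"])
    blended = blended[blended["amount"] > 0]

    # Flag remaining quality issues instead of silently ignoring them
    blended["missing_region"] = blended["region"].isna()
    return blended

# Example run against hypothetical extracts:
# clean = prepare_sales_data("transactions.csv", "customers.csv")
# print(clean.groupby("region")["amount"].sum())
```

Scheduling a function like this (or its equivalent in a dedicated tool) is what shifts the 80/20 split: the wrangling runs unattended, and analysts spend their time on the analysis and visualisation.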
3. Neglecting DataOps

DataOps these days is more of a buzzword than an embedded practice in the analytics domain, and failure to embrace its inherent principles might cause challenges that could have been avoided with a well-executed plan. We will define DataOps as the ability to execute and monitor any data-related processes, workflows or infrastructure effectively, with speed, accuracy and automation, with minimal errors, ensuring proper test procedures in development, and with the capability to move analytical models and reporting swiftly into production for operationalisation. DataOps is becoming more important as companies scale their analytical operations. More employees become analytically minded and require access to the capabilities that advanced analytics offers. Understanding what DataOps contributes and implementing its key principles will in future become a key differentiator for optimising advanced analytical projects and the speed of execution in this domain.

4. Failure to Operationalise Data Science Models Effectively

Many data science teams are excellent at developing data science models but lack the capabilities to operationalise them quickly and effectively. In many cases, the development of models takes a fraction of the time it takes to operationalise them, and this mainly stems from the lack of capabilities to put together all the pieces required for operationalisation and to create real value for the company from the models. Another factor is that by the time these models are production ready, the business value might have diminished. Having software to automate model operationalisation rapidly into production, and to embed the models into key business processes, will expedite ROI on advanced analytics initiatives and ensure sustainability.

5. Failure to Adopt Open-Source and Other Advanced Technologies

While some companies have the appetite to move unrelentingly forward in the pressing need to adopt modern technologies like streaming, real-time, event-driven databases, graph databases and open-source technologies, others are sitting on the sidelines and waiting to see if they deliver value in their vertical or horizontal markets. The reasons for this are many, but one is a fear of failure born from being an early adopter in the past. Many of these technologies are, however, not new, and have been extensively tested in companies like Facebook, Uber, Netflix, Google and some of the largest Fortune 500 companies in the world. They deliver real-time, event-driven insights with sub-second response times, monetization of 5G and IoT initiatives, and the ability to analyse up to trillions of records. These capabilities will become key drivers of business competitiveness in the next five years as companies look to drive costs down and increase revenue and competitiveness.

The reality today is that up to 85% of data science and machine learning models still do not make it to production, and many advanced analytical projects go up in smoke. Failure to address these reasons for lack of success will cause executives to stop spending the funds required to move analytics forward. There are, however, positive signs that this trend is changing, and Covid-19 has rapidly increased improvement and innovation initiatives in this domain.

References
(1) Andrew White, "Our Top Data and Analytics Predicts for 2019", Gartner Blog Network.
(2) Alteryx, "What is Analytic Process Automation?"
