BAD BUSINESS ANALYTICS MAY BE HAZARDOUS TO YOUR WEALTH.
Let's take a few real-life examples:
- A project to improve sales forecasting where the accuracy of the forecast was not measured either before or after the project.
- A project to maximize trailer loading (get more tonnage into freight trailers) with such a bad optimization model that it missed most of the opportunity.
- A system to improve On-Shelf-Availability (the % of product actually on shelf in grocery stores) built entirely from arbitrary rules with no measurement, at all, of...On Shelf Availability. Check out my post Point of Sale Data – Supply Chain Analytics for more details on On Shelf Availability.
- Statistical inventory models to identify how much inventory you really need built entirely without statistics. (Managing hundreds of $millions in inventory value)
- Countless Excel models that calculate nothing of value.
I could go on.
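To give a sense of what "statistics" means in the inventory example above: even the simplest statistical safety-stock model rests on a service level, demand variability, and lead time. The sketch below is illustrative only, assuming independent and roughly normal demand per period; it is not the model from that project, and the parameter names are my own.

```python
# A minimal statistical safety-stock sketch, assuming demand is independent
# and roughly normal per period. Illustrative only -- not the actual model
# from the project described in the post.
from statistics import NormalDist
import math

def safety_stock(service_level, demand_std_per_period, lead_time_periods):
    """Safety stock = z * sigma_demand * sqrt(lead time in periods)."""
    z = NormalDist().inv_cdf(service_level)  # e.g. ~1.645 for 95% service
    return z * demand_std_per_period * math.sqrt(lead_time_periods)

# With a 95% service level, per-period demand std of 10 units, and a
# 4-period lead time, safety stock comes out to roughly 33 units.
```

Building "statistical" inventory targets without some version of this arithmetic is exactly the failure mode described above.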
In many cases the issue is that the people assigned to the task do not have the skills to wield the tools they need. The trailer loading project listed above was developed without any real understanding of how to build an optimization model. The developer had found an extended version of Excel's "Solver" tool on the internet (a good small- to medium-scale optimizer from Frontline Systems). Unfortunately, the Excel model was bad enough that Solver could only find the optimal solution to the wrong question: the model ran without throwing an error; it was a small improvement on what went before; the results were implemented; and the opportunity to do it right (worth $millions) was lost for a few years.
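The post doesn't show the actual trailer model, but at its simplest, "get more tonnage into a trailer" is a knapsack-style optimization, and the gap between a proper formulation and an ad hoc rule is easy to demonstrate. The sketch below is a hypothetical minimal version (shipment weights and capacity are made up), comparing an exact formulation against a greedy "largest first" rule of the kind a weak model might encode:

```python
# Trailer loading as a 0/1 knapsack: choose shipments to maximize tonnage
# without exceeding the trailer's weight capacity. Hypothetical minimal
# formulation -- not the model from the project described in the post.

def load_trailer(weights, capacity):
    """Max total weight loadable within capacity (exact, via dynamic programming)."""
    best = [0] * (capacity + 1)  # best[c] = max tonnage achievable with capacity c
    for w in weights:
        for c in range(capacity, w - 1, -1):  # descend so each shipment is used once
            best[c] = max(best[c], best[c - w] + w)
    return best[capacity]

def greedy_load(weights, capacity):
    """Ad hoc 'largest shipment first' rule -- can leave capacity on the table."""
    total = 0
    for w in sorted(weights, reverse=True):
        if total + w <= capacity:
            total += w
    return total

# With shipments of 6, 5 and 5 tons and a 10-ton trailer, greedy loads only
# 6 tons (the 6-ton shipment blocks both 5s), while the exact model loads 10.
```

Solver can find the optimum of whatever model you hand it; if the model encodes the greedy logic instead of the real question, the "optimal" answer still leaves money on the table.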
In other cases, and I saw a new one just this last week, the software tools leave out the diagnostics you need to tell whether the model is any good. Predictive Analytics tools packaged for business use (like price/promotion modeling packages and sales forecasting tools) tend to do this. I can only assume it is meant to avoid confusing the user.
Before you use any tool's output to make critical decisions, someone with good modeling skills (perhaps your Primary Analytical Practitioner) needs to check that your models are sound.
As my father taught me: “If a job is worth doing, it's worth doing well.” How can that not be true when your financial results depend on getting it right?