Monday, March 31, 2008

The First Certainty Principle: C ~ 1/K; Certainty is inversely proportional to knowledge.
A person who really understands data and analysis will understand all the pitfalls and limitations, and hence be constantly caveating what they say. Somebody who is simple, straightforward, and 100% certain usually has no idea what they are talking about.
The Second Certainty Principle: A ~ C; The attractiveness of results is directly proportional to the certainty of the presenters.
Decision-makers are attracted to certainty. Decision-makers usually have no understanding of the intricacies of data mining. What they often need is simply someone to tell them what they should do.
Note that #1 and #2 together cause a lot of problems.
The Time-Value Law: V ~ 1/P; The value of analysis is inversely proportional to the time-pressure to produce it.
If somebody wants something right away, that means they want it on a whim, not out of real need. The request that comes in at 4:00 for a meeting at 5:00 will be forgotten by 6:00. The analysis that can really affect a business has been identified through careful thought, and people are willing to wait for it. (A cheery thought for those late-night fire drills.)
The First Bad Analysis Law: Bad analysis drives out good analysis.
Bad analysis invariably conforms to people's preconceived notions, so they like hearing it. It's also 100% certain in its results, with no caveats and nothing hard to understand, and it usually gets produced first. This means the good analysis always has an uphill fight.
The Second Bad Analysis Law: Bad Analysis is worse than no analysis.
If there is no analysis, people muddle along by common sense, which usually works out OK. To really mess things up requires a common direction, and that requires persuasive analysis pointing in that direction. If that direction happens to lead into a swamp, the analysis hasn't helped much.
Sunday, March 30, 2008
LTV at LTC
I've written quite a bit about unsuccessful Information Engineering projects; now I want to write about a successful one.
How can you change a company? Give people the information they need to make decisions they never thought they could make, and that changes how they think about the enterprise. The trouble is, any organization will put up a lot of resistance to change.
In 2002 I managed a Life-Time Value (LTV) project at a Large Telecommunications Company (LTC) that did change the enterprise. LTV is an attempt to measure the overall economic impact of each customer on the enterprise over their expected life. Ideally this is concrete numeric data, so we can ask, “Is this customer worth $300 in new equipment if they will stay with us for two more years?”
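To make that arithmetic concrete, here is a minimal sketch in Python of the comparison an LTV number supports. The margin, discount rate, and expected-lifetime figures are hypothetical, not numbers from the LTC project:

    # Minimal LTV sketch. All figures are hypothetical, for illustration only.

    def lifetime_value(monthly_margin, expected_months, annual_discount_rate=0.10):
        """Sum of discounted monthly margins over the customer's expected life."""
        monthly_rate = (1 + annual_discount_rate) ** (1 / 12) - 1
        return sum(monthly_margin / (1 + monthly_rate) ** m
                   for m in range(1, expected_months + 1))

    offer_cost = 300.0  # the new equipment we are considering giving the customer
    ltv = lifetime_value(monthly_margin=25.0, expected_months=24)

    print(f"LTV over 24 months: ${ltv:,.2f}")
    print("Make the offer" if ltv > offer_cost else "Don't spend the money")

If the discounted margin over the customer's expected remaining life doesn't cover the cost of the offer, retention spending on that customer destroys value.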
The LTV project allowed people to think about the business in new ways. It was embraced by the Chief Marketing Officer, and it saved $15 million each year in direct marketing costs, while adding to the revenue from marketing programs, simply by not spending money to retain customers that LTC was losing money on.
There are a lot of articles about how to do LTV calculations. This time I want to talk about all the corporate politics around shepherding the LTV project to success.
Wednesday, March 26, 2008
Data-Driven Organizations are a Bad Idea
Consider: it really takes only a few facts to make a decision, but it takes a wealth of insight to know what the relevant facts are for the decision.
In a data-driven company, every analysis generates facts, and every one of those facts suggests a possible decision. In a data-driven organization, people really have very little guidance for making decisions. Even worse, the uncertainty created by all the decisions that could be made drives people to ask for more analysis. More analysis means more facts, which means more possible decisions suggested, which means even greater confusion, and the problem feeds on itself. The end result is that decisions get made for very arbitrary reasons, usually the last fact someone saw before they were forced to decide. I think it's better to rely on intuition and experience than to try to make sense of a sea of random, contradictory facts.
What works is to have a decision-driven organization. Understand what kinds of decisions the organization needs to make, understand the basis on which those decisions should be made and be explicit about it, and then, once that blueprint for decision-making is in place, build the information needed for the decisions.
Tuesday, March 25, 2008
I don't like books
I'm not that big a fan of data mining books. Every article I've read, every book I've read, and every class I've taken has been about what works. About the only way to find out what doesn't work is to have a project blow up on you and be sweating blood at 2 a.m. trying to figure out why all the nice algorithms didn't work out the way they were supposed to.
Friday, March 21, 2008
Good Data, Bad Decisions
Barnaby S. Donlon in the BI Review (http://www.bireview.com/bnews/10000989-1.html) gives a good description of how data goes to information, to knowledge, and then to decisions. He's saying all the right things, and all the things I've been hearing for years, but you know -- I don't think it works anything like that.
When we start with the data, it's all too much. It's too easy to generate endless ideas, endless leads, endless stories. I've seen it happen when an organization suddenly gets analytic capability.
Before, the organization was very limited in its ability to make decisions because it had limited information. The organizational leaders have ideas, and for lack of information they have no way of deciding which are good ideas and which are bad. Once the organization starts an analytic department, suddenly every idea the leadership gets can be investigated. The paradoxical result is that the leadership still can't make informed decisions. Every idea generates an analysis, and virtually every analysis can generate some kind of result. Without data, the result is inertia; with too much data, the result is tail-chasing.
The right way to do this is to begin with the end. Think about the decisions that need to be made. Then think about how to make those decisions in the best possible way. Starting with the end means the beginning -- the data, the analysis, the information -- is focused and effective.
Labels: business intelligence, data, decisions, information
Information Design: What does it take to be successful?
All of the examples that I have given are of poor information design. Some of them have had more or less success, but they all had substantial flaws. There's a reason I'm saying that information design is a missing profession.
Why is it so hard? First off, true information design projects are fairly rare. BI is usually about straightforward reporting and ad-hoc analysis. People don't get much of a chance to practice the discipline.
Information design requires a lot of other disciplines. It takes statistics but isn't limited to statistics. Data mining can help but can easily bog down a project in complicated solutions. It requires being able to think about information in very sophisticated ways and then turn around and think about information very naively.
It requires knowing the nuances of an organization. Who are the clients? The users? What is the organizational culture? What does the organization know about itself? What does the organization strongly believe that just isn't so? It's not impossible for an outside consultant to come in and do information design, but it is impossible for a company to come in with a one-size-fits-all solution. When it comes to information design, one size fits one.
Because the profession of information design hasn't been developed yet, it isn't included in project plans and proposals. For two of the projects above, information design wasn't even thought of, and for the third it wasn't done well because the client's true needs weren't uncovered.
Labels: data mining, design, engineering, information, statistics
Thursday, March 20, 2008
Daily Churn: The Project was a Complete Success and the Client Hated Us
The story eventually had a less than desirable ending. After we had produced accurate daily forecasts for months, our work was replaced by another group's work, with predictions that were much higher than ours. It turned out that having attrition come in sometimes higher than predicted and sometimes lower was very stressful to upper management, and what they really wanted wasn't an accurate prediction of attrition but to be told that they were beating the forecast.
Ultimately the problem was a large difference between what management wanted and what they said they wanted. What management said they wanted was an attrition forecast at a daily level that was very accurate. To this end my group was constantly refining and testing models using the most recent data we could get. What this meant was that all the most recent attrition programs were already baked into the forecasts.
What management really wanted to be told was the effect of their attrition programs, and by the design of the forecasts there was no way they could see any effect. It must have been very disheartening to look at the attrition forecasts month after month and be told, in essence, that their programs were having no effect.
What my group should have done is to go back roughly a year, to before all of the new attrition programs started, and build our forecasts using only that older data. Then the comparison between actuals and forecasts would, hopefully, have shown the effect of the programs.
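Concretely, the idea is to freeze a baseline built only on pre-program history and read program impact as the gap between the baseline and the actuals. Here is a minimal sketch of that comparison, with made-up monthly attrition counts and a naive pre-program mean as the baseline (the real forecasts were daily and far more elaborate):

    # Freeze a "no-program" baseline from pre-program history, then read
    # program impact as baseline minus actual. The data and the naive mean
    # baseline are made up for illustration.

    from statistics import mean

    # 24 months of attrition counts; retention programs launch at month 13.
    history = [510, 495, 520, 530, 505, 490, 515, 525, 500, 495, 510, 520,
               505, 480, 470, 465, 450, 445, 460, 455, 440, 435, 430, 425]

    pre, post = history[:12], history[12:]  # training window vs. evaluation window
    baseline = mean(pre)                    # counterfactual "no-program" level

    for month, actual in enumerate(post, start=13):
        saved = baseline - actual           # positive means fewer losses than expected
        print(f"month {month}: actual {actual}, baseline {baseline:.0f}, saved {saved:.0f}")

Built that way, the month-by-month gap between forecast and actual is exactly the story management was asking for.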
Surprisingly, I've met other forecasters who found themselves with this same problem: their forecasts were accurate, and the project was taken away from them and given to a group that just made sure management was beating the forecast.