Friday, April 4, 2008

Predicting Lost Luggage

I read an interesting article on prediction markets by Gary Stix in the March 2008 issue of Scientific American. The bulk of the article discusses the success rate of the Iowa Electronic Markets in predicting election results through the buying and selling of “securities” – portfolios of contracts tied to each candidate. In presidential elections from 1988 to 2004, the Iowa Electronic Markets predicted final results better than the polls three times out of four.

The article provides a great description of how the market works. It also highlights other prediction markets that allow speculators to predict almost any conceivable event, from a Chinese moon landing by 2020 (Foresight Exchange) to Katie Couric departing from CBS News (Intrade) to the first human-to-human transmission of avian flu (Avian Influenza Prediction Market).
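
To make the mechanics concrete, here is a minimal sketch of how a winner-take-all market encodes a probability. This is my own illustration rather than anything from the article, and the candidates and prices are hypothetical.

```python
# A winner-take-all contract pays $1 if the event occurs and $0 otherwise,
# so its trading price can be read as the market's probability estimate.
# Candidate names and quotes below are hypothetical.

def implied_probability(price: float, payout: float = 1.00) -> float:
    """Probability implied by a contract trading at `price` that pays `payout`."""
    return price / payout

quotes = {"Candidate A": 0.62, "Candidate B": 0.38}

for candidate, price in quotes.items():
    print(f"{candidate}: ${price:.2f} -> implied probability "
          f"{implied_probability(price):.0%}")
```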

While these events are interesting, and it might be fun to risk a few dollars predicting them, I was most interested in the internal markets being established to gauge the success of business efforts:

“Attracted by the markets’ apparent soothsaying powers, companies such as Hewlett-Packard, Google and Microsoft have established internal markets that allow employees to trade on the prospect of meeting a quarterly sales goal or a deadline for release of a new software product. As in other types of prediction markets, traders frequently seem to do better than the internal forecasts do.”

I wonder whether an internal prediction market might have helped with the disastrous opening of Heathrow Airport’s new Terminal 5. Despite pre-opening headlines touting the terminal’s readiness, they clearly weren’t ready for their opening week – hundreds of cancelled flights, thousands of lost bags, and a financial and PR nightmare for British Airways and BAA.

There has been a lot of Monday-morning quarterbacking (or the equivalent soccer term) about the decision to open the new terminal in “big bang” fashion. Critics have suggested a phased approach might have reduced the problems, and cited other major infrastructure projects (like the new St. Pancras rail station) as examples. I’m guessing that the executive team considered both options and researched other airline terminal openings before making their decision. (I remember when the new Pittsburgh airport opened in 1992: the last flight landed at the old airport around 10:00 pm, an army of people and moving vans transferred all the operations equipment to the new terminal about a mile away, and the first flight landed at the new airport at 6:00 am. Despite some initial problems with the automated baggage-handling systems, this big-bang approach went much more smoothly than Heathrow’s.)

Would an internal market, reflecting the collective knowledge of the Heathrow employees, have predicted such a chaotic opening? Experts still don’t know exactly how prediction markets work. I’m wondering whether the accuracy might have something to do with the “degree of influence” the market participants have over the outcome.

For many events – like predicting the amount of snowfall in Central Park, or the outcome of the NCAA tournament games – a trader has no influence over the outcome and is, effectively, guessing.

For other events – like predicting an election outcome or the success of a new movie – a trader has limited influence. An individual vote influences election results (unless you’re a Republican living in Massachusetts). A person can attend the opening of a movie and tell all their friends how great it was.

Most intriguing are those events where traders have significant or considerable influence over the outcome – the sales manager responsible for meeting the quarterly target, the project manager trying to launch on time, or the baggage handlers at Heathrow who not only have to use the new systems but have to show up at a new location before they even see their first bag of the day.

Is there a correlation between “amount of influence” and “accuracy of prediction?” Can markets provide field-level insight that executives can’t (or won’t) see? If a “Terminal 5” market had existed and “successful opening” contracts were trading at low prices, would BA chief executive Willie Walsh have used this information to delay the opening, conduct more testing, and phase-in new operations over time?

Does your company use prediction markets? Have they been successful?

NCAA Update: Well, the Selection Committee looks pretty good as – for the first time in NCAA men’s basketball history – all four No. 1 seeds are in the Final Four. Would a prediction market have helped? According to this news story,

“…of the 500,000 fans playing on CBSSports.com, more than 51,000 correctly predicted the final four teams…”

Assuming that some of those 10% were basketball junkies while others picked their brackets based on the teams’ jersey colors, can we draw any conclusions about a “wisdom of the crowd” factor in the NCAA tournament?
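
For a rough sense of whether 10% is impressive, here is a back-of-the-envelope comparison against a deliberately naive baseline of my own construction – picking each region’s champion blindly from its 16 teams – which of course understates how well an informed fan would do by chance.

```python
# Back-of-the-envelope check on the CBSSports.com figures quoted above.
entrants = 500_000
correct = 51_000

observed_rate = correct / entrants   # fraction who got all four right (~10.2%)
blind_rate = (1 / 16) ** 4           # one blind pick per 16-team region

print(f"Observed hit rate:    {observed_rate:.1%}")
print(f"Blind-guess baseline: {blind_rate:.4%}")
print(f"The crowd beat blind guessing by roughly {observed_rate / blind_rate:,.0f}x")
```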

Thursday, April 3, 2008

Reliant Energy, Kelley Blue Book and Infosys to Speak on Analytics at DIG 2008

I am pleased to announce that Reliant Energy, Kelley Blue Book and Infosys will be speaking on the topic of Insights from Advanced Analytics at DIG 2008. Each organization will present a case study on how it leverages analytics to drive better decisions.

Christi Megow and Jason Stults from Reliant Energy will discuss how they are using performance models to drive the operations planning process. Reliant Energy provides electricity and energy services to retail and wholesale customers in the United States. The company uses driver trees, dashboards and scorecards to define and communicate corporate strategy to stakeholders and to drive the planning, forecasting, reporting and analysis process for decision-making.

Bruce Hoffman will discuss how Kelley Blue Book uses data gathered from its website to better predict market trends and vehicle pricing. KBB.com generates over 12 million site visits per month, which are captured, cleansed and consolidated for analysis. Using this clickstream data, KBB is able to provide better analytics both internally and to its OEM clients.

Our final presenter in the analytics theme is Romil Bahl from Infosys. In this session, Mr. Bahl will describe how Infosys aligns strategic priorities and objectives with performance measures. Using analytic techniques, Infosys is able to model expected financial performance for strategic initiatives in real time. Infosys integrates these analytics across industry and horizontal business units to enhance the ‘One Infy’ experience and deepen Infosys’s transformational capabilities.

Wednesday, April 2, 2008

A View From the Top – Chevron CIO Profile

Last week I posted about Google’s CIO, Doug Merrill. This week, I want to highlight the CIO of Chevron, Gary Masada. He was interviewed in the Wall Street Journal, and there were a few comments I wanted to highlight and discuss.

Right off the top, Mr. Masada discusses the information overload that organizations are facing. Asked in the interview about the biggest challenge he faces as CIO, he offered the following analogy:

“Getting our arms around all the information we have. We’re basically creating the Library of Congress every day or so…”

The conversation continues on to discuss how Chevron addresses the exponential explosion of information and how users find what they need.

“People have to do something to help themselves, which is organize their information so that it can be found….But it’s a pain. And besides that, you’re probably just going to make another mess the minute it’s clean…what I really need is a disciplined way to make sure that I don’t repopulate it the very next day.”

The idea of “discipline” also relates to having governance processes in place to create the consistency needed. My question for readers is: how does this apply in an E2.0 environment? In collaborative environments, are there governing groups, or is the mass considered the governing body? Can we apply the models from areas like open-source development or Wikipedia internally to the organization?

“Another part of it is tagging the information in ways that make it easier to find [by adding so-called metadata that describes what’s in a file in more detail].”

The point about tagging is interesting, and tagging is happening as a first step in many organizations as they move to E2.0. Tagging information will certainly help improve accessibility through enhanced metadata. Individuals helping themselves is a first step; getting this to happen across the entire organization in a collaborative environment, instead of among just a few select individuals, should keep the “house” clean.

On the topic of technology that is targeted for consumers and the impact on corporate technology, Mr. Masada responded with the following:

“Web 2.0 and Facebook and all of that are here and they are real. The issues when you [start to bring these technologies into the workplace] are things like privacy, security and liability. The young generation doesn’t even know it’s an issue…these younger folks are more comfortable sharing information than older people.”

I actually chuckled when I read this…mainly because it is true. There is a considerable generation gap, and I would say that the social platforms have taken off among high school and college students (I am proud to say that my 17-year-old nephew didn’t know what I meant when I asked him for his Twitter username so I could send him a message). Is this a barrier to companies adopting E2.0? Will it take time to work out the generational gaps that exist, or to convince the “older” generation that E2.0 can create value? And how are organizations addressing privacy, security and liability issues? These are all questions that need to be addressed quickly to keep up the momentum with E2.0.

The final topic that Mr. Masada discusses is how Chevron matches technology investments with a business problem.

“We have a process that we call Everest…it’s a way of checking all of our IT projects against the major business strategies...A lot of projects come from the bottom up. They’re productivity-related and they may help. But if the core business strategy is not to invest in that sector of the business but to invest somewhere else…it’s not the best use of IT resources. Yet in many cases, IT resources just stay where they’ve always been.”

This is a perfect example of linking strategy to resource allocation and initiative management activities. If organizations aren’t applying appropriate stratex budgets and aligning initiatives to strategic objectives, opportunities will be missed. When resources and budgets are limited, reducing or eliminating initiatives that are not strategic or core to the business frees up resources to fund new strategic initiatives. It is good to see that organizations like Chevron have an IT governance process that aligns IT investments with the strategies of the business.

I would appreciate comments on the E2.0 questions, since these seem to be initial barriers for organizational adoption.

Tuesday, April 1, 2008

Opening Day (Part 2)

To help me get ready for the upcoming baseball season, I recently purchased the 2008 Baseball Prospectus, an annual almanac of analysis and predictions from the folks who brought us 21st century metrics like VORP (Value Over Replacement-level Player) and BABIP (Batting Average on Balls In Play).

I got to thinking about the evolving perception of analytics in a baseball context. While mentioning those metrics at the water cooler of 2008 might get you some confused looks, you're probably not in danger of being stuffed in a locker, because analytics have begun to earn respect from a wider baseball audience. How, exactly, did this happen? Off the top of my head, I came up with four possible drivers of this change (not mutually exclusive):

  1. Boil the frog slowly – If On-Base Percentage really correlates better with runs scored than batting average does, well, who am I to argue? And if that’s true, then maybe I ought to listen to some of your other ideas... (a quick way to test this claim is sketched just after this list)

  2. Myth-busting – Some assertions have been controversial (e.g. “There’s no such thing as a clutch player”), but maybe they’re plausible and interesting enough to get attention.

  3. Case studies – Michael Lewis’ best-seller Moneyball and the 2004 and 2007 Boston Red Sox (both World Series winners) have shone a public light on organizations that succeeded with analytics-friendly leadership.

  4. Audience evolution – People in general have better quantitative reasoning skills than they did, say, 20 years ago, and so are more open to evidence-based insights.
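
Here is the quick check promised in item 1, as a minimal sketch. The eight team-season numbers are placeholders I invented for illustration; swap in real data (the Lahman database is one source) before drawing any actual conclusions.

```python
# Compare how strongly batting average (AVG) and on-base percentage (OBP)
# each correlate with runs scored, using Pearson correlation.
# All eight team-seasons below are invented placeholder values.
import numpy as np

avg  = np.array([0.251, 0.268, 0.262, 0.280, 0.255, 0.274, 0.283, 0.270])
obp  = np.array([0.318, 0.335, 0.341, 0.352, 0.322, 0.360, 0.365, 0.347])
runs = np.array([665, 740, 762, 801, 690, 835, 858, 785])

print(f"AVG vs runs: r = {np.corrcoef(avg, runs)[0, 1]:.3f}")
print(f"OBP vs runs: r = {np.corrcoef(obp, runs)[0, 1]:.3f}")
```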

Here’s my question, and it’s not about baseball: In your organization and mine, a major barrier to extracting value from analytics is a rejection of the methods and implications from (for lack of a better term) the “old school” crowd. What’s the best way to make the case for analytics in your organization? Take on a single cherished nugget of conventional wisdom and prove it wrong? Or is that too risky? Is it better to plug along cautiously, incrementally adding some objectivity and trickling new metrics into the soup until the organization is ready? Or is it the Moneyball approach – find one manager willing to try the Kool-Aid and make something happen?

Opening Day (Part 1): Roger Clemens is Innocent

I’m excited to join such a distinguished group of bloggers, and proud to reveal a new analytic insight in my first post.

It seems to be the consensus among baseball fans that Roger Clemens used performance-enhancing drugs in the latter stages of his Hall of Fame career. Accusers support their claims by pointing to his apparent improvement when he left the Boston Red Sox for the Toronto Blue Jays before the 1997 season – his age-34 season. But advanced analytics tell a different story altogether.

Using some new functionality in the latest version of SAS, I created two new metrics. One is something I call Adjusted Prevented Runs (In League). Basically, it controls for a handful of factors not addressed in the pitching metrics most favored by sabermetricians today and in one number tells you how effective a starting pitcher is, relative to the rest of the league, in keeping the opposing team from scoring. I call the second metric Factored Outs Over League, because in terms of pitching performance, while keeping opposition runs down is the ultimate goal, the way to achieve it is by getting outs. It’s like a pitcher’s version of On Base Percentage.

Because of the relationship between preventing runs and getting outs, the best way to get an overall view is to multiply the two metrics. And when you look at the data, Roger Clemens didn’t actually register a significant change in his performance from his Boston years to the Toronto years.

Thus, with the help of APR(IL)*FOOL, the power of analytics is demonstrated once again: this time, to clear the name of an innocent American.

Evolution of On-Demand BI

So I have been seeing the marketing term “On-Demand Business Intelligence” tossed around a lot. Just to validate the infectious nature of the term, right or wrong, try doing a quick Google search on the phrase “On Demand BI”. For me it returned 9,120 results in less than 0.27 seconds. When I looked at the top 50 results, they ranged from written articles to software vendors to some topics inappropriate for the DIG blog space. (As a side note, I think I found a new feature for Google to add, unless I’ve missed it: the ability to jump to the end of the search results. I was curious to see what was at the bottom of the list.)

So I filtered out the software vendors, since the purpose of my search wasn’t to evaluate the technologies available, although it is something I would like to do in a future post. Instead, I wanted to see what the “experts” were saying about business intelligence on-demand.

The first observation is that less than 15% of the results were from the past year. I can conclude that one of three things occurred. First, the marketing dollars are starting to dry up, and there is only so much that can be spent on certain buzzwords. Second, the industry has moved on to a different set of marketing terms that will help create hype cycles along with fear, uncertainty and doubt. Or third, and what can only be the real reason for the drop in web content over the past 12 months: we have finally arrived. BI on demand is a reality and there is no reason to discuss it anymore (insert sarcasm here).

So what is “on-demand BI”? Simply put, it is a hosted business intelligence solution, or “Software as a Service” – SaaS for short. The most successful and well-known SaaS vendor is Salesforce.com, with its AppExchange platform.

The first question I asked myself was “So what are the benefits of on-demand business intelligence?” I went back to the earliest Google results I could find, from 2006. Here are some of the highlights from different articles selected at random.

In a Computerworld posting back in October of 2006, Jerri Ledford made the following points on the topic:

“…it's business intelligence when and where you need it, without all of the difficulties of building the solution yourself, or hosting it yourself, or even maintaining it yourself.”

“on-demand BI is gaining ground, because it's appealing to smaller companies that can't afford to invest in a full-blown BI solution”

“…the benefits become clear (usually). For example, on-demand BI usually means you have the answers to your BI questions when you need them (and where you need them) without having to devote an entire army of IT professionals to pulling those answers from mountains of data.”

Another posting on TechLink by Amit Kesarwani in December 2006 had this to say:

“On-Demand BI provides much more benefits than operational Software as a Service due to the complexity and high cost of BI systems. On-Demand BI providers can easily offer economies of scale and shared cost using multi-tenant architecture but without jeopardizing security”

Mr. Kesarwani goes on to compare “Traditional BI Deployments” to “On-Demand BI Deployments”. Here are a couple of the comparison points.

Traditional BI Deployments: complex to use and deploy, expensive customization, long project life cycle, lack of BI best practices, high risk.

On-Demand BI Deployments: easy to use and deploy, low-cost customization, rapid time to market, built-in best practices, low risk.

And finally, this post from Darren Cunningham on the SuccessForce Community Blog:

“I think the renewed focus on simplicity in the BI market is accelerating the shift from on-premise to on-demand solutions and this will ultimately help drive greater end-user adoption and improve organizational decision making.”

So, let’s jump ahead to some more recent postings in 2007 and 2008. The first was a research report by the Aberdeen Group on what is driving BI user adoption. Here is a highlight:

“This benchmark study finds that organizations falling into the ‘Best-in-Class’ category are addressing the skill set shortage by establishing training programs, contracting with 3rd party consultants and solution providers, or implementing On-Demand options such as hosted BI, SaaS (Software as a Service), and BI appliances.”

And finally, I found the following on CIO.com. This article was posted 4 days ago!

“The oft-cited concerns regarding on-demand and SaaS applications (integration, customization, security) typically don't emanate from the business side of an organization. Typically, they come from IT groups already under intense pressure from project backlogs and a lean number of staffers, who most likely don't have BI development skills.”

“With easy-to-install on-demand applications, IT's role as gatekeeper is minimized, say analysts. By 2012, Gartner's Schlegel predicts that emerging technologies such as on-demand and SaaS BI tools will make users ‘less dependent on central IT departments to meet their BI requirements.’”

“One major sticking point for IT usually involves the security of corporate data as it moves outside of IT's control. But executives and analysts say that the potential business benefits of quicker access to BI data, coupled with the robustness of third-party providers' security mechanisms, may outweigh concerns.”

So what can we glean from these articles? What are some common themes? What has changed over the course of three years?

There are some pretty big promises being made amid all the positive rhetoric – promises like fast deployments, higher user adoption and elimination of your IT organization. Okay, I made that last one up, but I was just checking to see if you were paying attention. If you peel away the SaaS vision and happy talk, there are some key themes that should be takeaways for both on-demand and on-premise BI deployments.

First, user adoption is a significant issue for BI applications. So what is driving low user adoption? The issue that continues to bubble to the top is the complexity of the tools. Does SaaS solve this problem? Possibly, but BI systems still consist of complex tools answering complex questions. The advantage of SaaS is that the total cost of ownership can be significantly lower: the per-user license cost is lower, and if only a handful of people actually use the system, you would hope to pay for only those users. That is much cheaper than predicting you will have 1,000 users for an on-premise solution and paying for all of those users before you have even installed the software.
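
To put rough numbers on that licensing argument, here is a simple sketch. Every price and user count is an assumption for illustration only, not a vendor quote, and it ignores hardware, implementation and maintenance costs on both sides.

```python
# Hypothetical three-year license-cost comparison: on-premise licensed up
# front for forecast users vs. on-demand paid only for actual users.
predicted_users = 1000            # on-premise: you license your forecast
actual_users = 150                # the handful who actually use the system
license_per_user = 1200           # assumed one-time on-premise license ($)
saas_per_user_per_month = 50      # assumed on-demand subscription ($/month)
months = 36                       # three-year horizon

on_premise_cost = predicted_users * license_per_user
on_demand_cost = actual_users * saas_per_user_per_month * months

print(f"On-premise (licensed for the forecast): ${on_premise_cost:,}")
print(f"On-demand (paid for actual users):      ${on_demand_cost:,}")
```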

Second, simple and easy deployments seem to pop up consistently. I do agree that removing the complexity of hardware and software procurement will reduce some upfront angst, but do we really think that the primary reasons BI applications go amok (poor data and inconsistent agreement on business requirements, to name a few) will magically disappear? I am not a pessimist by nature, but a realist: these issues need to be addressed no matter how you physically deploy your BI solutions.

Finally, the security of data is a consistent concern associated with hosting a BI application. It was raised in 2006 and again in 2008, and I predict it will still be a concern in 2012. That being said, how many organizations have had data compromised while it sat behind their own firewalls? Maybe it will be safer if someone else takes care of it for you.

So who should consider on-demand BI? The quick answer is everyone. It is a viable approach, and a majority of the software industry is moving to SaaS architectures. To be more specific, I would say there are three considerations when selecting the type of BI deployment. The first is the size of your organization: smaller organizations that currently have no BI solution, and for which the economics make sense, should consider an on-demand deployment. Second, firms with limited business intelligence skills in IT should also consider BI SaaS – finding talented BI staff has been a lingering issue for years. And finally, weigh the security implications of having your data hosted outside your organization. As I said earlier, it may be more secure when you don’t host it!

To read some additional thoughts beyond my rants, check out Timo Elliott’s BI blog posting on the topic.

What are your thoughts? Are you using or considering on-demand BI? If so, what made your deployment successful versus an on-premise deployment (assuming it was successful)?

Monday, March 31, 2008

Centre Pompidou as an E2.0 Analogy!

I'll begin with my conclusion: the transparency enabled by E2.0 could unleash a wave of unstoppable execution within our organizations. Do we dare? Can we handle it? Let me share my thoughts.

In the recent April 2008 HBR piece by David Collis and Michael Rukstad titled “Can You Say What Your Strategy Is?”, the authors offer the following visualization of the importance of having a clearly articulated strategy:

Think of a major business as a mound of 10,000 iron filings, each one representing an employee. If you scoop up that many filings and drop them onto a piece of paper, they’ll be pointing in every direction. It will be a big mess: 10,000 smart people working hard and making what they think are the right decisions for the company—but with the net result of confusion…

If you pass a magnet over those filings, what happens? They line up. Similarly, a well-understood statement of strategy aligns behavior within the business. It allows everyone in the organization to make individual choices that reinforce one another, rendering those 10,000 employees exponentially more effective.

The article goes on to delineate the elements of a clear strategy and outlines both the analysis and the supporting documents that should be shared throughout one’s organization to take advantage of that strategy (you should also check out Dave Norton and Bob Kaplan's Strategy Map). Collis and Rukstad then make some observations that scream E2.0. However, I believe there is an oversight (my opinion!) that pulls up shy of the true opportunity we have. The article reads:

The process of developing the strategy and then crafting the statement that captures its essence in a readily communicable manner should involve employees in all parts of the company and at all levels of the hierarchy. The wording of the strategy statement should be worked through in painstaking detail. In fact, that can be the most powerful part of the strategy development process. It is usually in heated discussions over the choice of a single word that a strategy is crystallized and executives truly understand what it will involve.

The authors begin the preceding vignette by talking about employees in all parts of the company helping to develop the strategy, and then they end the same paragraph with only the executives involved in the heated discussions that allow them to truly internalize what the strategy will involve. Is there no way we can make those heated discussions transparent throughout the organization?

Imagine if we could have the good, the bad and the ugly of the strategy dialog open to the organization – like the Centre Pompidou in Paris, where all of the supporting structure and systems, such as the escalators, are exposed to the outside world.

Color-coded ducts are attached to the building's west façade... blue for air, green for fluids, yellow for electricity cables and red for movement and flow. The transparency of the west main façade allows people to see what is going on inside the centre from the piazza, a vast esplanade that the architects conceived of as an area of continuity, linking the city and the centre. [quote link - Centre Pompidou website, picture link - Wikipedia]

I can only dream of being in an organization that is designed to run itself with the transparency and intentionality to always “link the city and the centre”. I believe that a liberal use of E2.0 applied to the strategy development process could lead to a profound mobilization of the human capital within our organizations. But I understand the fear that runs wild within our corporations: our competitors will get our information!

As Andrew McAfee concludes at the end of a 2006 post on E2.0 insecurities,

Imagine two competitors, one of which has the guiding principle "keep security risks and discoverability to a minimum," the other of which is guided by the rule "make it as easy as possible for people to collaborate and access each others' expertise." Both put in technology infrastructures appropriate for their guiding principles. Take all IT, legal, and leak-related costs into account. Which of these two comes out ahead over time? I know which one I'm betting on.

In conclusion, the transparency enabled by E2.0 could unleash a wave of unstoppable execution within our organizations. Do you agree?

What do you think? If you have seen any real-life successes with “open strategy” dialog, please tell us.