Saturday, May 3, 2008
So what does the watch actually do?
“With no display for the hours, minutes or seconds, the Day&Night offers a new way of measuring time, splitting the universe of time into two fundamentally opposing sections: day versus night.”
Day versus night, huh? And it sells out for an unbelievable price?
I'm going to start working on a new dashboard project, directed at CEOs of multi-billion dollar firms.
It won't have KPIs, trends, links or navigation; all it will do is flash one of two messages: either "MAKING MONEY" or "LOSING MONEY".
Next I'll develop a separate version to sell to sports teams - a new scoreboard that flashes only whether the home team is "WINNING," "LOSING" or "TIED".
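In the spirit of the joke, the entire "product" could be sketched in a few lines. Everything here (function names, the zero-profit threshold) is invented for illustration:

```python
# A tongue-in-cheek sketch of the binary executive dashboard described above.
# The break-even threshold of zero is an assumption for illustration.

def executive_dashboard(net_income: float) -> str:
    """Reduce every corporate KPI to a single binary message."""
    return "MAKING MONEY" if net_income >= 0 else "LOSING MONEY"

def scoreboard(home_score: int, away_score: int) -> str:
    """The sports edition: home-team status only."""
    if home_score > away_score:
        return "WINNING"
    if home_score < away_score:
        return "LOSING"
    return "TIED"
```

That's the whole feature set - no drill-downs, no trends, no navigation.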
Thursday, May 1, 2008
From their short article, I believe that a knowledge harvest is a simple but purposeful and interactive approach to a postmortem analysis or debriefing. The basic idea is that the intentional review of a business occurrence or process will yield helpful information or insights for the future; hence - a knowledge harvest!
However, there is a twist. The authors say that the first step in the process is to recruit a set of “knowledge seekers” who want to learn from the harvest. They go on to characterize these people.
Because seekers are self-interested, they ask tough, exploratory questions of knowledge originators, extracting important nuances – not only about how a project was executed but also about how costs built up, how knowledge might be applied elsewhere, what worked and what didn’t, and so on.
A knowledge facilitator leads these seekers through a process of interacting with the knowledge originators to derive key information and valued insights. The knowledge facilitator then works with the seekers to package the content and distribute it around the company.
My question is whether or not our E2.0 applications are focused enough on these knowledge seekers. Do we have people who are clearly articulating what they need to know in order to do their jobs better? Do our apps help to connect these knowledge seekers with the appropriate knowledge originators within the business? I have a feeling that a lot of our Web 2.0 content is produced by knowledge facilitators who are doing screen scrapes from knowledge originators with no idea whatsoever of the needs of knowledge seekers! What do you think?
I do believe that we have the tools and technologies but I’m not sure that we have them working together to support this interesting approach of a knowledge harvest.
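To make "connecting knowledge seekers with knowledge originators" a bit more concrete, here is one minimal way an E2.0 app might do it: rank originators by how much their published topics overlap with what a seeker says they need to know. All names and tags below are hypothetical, and real apps would use richer signals than tag overlap:

```python
# A hedged sketch: match a seeker's stated interests against the topics
# each knowledge originator has contributed on. Names and tags are made up.

from typing import Dict, List, Set

# Topics each originator has published or been tagged on (hypothetical).
originators: Dict[str, Set[str]] = {
    "alice": {"pricing", "credit-risk", "modeling"},
    "bob":   {"supply-chain", "logistics"},
    "carol": {"modeling", "forecasting", "pricing"},
}

def match_originators(seeker_interests: Set[str]) -> List[str]:
    """Rank originators by topic overlap with the seeker's interests."""
    scored = [
        (len(topics & seeker_interests), name)
        for name, topics in originators.items()
    ]
    # Highest overlap first; drop originators with no overlap at all.
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

# A seeker who wants to learn about pricing models:
print(match_originators({"pricing", "modeling"}))
```

The point isn't the matching algorithm - it's that the app starts from an explicit statement of what the seeker needs, rather than from whatever content the facilitators happened to scrape.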
Wednesday, April 30, 2008
One interesting topic that Crovitz raises is the difference between using the traditional form of predicting political results, statistical polling, and using a prediction market that trades future results like stocks. There are plenty of examples showing that a properly formed market will provide more accurate results than a statistical polling sample.
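The contrast is easy to see in numbers. A binary prediction-market contract pays $1 if the event occurs, so its trading price reads directly as a probability; a poll estimate of the same event always carries sampling error. The figures below are hypothetical, purely to show the mechanics:

```python
import math

# Hypothetical numbers for illustration only.
# A binary contract pays $1 if the event occurs, so its price is
# a direct probability estimate - no sampling error term.
contract_price = 0.62
implied_probability = contract_price   # market says 62% chance

# A poll estimate of the same event comes with sampling error.
poll_share = 0.55      # 55% of respondents favor the outcome
sample_size = 800
# Standard error of a proportion: sqrt(p * (1 - p) / n)
standard_error = math.sqrt(poll_share * (1 - poll_share) / sample_size)
margin_of_error = 1.96 * standard_error   # ~95% confidence interval

print(f"Market-implied probability: {implied_probability:.0%}")
print(f"Poll estimate: {poll_share:.0%} +/- {margin_of_error:.1%}")
```

The market price also updates continuously as traders put money behind new information, while the poll is a snapshot of one sample at one moment.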
Are you convinced yet that prediction markets can be an effective tool for your organization? Have you identified any areas, either internally or externally, where a prediction market can more accurately predict an outcome?
Tuesday, April 29, 2008
Here’s an interesting application of heat map visualization. It’s from Purdue University’s Project Vulcan.
Again, some of the results are expected – "carbon dioxide emissions are high where there are lots of people spending lots of time in their cars" – but not overly insightful. More interesting, however, are the discoveries that researchers have made from analyzing the data in graphical form. There’s an excellent summary in the April 27, 2008 issue of the Boston Globe and two results stand out:
“When you rank America’s counties by their carbon emissions, San Juan County, NM – a mostly empty stretch of desert with just 100,000 people – comes in sixth, above heavily populated places like Boston and even New York City. It turns out that San Juan County hosts two generating plants fired by coal, the dirtiest form of electrical production in use today.”
And the heat map shows a small, bright-red area (high carbon emissions) in the northwest corner of New Mexico surrounded by wide expanses colored green.
“Purdue researchers discovered higher-than-expected emissions levels in the Southeast, likely due to the increasing population of the Sun Belt, long commutes, and the region’s heavy use of air conditioning. According to Kevin Gurney, assistant professor of atmospheric science at Purdue and the project leader, this part of the map also overturns the prevailing assumption that industry follows population centers: In the Southeast, smaller factories and plants are distributed more evenly across the landscape. Cities, meanwhile, prove less damaging than their large populations might suggest, partly thanks to shorter commutes and efficient mass transit.”
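The core of a heat map like this is just a mapping from a measured value to a color band. A minimal sketch of the idea follows; the county names, tonnage figures, and band thresholds are all invented for illustration (Project Vulcan's real data is far more granular):

```python
# A minimal sketch of the heat-map idea: bucket per-county CO2 emissions
# into color bands. All counties, tonnages, and thresholds are hypothetical.

def emission_color(tons_co2: float) -> str:
    """Map an annual emissions figure to a heat-map color band."""
    if tons_co2 >= 10_000_000:
        return "red"       # e.g., a sparse county hosting coal-fired plants
    if tons_co2 >= 1_000_000:
        return "yellow"
    return "green"

counties = {
    "Sparse Desert County": 14_000_000,  # small population, big coal plants
    "Big Coastal City": 9_000_000,       # large population, transit helps
    "Rural Farm County": 400_000,
}

heat_map = {name: emission_color(tons) for name, tons in counties.items()}
```

This is why the visualization surprises: a nearly empty county with two coal plants lights up red while a dense city with mass transit does not, exactly the San Juan County result described above.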
Monday, April 28, 2008
Today at Bank Systems and Technology, there’s an article on the increasing importance of analytics to the banking industry. The story is fairly typical in the genre – “we used to manage by gut, but better information about our customers can help us in so many ways!”
What caught my attention was that quite a few of the contributed quotes came from places on the org chart that just don't exist at most organizations – the “Director of Statistics and Modeling” and the “Department of Insight and Innovation,” to name two. These references were threaded alongside a frequent comparison of “mature” analytics areas, such as credit card predictive modeling, and “growing” areas, such as customer attrition modeling. This might suggest that organizations that create a dedicated function for analytics and related disciplines are more successful at spreading the competency internally than those that leave it to chance. This is certainly the position put forth by Thomas Davenport in Competing on Analytics, and is certainly intuitive in some respects.
It’s easy to envision a success story for such a group – evangelizing the power of analytics, introducing new skills to functions without a historical strength in analysis, etc. But what are the likely barriers and points of failure? How can an organization considering such an investment get ahead of the curve and mitigate the risk?
I’d speculate there are a handful of key reasons for struggle or failure:
- Lack of a starting point / quick win “pilot” - It may be difficult for a Center of Excellence-type structure to get off the ground without at least one demonstrated benefit within the first year or so
- Insufficient data trail - For businesses or domains without a solid trail of transactional information, it might be tougher to get started (there goes my idea for a chain of cash-only restaurants with no POS system)
- Lack of data architecture / infrastructure investment - If a new analytics team’s first report includes a request for $5 million just to organize the data, rough roads may be ahead
- Active resistance to the scientific approach - If a CEO is commonly heard to say “you guys think too much,” is that an organization likely to be hospitable to analytics?
What do you think is the biggest barrier? One I didn’t identify? What are the keys to success in building an organization's overall competency in analytics?