Jeffrey's background is in number crunching, a required skill at the blackjack tables. Jeffrey will discuss how to use data-driven analysis to make smarter business decisions. The Protrade fantasy sports site leverages Jeffrey's proprietary analysis tools and metrics to predict performance. His approach is to help the audience move away from "gut" decisions and toward decisions driven by quantifiable data. We are looking forward to Jeffrey's keynote and expect the venue in Las Vegas to fit the bill.
Friday, March 14, 2008
The City Performance Reporting (CPR) tool provides performance results and outcomes organized by 8 citywide themes, such as Education and Public Safety. The purpose of CPR is to provide transparency and accountability for city services to the "users" of those services: New York City residents.
Some of the features that CPR provides include aggregation of metrics, year-over-year performance, traffic lighting for variances, metric definitions and graphical representation of data. One additional feature that I thought was interesting was the general trend, up or down, for the aggregate of all performance metrics. For example, the number of measures that are improving versus the number that are declining (screenshot to the left).
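To illustrate the traffic-lighting idea, here is a minimal sketch of how a metric's variance from target could be mapped to a green/yellow/red status. The thresholds and function name are my own illustrative assumptions, not CPR's actual rules:

```python
# Hypothetical traffic-lighting of metric variances, in the spirit of CPR.
# The warn/alert thresholds below are illustrative assumptions only.

def traffic_light(actual, target, warn_pct=5.0, alert_pct=10.0):
    """Classify a metric's variance from target as green/yellow/red."""
    if target == 0:
        return "gray"  # no meaningful percentage variance can be computed
    variance_pct = (actual - target) / abs(target) * 100
    if variance_pct >= -warn_pct:
        return "green"   # at or near target
    if variance_pct >= -alert_pct:
        return "yellow"  # moderately below target
    return "red"         # significantly below target

print(traffic_light(actual=97, target=100))  # green (-3% variance)
print(traffic_light(actual=88, target=100))  # red (-12% variance)
```

In a real deployment the thresholds would typically be set per metric, since a 5% miss means very different things for, say, response times versus graduation rates.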
In comparison to some of the solutions that I have seen deployed, this one is pretty basic. That being said, the fact that the city has identified the performance measures that are critical for city services, and has changed processes to capture those metrics, is an important first step. To then "expose" this information to the general public in a live reporting system shows that they have committed to holding themselves accountable.
Based on reading the cryptic URLs while clicking through the different reports, it looks like CPR is using the Oracle Siebel Analytics tool. Just a guess.
Thursday, March 13, 2008
Two of these fans, who also happen to be business professors, have developed an analytical model using SAS® software to predict “at-large” teams – those schools who do not receive an automatic bid to the tournament.
Jay Coleman, an operations management professor at the University of North Florida in Jacksonville, and Allen Lynch, an economics professor at Mercer University in Macon, Georgia, built a model that has achieved an impressive 94% accuracy rate in predicting tournament teams.
The actual selections are made by the NCAA Tournament Selection Committee, and will be announced this weekend. Coleman and Lynch used historical results from this Committee, along with 42 pieces of information to build their model. Interestingly, they found that only 6 items are significant in determining whether a team gets an at-large bid:
1) RPI (Ratings Percentage Index) Rank
2) Conference RPI Rank
3) Number of wins against teams ranked from 1-25 in RPI
4) Difference in number of wins and losses in the conference
5) Difference in number of wins and losses against teams ranked 26-50 in RPI
6) Difference in number of wins and losses against teams ranked 51-100 in RPI
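To make the idea concrete, these six factors could feed a simple logistic scoring model. This is my own illustrative sketch, not Coleman and Lynch's actual DanceCard model; every weight and the intercept below are made-up assumptions for demonstration only:

```python
import math

# Illustrative sketch (NOT the actual DanceCard model or its coefficients):
# score a team on the six significant factors and convert the score to an
# at-large probability with a logistic function. All weights are made up.

WEIGHTS = {
    "rpi_rank": -0.08,            # lower (better) rank raises the score
    "conf_rpi_rank": -0.05,
    "wins_vs_top25": 0.60,
    "conf_win_loss_diff": 0.25,
    "diff_vs_26_50": 0.30,        # win-loss diff vs. RPI 26-50 teams
    "diff_vs_51_100": 0.15,       # win-loss diff vs. RPI 51-100 teams
}
INTERCEPT = 2.0

def at_large_probability(team):
    z = INTERCEPT + sum(WEIGHTS[k] * team[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# A hypothetical bubble team, loosely inspired by Ohio State's profile
bubble_team = {
    "rpi_rank": 48, "conf_rpi_rank": 5, "wins_vs_top25": 2,
    "conf_win_loss_diff": 1, "diff_vs_26_50": 0, "diff_vs_51_100": 3,
}
print(f"P(at-large bid) = {at_large_probability(bubble_team):.2f}")
```

A real model would estimate the weights from the Committee's historical selections rather than assert them, which is exactly the refinement work that has gone into DanceCard over the years.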
Here's a link to their website, and a 2-minute video about their model.
As they mention in the video, predictive models have many applications in the business world. These models can be difficult to build (the DanceCard model has been refined over 14 years) and validate (we don’t have the equivalent of a 10-member committee announcing their results live on CBS). But simplifications may exist (of 42 drivers in the DanceCard model, only 6 are significant) so don’t be afraid of the complexity.
Advances in analytical software, coupled with the increased availability of data, make predictive models a powerful tool to use in optimizing your business. And we have the benefit of a real-life market to test our ability to predict the future.
Today’s burning hoops question: Will Ohio State, who lost to Florida in last year’s championship game, even make it into this year’s field with a 19-12 record and an RPI of 48? DanceCard will have their final prediction later this week.
What’s your burning business question? Have you tried to build a predictive model to answer this question? How well did you do?
Wednesday, March 12, 2008
Top 5 reasons why many data warehouse managers fail to deliver successful data warehouse initiatives:
Data Quality: The quality of source system data to be integrated into the data warehouse is "overrated," and thus the time to resolve issues is "underestimated"
- Bad information in means bad information out. The CPM applications that source data from the warehouse will suffer diminishing adoption if quality is not addressed upstream
- The data integration strategy must include a methodology for addressing erroneous data
- Resolving these challenges (both deciding on and executing fixes) requires a significant level of involvement from business and IT
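One simple form such a methodology can take is rule-based screening: validate incoming rows and quarantine failures for business/IT review rather than letting them reach the warehouse. A minimal sketch, with field names and rules that are purely illustrative assumptions:

```python
# Minimal sketch of upstream data-quality screening. Field names and
# rules are illustrative assumptions, not any particular system's schema.

RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,
    "region": lambda v: v in {"NORTH", "SOUTH", "EAST", "WEST"},
}

def screen(rows):
    """Split rows into clean ones and quarantined (row, failed-rules) pairs."""
    clean, quarantined = [], []
    for row in rows:
        failures = [f for f, ok in RULES.items() if not ok(row.get(f))]
        if failures:
            quarantined.append((row, failures))
        else:
            clean.append(row)
    return clean, quarantined

rows = [
    {"customer_id": "C1", "order_total": 120.0, "region": "NORTH"},
    {"customer_id": "",   "order_total": -5.0,  "region": "MARS"},
]
good, bad = screen(rows)
print(len(good), "clean row(s);", len(bad), "quarantined:", bad[0][1])
```

The point is less the code than the process around it: someone on the business side has to own the quarantine queue and decide how each class of error gets fixed at the source.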
Data Integration: Lack of robust data integration design results in incomplete and erroneous data and unacceptable load times
- What happens when you are in the process of loading data and you start receiving exceptions to what is expected? Is data rejected, leaving you with the dilemma of partial data loads? How do you avoid manual intervention?
- What checks and balances do you have in place that ensure what you are extracting from source systems is being populated into the target? Can you audit your data movement processes to ensure completeness as well as satisfy regulatory obligations?
- Your processes can handle the data volumes you are dealing with today but can they handle the data volumes of tomorrow? How easy is it to reuse existing processes when adding additional source systems/subject areas to your Warehouse?
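The "checks and balances" question above is often answered with per-batch load auditing: record source-side counts and control totals, compare them to what landed in the target, and flag any imbalance before sign-off. A minimal sketch, where the batch fields and tolerance are my own illustrative assumptions:

```python
# Sketch of per-batch load reconciliation for a warehouse ETL process.
# Field names and the 0.01 tolerance are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LoadAudit:
    batch_id: str
    source_rows: int
    target_rows: int
    source_amount_sum: float   # control total computed at extraction
    target_amount_sum: float   # same total recomputed after load

    @property
    def balanced(self):
        return (self.source_rows == self.target_rows
                and abs(self.source_amount_sum - self.target_amount_sum) < 0.01)

audit = LoadAudit("2008-03-12", source_rows=10_000, target_rows=9_998,
                  source_amount_sum=1_250_000.00, target_amount_sum=1_249_700.00)
if not audit.balanced:
    print(f"Batch {audit.batch_id}: {audit.source_rows - audit.target_rows} "
          f"row(s) missing; investigate rejects before sign-off")
```

Persisting these audit records per batch also gives you the paper trail that regulatory obligations typically require.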
Data Architecture: Creating a solution that is not able to scale after an initial success will result in a redesign of the architecture
- After the first success, the business will quickly want to extend usage of the solution to a greater number of users; will performance continue to live up to expectations?
- As users mature and adoption improves, so will the complexity of information usage (i.e., more advanced queries); can the design continue to perform as expected?
- Increased usage and maturity result in demand to integrate additional data sources/subject areas into the solution. Is the architecture easily extensible?
Data Governance & Stewardship: With no controls established around data usage, its management, and adherence to definitions, data silos and erroneous reporting begin to reappear
- Stakeholders must be identified and given decision rights to help improve the quality and accuracy of your common data
- Practices for managing standard definitions of common data, and the business rules applied to it, must be established
- Understand who is responsible for the data and hold them accountable
Change Management: Failing to prepare the organization to use what is being built means the data warehouse investment is never fully realized, and the initiative is deemed a failure due to low user adoption
- “Build it and they will come”; providing information access does not necessarily equate to information usage.
- Helping the business understand how it can leverage this newly available data often results in changes to the way people work. "Day in the life of" today vs. "day in the life of" tomorrow
- Education and training programs are required
- Integrated project teams (business and IT) are essential to the success of data warehouse initiatives, with individuals becoming champions within the organization for change and adoption
When a community sets out to address complex problems such as … the effort usually ends up going nowhere.
The quiet failure of such initiatives is often attributed to human nature, or to some flaw in the process that shaped the effort. But in fact, the problem usually starts when the project organizers compose their first list of proposed participants. The organizers ask themselves: Who are the power brokers around town? Who are the key players? …
Thereafter, the whole effort will operate on the unspoken presumption that influence derives primarily from positional power, and that positional power translates into the ability to get things done.
That assumption is as naive as the belief that a company’s organization chart — with its boxes and circles, its dotted lines showing who reports to whom — provides an accurate picture of how the organization actually works. Like org charts, “most powerful” lists reveal nothing about the human qualities of those who occupy senior positions, the web of personal relationships upon which they can draw, or the trust they inspire (or don’t inspire) in other people.
She goes on to share both her understanding of, and experiences with, how networks permit and accelerate the flow of information, and her approach to identifying people who have the capacity for “fruitful collaboration”. These are people who transcend workplace silos and collaborate freely across traditional community boundaries to get things done.
The glory of the whole thing is that these most likely unheard-of people are the ones who can bring together the skills, knowledge, and approach to operate without hesitation across differing hierarchies or cultures.
My question is: can we transfer these ideas inside the walls of the traditional organization? Can E2.0 be a catalyst for bringing these folks to the fore of our organizations? I think this article, read alongside Gary Hamel’s chapter on W.L. Gore [The Future of Management, chapter 5], should be required reading for E2.0 practitioners. It is a challenge to be achieved. Let me know your thoughts.
If it’s not a dialog, why waste the time?
Tuesday, March 11, 2008
I am pleased to announce that Jevon MacDonald and Thomas Vander Wal will be leading a pre-conference clinic session on Enterprise 2.0 at DIG 2008. Jevon and Thomas bring both depth and practical knowledge in the area of enterprise social software.
"Enterprise 2.0 Jumpstart", as Jevon and Thomas have aptly named the clinic, will focus on the effect of Web 2.0 technologies inside today's established organization. They will introduce core E2.0 concepts, practical examples, and first steps for implementing Enterprise 2.0 software inside your organization. The session is scheduled to run three to four hours, which should give attendees enough time to dig deep into E2.0 content with Jevon and Thomas.
I would also recommend taking a look at Jevon's FastForward blog. I am a follower and have it listed in my personal blog roll.