Saturday, March 8, 2008

Importance of having accurate data to drive decisions

I recently read an article in the Boston Globe discussing the unexpected rise in costs related to the state's universal healthcare plan. The program requires that every state resident have healthcare coverage (a soon-to-be national topic, depending on the outcome of the presidential elections). The article highlights that the anticipated costs of the program could double and that the state has not budgeted for the increase. The primary driver of the increased budget is that the state underestimated the number of residents who do not have healthcare coverage. The legislature had two numbers available to drive the budget model: the state's own estimate of 460,000 and the US Census Bureau's estimate of 748,000. Unfortunately, the state used a number somewhere in between and is now realizing that its budget will fall short.

Beyond the political nature of the article, what I find interesting about this situation is that it clearly shows the importance of having accurate data when making a decision. Put more precisely, the state had already made the decision to provide universal healthcare for every resident, but because of the poor quality of data, how it allocates resources (i.e., money) is being significantly impacted.

[Chart: Projected Enrollment by Year]
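To make the data-quality point concrete, here is a back-of-the-envelope sketch of the budget sensitivity. The two enrollment estimates are the ones cited in the article; the per-enrollee cost is a purely hypothetical placeholder, since the article does not provide one.

```python
# Back-of-the-envelope sensitivity: how the budget swings with the enrollment estimate.
# The two enrollment figures come from the article; the per-enrollee subsidy cost is
# a made-up placeholder used only to illustrate the arithmetic.

ASSUMED_ANNUAL_COST_PER_ENROLLEE = 3_500  # hypothetical dollars per subsidized resident

estimates = {
    "State estimate": 460_000,
    "US Census Bureau estimate": 748_000,
}

for source, uninsured in estimates.items():
    projected_cost = uninsured * ASSUMED_ANNUAL_COST_PER_ENROLLEE
    print(f"{source}: {uninsured:,} uninsured -> ${projected_cost:,} per year")

gap = (estimates["US Census Bureau estimate"] - estimates["State estimate"]) \
      * ASSUMED_ANNUAL_COST_PER_ENROLLEE
print(f"Gap between the two sources: ${gap:,} per year")
```

Whatever the real per-enrollee figure is, the gap between the two source numbers translates directly into a budget shortfall of the same proportion.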

Friday, March 7, 2008

Mass Collaboration meets the Experts

Yesterday, Newsweek posted a web exclusive, Revenge of the Experts, delineating and heightening the debate over the value of user-generated content. The subtitle reads:

The individual user has been king on the Internet, but the pendulum seems to be swinging back toward edited information vetted by professionals.

The article goes on to challenge the exclusive role of mass collaboration and begins to define the growing role of experts in generating trusted web content.

My gut is that we need both, and we should begin to think through better ways to integrate the two. To start, however, we need to look at the PURPOSE of the content we are looking to generate and evaluate the factors that would produce the best outcome – which always varies along the dimensions of “free, perfect, and now”. In each situational context, we can make decisions based on a common set of factors that can be weighed and traded off as we consider the options available for producing the best outcome within the constraints in front of us.

For instance, if I need to quickly understand the stats behind the Patriots' recent championship seasons, it might make a lot of sense to ask the web, where I would probably get an accurate, fast, and cheap response. On the other hand, if I am looking to understand the genuine subtleties of the relationship between Iran and the U.S., it would behoove me to engage some educated perspectives on history, politics, religion, and culture to weave an intricate but delicate understanding.

This leaves us with the need to define the factors for evaluating content and/or the process of its development. My guess is that there is plenty of existing thought on this topic. For instance, the framework would need to consider facts and their accuracy, opinion and the diversity of perspective, and context and the narrative of the times or the specific situation. Once we identify these factors, we can begin an educated dialog on how to balance and integrate mass collaboration with professional expertise and focus – possibly for both the content and the process of generating the content! Who knows, we might be able to have a 1+1=3 result for both Web 2.0 and E2.0 applications.

What are your thoughts? We would value your comments on this topic and any references that you might be willing to share.

Thursday, March 6, 2008

Does your business have a canary?

Ronald J. Baker, in his book “Measure What Matters to Customers,” draws a parallel between leading indicators and the canaries used by coal miners to alert them to the presence of noxious gases. If the amount of carbon monoxide reached a heightened level in an underground cavern, the canaries would stop chirping, have trouble breathing, and in some instances even die. This early-warning system gave the miners the time they needed to evacuate the mine.

Baker suggests that, in addition to lagging measures (which follow changes in a business cycle) and coincident measures (which run in sync with a business cycle), firms should identify leading measures which anticipate changes in a business cycle.

Leading measures are harder to identify – whether financial or operational. Traditional reporting paradigms (P&L, Balance Sheet, and Statement of Cash Flows) focus primarily on lagging, financial measures of business performance. Identification of leading indicators requires development of a “theory of the business” in order to find those measures that are correlated with desired performance but can be measured prior to the results rather than afterwards.
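To give a rough sense of what that correlation search might look like in practice, here is a minimal sketch. The indicator and outcome series are invented placeholders; in a real analysis you would substitute your own candidate measure and performance history.

```python
# A minimal sketch of vetting a candidate leading indicator: shift it forward by
# various lags and see which lead time correlates best with the outcome measure.
# Both series below are invented placeholders, not real AIA or construction data.
import numpy as np

indicator = np.array([102, 98, 95, 97, 90, 85, 88, 80, 78, 75, 72, 70], dtype=float)   # e.g. monthly billings index
outcome = np.array([210, 215, 212, 205, 200, 198, 190, 182, 178, 170, 165, 158], dtype=float)  # e.g. project starts

best_lag, best_corr = None, 0.0
for lag in range(1, 7):  # test leads of one to six months
    # Pair the indicator at month t with the outcome at month t + lag.
    r = np.corrcoef(indicator[:-lag], outcome[lag:])[0, 1]
    print(f"lead of {lag} month(s): correlation = {r:+.2f}")
    if abs(r) > abs(best_corr):
        best_lag, best_corr = lag, r

print(f"Strongest relationship at a {best_lag}-month lead (r = {best_corr:+.2f})")
```

The "theory of the business" comes first; a correlation like this only confirms (or challenges) the causal story you already believe about what drives future performance.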

Earlier this week, in an article in the Wall Street Journal about the slowdown in the construction industry, it was reported that “the American Institute of Architects’ monthly index of billings at architectural firms was down 14% in January from its peak in July. That means fewer construction projects will start this year, said AIA Chief Economist Kermit Baker.” In this example, the index of billings plays the role of the canary – an early-warning indicator of future negative performance. The challenge is: what do you do with this information? Can the AIA use this data to make operational adjustments that will increase billings and therefore increase future construction projects?

Does your business have a canary? What leading indicators do you use to predict future performance? How did you develop those indicators?

***
Mark Lorence bio – I’m a Director in the Strategy Practice at Palladium with 18 years of consulting experience in large-scale systems implementation, planning & budgeting solutions, and Balanced Scorecard implementations. My current areas of focus are incorporating analytics into traditional Balanced Scorecard projects, integrating strategic planning and business planning processes, and augmenting these solutions with next-generation dashboards. I am a lifelong Pittsburgh Steelers fan now living in Boston and admit to a small amount of childish satisfaction with the results of this year’s Super Bowl.

Building an enterprise semantic layer

I recently read a blog post on FastForward by Paula Thornton mentioning a Reuters technology infrastructure called Calais. The purpose of Calais, to put it in simple terms, is to provide a service that automatically adds context to unstructured data. The unstructured data could be in the form of news articles, blog postings, or any other text-based content. The Calais web service identifies the entities, facts, and events described in the text's natural language and returns a descriptive model of the unstructured data.
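As a rough illustration of what such a service produces, here is a minimal sketch using the open-source spaCy library as a stand-in for the Calais web service (whose actual request and response format is not covered here).

```python
# A rough illustration of Calais-style entity extraction, using the open-source
# spaCy library as a stand-in for the actual Calais web service. Assumes spaCy and
# its small English model ("en_core_web_sm") are installed.
import spacy

nlp = spacy.load("en_core_web_sm")

text = (
    "Reuters launched the Calais service to help developers tag the companies, "
    "people, and places mentioned in blog posts, news articles, and other content."
)

doc = nlp(text)
for ent in doc.ents:
    # Each recognized entity becomes a piece of metadata describing the raw text.
    print(f"{ent.text:<12} -> {ent.label_}")
```

The output is exactly the kind of descriptive model discussed above: the raw text goes in, and a set of typed entities comes out that can be stored, searched, and linked.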

The reason this is important, or at least is generating excitement, is that many believe Web 3.0 will be based on this kind of machine inference over content. Web 2.0, by contrast, depends heavily on many individuals applying context themselves through concepts like tagging, social sharing, and broader collaboration.

Now, most of the excitement and focus is on the user base that is currently driving the Web 2.0 movement. As someone who has done most of my professional work inside the four walls of the enterprise, in areas such as data architecture, data integration, business intelligence, and corporate performance management, I see incredible opportunity in something like Calais. Much of the effort associated with building internal measurement systems like dashboards and management reporting applications goes into developing a single semantic layer of metadata. Beyond the effort required to develop it, that semantic layer is typically inconsistent because the organization cannot agree on a common business taxonomy that describes the enterprise.

When we think about bringing Web 2.0 technologies (social networks, wikis, prediction markets, blogs) to the enterprise, the critical first step is building out a metadata layer that puts descriptors on data to create information and begins to establish context. There is plenty of unstructured data floating around organizations in the form of documents, internal web sites, email communications, and traditional knowledge centers. In addition, there is an inordinate amount of structured data which, I would argue, has very little value when it lacks context; understanding it requires speaking to an info-worker who can explain the report, dashboard, or data set. These sets of structured data typically live trapped within the silos of different functions such as finance, marketing, sales, and operations. If you look at the collective structured and unstructured data/information that an organization captures across these disparate groups and functions, being able to "infer" the entities, facts, and events will start to build the context needed to make better-informed decisions. Add the ability to link people through the organization's social network to share and disseminate information, and you start to see the value of implementing a Web 2.0 platform across the enterprise.
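To sketch how inferred entities could begin to bridge those silos, here is a toy example. The customer table and documents are invented placeholders; a real implementation would sit on top of an actual extraction service and the organization's master data.

```python
# A minimal sketch of linking unstructured content to structured records: entities
# inferred from text are matched against a structured table so that both sides
# share a common context. All data below are invented placeholders.

# Structured data trapped in a functional silo (e.g., a finance system).
revenue_by_customer = {"Acme Corp": 1_200_000, "Globex": 850_000}

# Entities that a Calais-style extraction service might pull out of unstructured
# content such as emails, meeting notes, or internal wiki pages.
documents = [
    {"doc": "Q1 account review notes", "entities": ["Acme Corp", "Boston"]},
    {"doc": "Sales email thread",      "entities": ["Globex", "renewal risk"]},
]

# Link the two: each document now carries context from the structured world.
for item in documents:
    linked = {e: revenue_by_customer[e] for e in item["entities"] if e in revenue_by_customer}
    print(f'{item["doc"]}: linked customers -> {linked}')
```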

Wednesday, March 5, 2008

Getting Value from the Masses

This week, I noticed a blog post by Scott Gavin [Enterprise 2.o Evangelist] sharing news of the recent release of Ubuntu Brainstorm.

Ubuntu, the user friendly Linux distribution, launched Ubuntu Brainstorm this week. Inspired by IdeaStorm from Dell, the Ubuntu community can now suggest ideas and vote online. Its goal is to have a better idea of what Ubuntu users would like to see in upcoming Ubuntu releases.

As a user you can add your ideas or vote for your preferred ones, add comments and see their implementation status. The best and most popular ideas quickly rise to the top and can be creamed off for inclusion in future releases.

Upon reading the post, my mind immediately jumped to some unresolved questions about the ultimate utility of these online sites for engaging customers in co-creative efforts. With this question in mind, I jumped online to track down an article I had only recently stumbled upon by Wired author Ryan Singel. In his article, “The Wiki That Edited Me” (Wired, Sept. 7, 2006), Singel recapped his experience of putting one of his draft stories out on a wiki for mass contributions and edits.

As to the outcome, he says, “Certainly the final story is more accurate and more representative of how wikis are used.”

He asks rhetorically, “Is the story better than the one that would have emerged after a Wired News editor worked with it?”

“I think not.”

“The edits over the week lack some of the narrative flow that a Wired News piece usually contains…”

This outcome was my exact expectation. I didn’t expect mass collaboration to provide a narrative flow fitting for Wired! I did expect it [mass collaboration] to deliver accuracy and perspective to the topic – as it did.

Our challenge is to figure out how to bring “focus” and “narrative” to the value-added benefits of mass collaboration. I realize that this language may be too traditional, but it frames the challenge.

What are your thoughts? How do corporations get optimal value from E2.o mass collaboration?

Tuesday, March 4, 2008

Welcome to the Data Theme

How often do you sit back and actually think about how much data we generate and use on a day-to-day basis? Whether we are in the office sending an email to a colleague or updating a financial spreadsheet for the CFO, we are using and generating data. Do you ever consider that when you simply stop on the way home to fill up your car with gas, or scroll through the TV guide looking for your favorite show to TiVo, you are generating and using data?

With advances in processor speeds, data storage capabilities, and application technologies, our capacity to generate and capture data is forever increasing. Many organizations are leveraging this data, turning it into usable, sustainable information that can be treated as an asset to help gain competitive advantage. The majority of organizations, however, are simply overwhelmed, unsure of what to do or where to start.

Over the next couple of months, I welcome you to join me as we explore the theme of Data. We will discuss a variety of topics that relate to the challenges faced by organizations working to develop an "information architecture for the 21st century". Not only will we cover the traditional, well-trodden topics such as "Data Strategy", "Data Quality", and "Data Integration", but also others that often determine the success of organizational initiatives yet suffer from lack of prioritization or simple underestimation; these topics may include "Data Governance", "Master & Metadata Management", and "Managing Unstructured vs. Structured Data", to name a few. Of course, if there is a topic of interest that you wish to discuss, I fully encourage the suggestion.

***

Glyn D. Heatley bio - I'm a Director and Leader in the Information Strategy and Architecture Practice at The Palladium Group. I bring over 13 years of experience in delivering large-scale Corporate Performance Management solutions with a primary focus on Data Management and Business Intelligence. Over the years I've gained experience in all areas of the Data Warehouse Life Cycle, including Requirements Gathering, Solutions Architecture, Data Architecture, Data Integration, Business Intelligence, and Project Management.

Sunday, March 2, 2008

E2.o enables Business Design Innovation

This is my inaugural post as the facilitator of both the E2.o theme of the TalkDIG blog and the upcoming DIG conference in May 2008. I must say it is such an exciting time to be talking about information and decision-making. The plethora of technology available to us is unprecedented. And the opportunity it opens up for new management thinking is astounding.

Revolution?
Simply put, I think that we have the makings of a management revolution! And I side with the writings of gurus/consultants Eric Beinhocker, Lowell Bryan, and Gary Hamel, who each tout (from my vantage point) the possibilities available to those organizations that embrace a more evolutionary approach to business markets and a more "social" approach to management (my words).

The E2.o theme of this blog will be all about exposing and sharing the business information available to our companies in such a way that more eyeballs, and therefore smarts, can be harnessed toward innovating products, innovating customer experiences, and ultimately innovating all aspects of business design. The future is one of co-creation with people inside and outside our organizations - and we have the ability to encourage this trend using the information and Web 2.o tools currently available to us.

Prepare those Minds
Remember the saying by Louis Pasteur: “In the field of observation, chance favors the prepared mind.” Well, if we believe that the business climate is chaotic, then we had better make sure that our people have prepared minds to detect and respond to their environment. I will go so far as to suggest that we pull out the texts of Stafford Beer on cybernetics, despite the US nationalistic taboos that he might incite. I think he was on to a new paradigm of management many years ago. But enough said; I’m excited to be sharing my thoughts on E2.o, and I hope that many people with more smarts and experience than I have will join the dialog so that we can all grow and learn as a community.

***
George Veth bio - I have been consulting for large companies in the domains of Business Intelligence, Corporate Performance Management, and Strategy Execution for 15 years. Over a year ago, I left my consulting post and have just recently joined a startup, BigTreetop, which is looking to spur on Experience Co-creation in small and medium-sized businesses. The BigTreetop (Web 2.o) platform is designed to enable your favorite local businesses to share their plans and questions with their community, leveraging the experiences and insights of their customers and partners to continuously evolve with the needs of the marketplace. We’ll see!

Welcome to the Analytics Theme

How do you make important decisions? Do you trust your gut - or crunch the numbers? Flip a coin - or build a spreadsheet? Ask your spouse - or ask your SPSS?

Does your company make decisions the same way? Decision-making is becoming a key management competency driven by globalization, complexity, and risk. Should we be making these bigger, harder, riskier decisions the same way we've decided things in the past?

In the Analytics Theme at DIG we're going to discuss how decision-making can be improved by developing performance models and applying different analytical techniques to those models.

These techniques - decision trees, probability and statistics, simulation, regression, and optimization - may be ideas you vaguely remember from your Management Science 101 class, or they may be things you and your company are doing on a daily basis. Either way, we want to talk about them.

Once limited to the "quant jocks" with their cumbersome analytical software packages, these techniques are now widely available thanks to advances in software tools and the increased availability of data. And they're being used in some fascinating ways.

We'll hear some of these stories from our speakers and clinicians, but we're hoping to hear the best ones from you. How are you using the tools? How are you applying the techniques? And how are you improving decision-making through analytics?

To get started on this section of the blog, check out CNN's coverage of the Texas and Ohio primaries this Tuesday. I've been watching the campaigns with interest and have been fascinated with CNN's "Delegate Counting Map." They have the typical color-coded states (or counties, depending on the view) showing the results, but are able to run numerous simulations of future scenarios just by tapping a few icons..."If Obama wins the remaining states 55-45, here's what the delegate-count will look like heading into the Pennsylvania primary..."
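The underlying arithmetic is simple scenario math, which is part of what makes it so compelling on screen. Here is a toy sketch of the same idea; the delegate counts are invented placeholders, not the actual 2008 totals.

```python
# A toy version of the "what if" delegate math: split the remaining delegates by an
# assumed vote share and add them to each candidate's running total. All numbers
# here are invented placeholders, not the actual 2008 delegate counts.

current = {"Obama": 1450, "Clinton": 1380}   # hypothetical current delegate totals
remaining = 600                              # hypothetical delegates still to be awarded

def project(current_totals, remaining_delegates, obama_share):
    """Allocate the remaining delegates proportionally under an assumed vote split."""
    obama_gain = round(remaining_delegates * obama_share)
    return {
        "Obama": current_totals["Obama"] + obama_gain,
        "Clinton": current_totals["Clinton"] + (remaining_delegates - obama_gain),
    }

for split in (0.50, 0.55, 0.60):
    print(f"If Obama wins the remaining delegates {split:.0%}-{1 - split:.0%}: "
          f"{project(current, remaining, split)}")
```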

An intuitive user interface, lots of good data, and the ability to quickly run simulations - that's a powerful analytical environment. Wouldn't it be great to apply the same ideas to your monthly reporting environment?

Charles Fishman from Fast Company to kick off DIG

I am pleased to announce that Charles Fishman, Fast Company magazine writer and best-selling author of "The Wal-Mart Effect", will be a keynote speaker at the DIG 2008 conference in Las Vegas, Nevada. Charles will open the conference on day 1 and discuss what makes an organization a "Fast Company".

Mr. Fishman has written many investigative articles and publications on organizations ranging from NASA to Wal-Mart. In his best-selling book "The Wal-Mart Effect", Charles uncovers the enormity of Wal-Mart and the impact it has on consumers, on the towns and cities where it opens a store, and on the businesses that manufacture the products that stock its shelves. Some interesting statistics from the book on the sheer size of Wal-Mart and its impact on the economy:
  • $23,212 per minute: profits at Wal-Mart, every minute of every day
  • 127 million: number of Americans who shop at Wal-Mart each week
  • 500: typical number of jobs created by a new Wal-Mart Supercenter
  • 450: typical number of retail-related jobs eliminated in a community 5 years after a new Wal-Mart opens a Supercenter


Charles has been recognized numerous times for his writing, including a Gerald Loeb Award for distinguished business journalism and best magazine story of 2004 from the New York Press Club. Charles received a BA from Harvard University, has written for the Washington Post, and has served as editor of the Orlando Sentinel's Sunday magazine and as assistant managing editor at the News & Observer in Raleigh, North Carolina.

Welcome, Charles, and we look forward to seeing you in Las Vegas.