Saturday, March 29, 2008

Using E2.o in Business Intelligence or Business Intelligence in E2.o?

I recently did a Google search to see if I could find anyone who has written on Business Intelligence and Enterprise 2.0. I came across a series of articles written by Colin White on the b-eye Network on just such a topic. I commend Mr. White for doing the difficult job of building out a framework for E2.0 and BI; I now have the advantage of commenting and building upon what he has written. I hope my comments in this post will not be taken as disparaging his work in any way. The article stirred up enough of a reaction that I felt compelled to respond, hopefully with semi-coherent thoughts. As I said, I have the easy job of writing the blog post!

The article starts out stating “This series of articles examines the use of Enterprise 2.0 in business intelligence (BI)”. This may be semantics, but I see the need to make the statement in reverse, “the use of Business Intelligence in Enterprise 2.0”. When I envision an Enterprise 2.0 platform, I see business intelligence as a plug-in to the E2.0 platform. It is analogous to how Facebook works. The Facebook platform provides the foundation for the social network that exists between people. Applications are then distributed and shared across the “social graph” as Facebook calls it. Relate this back to E2.0 and BI. Business Intelligence “objects”, such as a report or metric, can be distributed and shared in the same fashion. This is why I see the statement as the use of business intelligence in Enterprise 2.0.

Mr. White continues on to discuss the current challenge that BI deployments face, which is user adoption. He points out that this is driven by two primary issues: the complexity of the BI tools and the ability of business users to understand the data. I agree that these two issues are #1 and #1A in why BI applications fail. So how can E2.0 address them? First, Enterprise 2.0 technologies (wikis, blogs and social networks) are inherently easier to understand and use, because the tools are built around driving collaboration and discussion. Having a conversation is easy, assuming the two people talking speak the same language; Enterprise 2.0 simply puts that dialog into a massively collaborative, many-to-many environment. Where E2.0 can really start to drive adoption of BI is around the second issue, understanding data. By providing a platform that allows users to give data context and meaning, you start to close the understanding gap. Simply viewing a report full of numbers, without context, will certainly cause confusion. What better way to provide that context than a collaborative environment where users can add commentary and textual descriptions?
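To make that idea a bit more tangible, here is a minimal sketch in Python (the class and field names are entirely hypothetical, not any vendor's API) of a report treated as an object that carries its own collaborative context, so commentary travels with the data instead of living in email threads:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class Comment:
    """A piece of user-supplied context attached to a BI object."""
    author: str
    text: str
    posted_at: datetime = field(default_factory=datetime.utcnow)


@dataclass
class ReportObject:
    """A shareable BI "object" (report, metric, KPI) plus its social context."""
    name: str
    definition: str                          # the business definition of the metric
    comments: List[Comment] = field(default_factory=list)
    tags: List[str] = field(default_factory=list)

    def annotate(self, author: str, text: str) -> None:
        """Let any user add context that future viewers will see alongside the data."""
        self.comments.append(Comment(author=author, text=text))


# Usage: the report travels with its explanation, not just its numbers.
revenue = ReportObject(name="Q1 Revenue by Region",
                       definition="Recognized revenue, GAAP, by sales region")
revenue.annotate("jsmith", "EMEA dip reflects the fiscal-calendar change, not lost deals.")
for c in revenue.comments:
    print(f"{c.author}: {c.text}")
```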

Another concept that Enterprise 2.0 can bring to business intelligence is the “Amazon Effect”. For example: users who viewed this report also viewed these other reports. Or: users rated this metric 4 out of 5 for its value in understanding performance, and here is why. These may be silly examples, but the concept of tapping into the collective intelligence of an organization to drive better decision making can have a powerful impact.
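For the curious, the mechanics behind “users who viewed this report also viewed…” can be surprisingly simple. Here is a rough sketch in Python using a made-up view log and plain co-occurrence counting; real recommendation engines are far more sophisticated, but the collective-intelligence idea is the same:

```python
from collections import Counter

# Hypothetical view log: which reports each user has opened (invented for illustration).
view_log = {
    "alice": {"Revenue by Region", "Pipeline Summary", "Churn Dashboard"},
    "bob":   {"Revenue by Region", "Pipeline Summary"},
    "carol": {"Revenue by Region", "Churn Dashboard"},
    "dave":  {"Pipeline Summary", "Headcount Report"},
}

def also_viewed(target_report, log, top_n=3):
    """Rank other reports by how often they co-occur with the target in users' histories."""
    co_counts = Counter()
    for reports in log.values():
        if target_report in reports:
            for other in reports - {target_report}:
                co_counts[other] += 1
    return co_counts.most_common(top_n)

# "Users who viewed 'Revenue by Region' also viewed..."
print(also_viewed("Revenue by Region", view_log))
# Pipeline Summary and Churn Dashboard each co-occur twice (tie order may vary).
```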

A challenge I foresee is how to link the E2.0 world of unstructured data with the BI world of structured data. This challenge is already being addressed in a couple of different ways. Niche software vendors are providing information access and search platforms that integrate structured and unstructured data; Endeca, Attivio, Fast Search & Transfer and Northern Light, to name a few, all provide technology platforms that address this area. Glyn Heatley, a TalkDIG contributor, happened to publish a post on the topic of structured and unstructured data this morning. I also did a post a couple of weeks ago on building out an enterprise semantic layer through interpretive technology that can deduce objects and their relationships from the natural language inside text.
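One simple way to picture that bridge is entity extraction: scan the unstructured text for mentions of things that already live as dimensions in the warehouse (customers, products, regions) and record the links. The sketch below is a deliberately naive illustration of the idea, with a hypothetical in-memory customer dimension; the platforms named above obviously apply far more sophisticated linguistics:

```python
import re

# Hypothetical customer dimension pulled from the warehouse: surrogate key -> name.
customer_dim = {101: "Acme Corp", 102: "Globex", 103: "Initech"}

def link_text_to_dimension(text, dimension):
    """Return the dimension keys whose names are mentioned in a piece of unstructured text."""
    hits = []
    for key, name in dimension.items():
        if re.search(r"\b" + re.escape(name) + r"\b", text, flags=re.IGNORECASE):
            hits.append(key)
    return hits

email_body = ("Spoke with Acme Corp about the renewal; they raised the same "
              "pricing concern Initech did last quarter.")

# The unstructured email is now joinable to structured revenue facts via customer keys.
print(link_text_to_dimension(email_body, customer_dim))  # [101, 103]
```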

On the topic of structured and unstructured data, the article discusses issues associated with accuracy, stating that “Information is never 100 percent accurate”. For business intelligence applications and deployments, success will be driven by the overall trust in the data being presented. Structured data such as revenue will need to be accurate. In an E2.0 environment, different challenges will exist. Since there is no governance in place for the unstructured data, focus will need to be on the validity associated with the context being provided.

The article concludes with Mr. White’s seven components of Enterprise 2.0 for Business Intelligence: information collaboration, exploration and analysis, integration, syndication and delivery, a rich user interface, a web-oriented architecture and adopting open source solutions. Mr. White will discuss each of these components in a separate article and has already completed two additional installments in the series.

Finally, the article discusses how the BI vendors are addressing the adoption of E2.0. Mr. White points out that the major BI vendors have been slow to consider and adopt E2.0 technologies into their respective platforms. In my opinion, this is a good thing, and let me explain. If you go back to how I started this post, I made the statement that I believe BI is an application service that plugs into an E2.0 platform. If you agree with that philosophy, then the BI vendors need to be looking at the Enterprise 2.0 platform providers. The issue to date is that there are very few standards that would enable plug-and-play of applications into an E2.0 platform, although OpenSocial is starting to gain some momentum. But look again at what has made Facebook wildly popular and successful: the recent decision to open up the platform to outside developers to build applications that run inside Facebook. This is what is needed for E2.0 to gain ground within the enterprise and for business intelligence to finally meet its user adoption expectations.

So what are your thoughts on the topic? I have in no way answered all the questions. If anything, I have probably only left more questions to be answered. Hopefully that is the case and we can have a dialog on the topic.

Thursday, March 27, 2008

What’s all the hype around unstructured data?

Check out the DM Review article by Michael Gonzales, Comprehensive Insight: Structured and Unstructured Analysis, for an introduction to the topic of unstructured data and releasing its potential.

Michael provides some insight into the challenges organizations face in dealing with unstructured versus structured data and how technology has been evolving to help better leverage such information assets.

With an estimate that “more than 85 percent of all business information exists as unstructured data,” it is no wonder that technology vendors are putting more focus on how to extend their products to make this information more accessible and usable.

Although the article gives some interesting insight into the evolution of the technologies, it doesn’t provide any insight into how to actually integrate and store this data in the traditional data warehouse. How does one integrate such unstructured data in the form of documents, images, video content, and other multimedia formats? Is such data actually relevant to data warehouses and CPM processes? Perhaps not the actual content, but perhaps the metadata associated with the content (e.g., x number of document types, average occurrence of y in videos of type z, number of emails on subject w, etc.).
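If it is the metadata rather than the raw content that belongs in the warehouse, the load step can be quite ordinary: reduce the unstructured items to counts and averages keyed by attributes you already track. A rough sketch along those lines, using a made-up content inventory in place of a real document repository:

```python
from collections import Counter

# Hypothetical inventory of unstructured content, as it might arrive from a crawler.
content_items = [
    {"type": "email",    "subject": "pricing",  "size_kb": 12},
    {"type": "email",    "subject": "pricing",  "size_kb": 9},
    {"type": "email",    "subject": "renewal",  "size_kb": 7},
    {"type": "document", "subject": "contract", "size_kb": 240},
    {"type": "video",    "subject": "training", "size_kb": 51200},
]

# Metadata facts suitable for a warehouse: counts by content type, emails by subject.
count_by_type = Counter(item["type"] for item in content_items)
emails_by_subject = Counter(item["subject"] for item in content_items
                            if item["type"] == "email")

print(dict(count_by_type))      # {'email': 3, 'document': 1, 'video': 1}
print(dict(emails_by_subject))  # {'pricing': 2, 'renewal': 1}
```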

Vendors that are beginning to address the storage and integration of such unstructured data into existing solutions are primarily the large database vendors. Certainly “Big Blue” (IBM) boasts support for analysis of unstructured data with its DB2 Warehouse 9.5 product offering, and Microsoft SQL Server 2008 is touted by Microsoft to “provide a flexible solution for storing and searching unstructured data”.

Although the advances in vendor technologies are providing a means of storing such information in a manner that makes it accessible, is the typical organization ready to focus its resources on doing so? When so many have yet to fully realize the benefits of provisioning traditional structured data (e.g., financial, operational, customer) to the business, you have to ask whether this should be a high priority yet.

What is your organization doing? Have you implemented any creative solutions? What are the demands from the business?

And now a word from our sponsors....

I am happy to announce that Microsoft, Oracle and Business Objects/SAP are Platinum Sponsors for the Decision, Information and Governance conference. We are excited to have all three organizations as sponsors. Each vendor provides a technology platform that addresses the three themes of the conference: creating one version of the truth, insights from advanced analytics, and Enterprise 2.0. As part of the conference agenda, Palladium will facilitate a "Future Visions Technology Panel" with all three vendors. The panel of vendor experts will address questions and opportunities associated with technology market consolidation and share their vision for Enterprise 2.0.



Founded in 1975, Microsoft® is the worldwide leader in software, services and solutions that help people and businesses realize their full potential. Microsoft Business Intelligence is a complete, fully-integrated offering, enabling all decision-makers to drive increased business performance at strategic, tactical and operational levels. Microsoft BI enhances the performance of all employees by providing reliable access to information they need to make informed decisions, and respond appropriately to changing conditions that impact their business. With Microsoft Business Intelligence you can take advantage of your information assets, create competitive advantages, improve customer satisfaction and make well-informed decisions.



Oracle’s business is information—how to manage it, use it, share it, protect it. For nearly three decades, Oracle (NASDAQ: ORCL), the world’s largest enterprise software company, has provided software and services that enable organizations to get the most accurate and up-to-date information from their business systems. Today, Oracle has over 275,000 customers—including 98 of the Fortune 100—in more than 145 countries.





Whether you are measuring profitability, achieving cost-reducing synergies, accelerating time to market, or simply reporting results in a more compliant and timely manner, SAP solutions for enterprise performance management can help you deliver answers, drive decisions, take action, and manage your business. With these solutions, you can capitalize on the value of your corporate data to drive organizational alignment, increase business agility, and create competitive advantage by controlling performance. You have the visibility required to understand current performance, the tools you need to develop effective strategies, and the resources to drive execution across all levels of your organization.

Wednesday, March 26, 2008

Correlation, Causation, and Flat Tires


I was out of town this week and received a call from my wife. The rear tire on her car was flat; she couldn’t figure out how to change it and ultimately called AAA. The culprit turned out to be a nail. “The tow truck guy said he’s seeing lots of these in our town. He thinks it has to do with all the home construction that’s going on.”

Always on the lookout for good causation / correlation examples, I apologized for not being home to change the tire myself and quickly Googled “flat tire correlation.” The first hit I got was from a discussion forum for BMW owners. The thread was discussing whether high-performance tires were more prone to flats. “I suspect there is a correlation between flats in general and construction activity in your area,” reported chuck92103.

Interesting, but was it chance, coincidence, or a pattern?

I took the car to NTB this morning to have the tire repaired, and the serviceman, unprompted, provided more evidence for my fledgling theory. “Yep, we’ve been getting between 15 and 20 nails per day,” he said. “We’ve seen a lot more since all the home construction started back up.”

Well, that was all the proof I needed. Now, in addition to the sub-prime crisis, the high cost of gasoline, and whether my kid’s Thomas the Tank Engine is covered in lead paint, I had a new problem to worry about. Thousands of rogue nails, escaping from construction sites, hiding along the roads, and leaping up to impale themselves in the tires of unsuspecting minivan drivers throughout Metrowest Boston.

“I think I’ll take the train into town on Friday,” I thought. That is, until I saw this news item from yesterday’s paper. A freight car loaded with building materials broke loose from a siding at a lumber yard, rolled three miles down the main track, and collided with a commuter rail train. 150 people were injured (fortunately, none seriously).

Could it be any more obvious? Increased home construction requires more lumber. More lumber means more freight cars. More freight cars increases the probability that one will break loose and (somehow) thwart the devices intended to prevent runaways. And more runaways, of course, means your ride home may be interrupted with potentially disastrous consequences. Not to mention all those flat tires.

My conclusion? Stop the McMansion-ization of the suburbs, increase transportation safety throughout the region!

Correlation analysis can be a powerful tool in determining the root causes of business performance. But like any tool, it has its limitations. Have you ever tried to correlate outputs and inputs and arrived at an unusual result?
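To put a number on the trap, here is a toy calculation in Python (the monthly figures are invented purely for illustration): housing starts and flat-tire repairs both trend upward over the same months, so their correlation is very high, even though a common driver, or plain coincidence, would explain it just as well as rogue nails.

```python
import statistics

# Hypothetical monthly figures for one town (invented for illustration only).
housing_starts   = [4, 6, 7, 9, 12, 15]      # new-home permits per month
flat_tire_visits = [22, 25, 27, 31, 36, 40]  # tire repairs at the local shop

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Roughly 0.99: strongly correlated, yet the number alone says nothing about
# whether construction actually *causes* the flats.
print(round(pearson_r(housing_starts, flat_tire_visits), 3))
```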


NCAA update: The ScoreCard algorithm missed both first-round upsets. I correctly predicted Villanova over Clemson. But the Pitt Panthers guaranteed I wouldn’t have my office pool winnings available to pay for the flat tire by dropping their second-round game to Michigan State. Human intuition – 50%. Computer – 0%. Man does not win by analytics alone…


A New Type of CIO

I read an interesting article last week in the Wall Street Journal about Douglas Merrill, Google's Chief Information Officer. To say the least, Merrill is not your typical CIO. Granted, Google is not your typical type of organization. Here are a few snippets and comments from the article that I found interesting.

"Mr. Merrill's group lets Google employees download software on their own, choose between several types of computers and operating systems...."

This isn't totally off the wall, but certainly unique. They are a company focused on innovation and technology, so it isn't surprising that they wouldn't "constrain" anyone from getting the resources they need.


"We're a decentralized technology organization...Google's model is choice. We let employees choose from a bunch of different machines and different operating systems, and [my support group] supports all of them. It's a bit less cost-efficient - but on the other hand, I get slightly more productivity from my [Google's] employees."

The decentralization and the idea of choice certainly go hand in hand. The cost-efficiency comment is interesting and it doesn't sound like something that they necessarily quantify. I also assume that Mr. Merrill is being extremely sarcastic when he says he gets slightly more productivity out of Google employees. Based on the number of tools that Google produces for consumers, I would say that they are quite productive and innovative. I made the comment the other day to someone that I can't keep up with all the new things that they constantly release! On the topic of security, Mr. Merrill had the following response.


"The traditional security model is to try to tightly lock down end-points...we put security into the infrastructure. We have programs in our infrastructure to watch for strange behavior."

"When I talk to Fortune 100 CIOs, they want to understand, 'What is your security model? Is it really as reliable? What's the catch?' I already had to build security standards because search logs are really private. Very few [Google employees] have access to consumer data, [and those who do] have to go through background checks."

What I find most interesting about Mr. Merrill's comments is that Fortune 100 CIOs immediately have questions about security, primarily because of Google's approach to employee "choice". Google has shown that they can provide a best-of-breed infrastructure that is stable and secure. My guess is that most CIOs struggle to provide a decentralized environment without ending up on the front page of the newspaper because their systems have been compromised.

The last comment I will make about Mr. Merrill is that he has a fascinating academic background, especially for a CIO. He studied social and political organization at the University of Tulsa and earned a master's and a doctorate in psychology from Princeton University. His IT knowledge came through practical experience at RAND, Price Waterhouse and Charles Schwab. Certainly politics and psychology play into the role of a successful CIO.

When I look at Google's approach to information technology and management, it is built around the principles of the organization: decentralization, autonomy and a focus on innovation. Those principles may not make sense for every organization, but they clearly work for Google. Do I think that every psychology major is going to be the next CIO at company XYZ? No. Do I think that there are lessons to be learned from Google and Douglas Merrill? Absolutely.

If you are interested in listening to Mr. Merrill speak to how Google eats their own dog food, take a look at the YouTube video below. He and CEO Eric Schmidt discuss how they use Google Apps internally in an effective and collaborative manner while meeting business and consumer needs, including security. Google Apps is a great example that relates back to the three themes of DIG: having organized and accessible data, providing robust analytic tools, and providing a massively collaborative environment.


Comments are welcome. I would be interested to hear related experiences and if you feel that IT is helping enable your business the way that Google is enabling theirs.

Tuesday, March 25, 2008

Creating an Effective Decision Making Environment

Decisions. Information. Governance. What is the point?

The point is to establish the infrastructure and processes for an organization to learn, adapt, and ultimately make effective and timely decisions. Organizations are constantly faced with a myriad of decisions from the many tactical decisions we make every day to the few strategic, "bet-the-farm" decisions we make less frequently. Decisions are rarely made with perfect information. The real challenge is to create the right environment in order to make the best decisions possible given the circumstances and to ensure that the sum total of the decisions made moves the organization in the direction it seeks to go. Put another way, an effective decision making environment is guided by the goals and culture of the organization and leverages the right technology and processes.

Mark Kozak-Holland wrote a great series of ten articles for DM Review from November 2005 to December 2007 to illustrate this point. The series is titled "Winston Churchill's Decision-Making Environment" and provides a vivid picture of an effective decision making environment. Kozak-Holland takes us back to the early days of World War II, shortly after Churchill has been swept to power and faces a daunting task: how to protect Britain from an imminent invasion by a numerically superior German force with about half of the aircraft needed to defend the homeland.

In the articles, Kozak-Holland does a great job of capturing the key components of the British decision making machine - Bentley Priory (RAF Fighter Command), Bletchley Park (code breakers), Whitehall (fighter supply chain), and Storey's Gate (Churchill's headquarters) - and the tight communication and process interrelationships among the different components. What emerges is a picture of a highly effective decision making environment that is operating under extreme conditions. However, through the use of the leading technology and operational processes of the day, the environment performs brilliantly and allows the British to prosecute the early campaign in a way far beyond the limited resources available.

The articles provide an excellent example of how an effective decision making environment can be a significant competitive asset. While most of our organizations face less dire circumstances than Churchill faced, Kozak-Holland's articles are certainly worth the read.

I would like to hear how you feel the principles raised are evident (or not) in your organization.

Sunday, March 23, 2008

E2.o as the Catalyst for Organizational Evolution

Last month, Fast Company’s Fast 50 highlighted Google as the top selection in their list of the world’s most innovative companies. This past weekend, I read through Harvard Business Review’s April 2008 cover story titled Reverse Engineering Google’s Innovation Machine by Bala Iyer and Tom Davenport.

Noting these two recent cover stories from two of the more popular business management periodicals of our day, I would say that Google has captured the attention of the best of today’s business management minds. We are preoccupied with the company because they are unflinchingly trying (successfully at that) to manage complexity using a nontraditional management approach - an approach which encourages flexibility and widespread experimentation in the face of chaos. The HBR article goes on to highlight six traits that Google has embraced to “Build Innovation into Organizational Design”.

The article states, “Innovating on internet time requires dynamic capabilities to anticipate market changes and offer new products and functions quickly. Google has made substantive investments in developing the capacity to innovate successfully in this fast-changing business environment. The company is pioneering approaches to organizational culture and innovation processes …”

One of the six traits that the article highlights is Google’s ability to “Use Data to Vet Inspiration”. The company is known to be very analytical, but the article cites their advanced use of both internal prediction markets (300) and an idea management system to vet their thoughts. Google is using some of the foremost E2.o tools to democratize the management input process and to fully harness the intellectual capital of their entire company.
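The article doesn’t describe the mechanics of Google’s internal markets, so for readers new to the idea, here is a minimal sketch of one common prediction-market mechanism, Hanson’s logarithmic market scoring rule (LMSR), which turns employees buying and selling shares in an outcome into a continuously updated probability estimate. The liquidity parameter and trade sizes below are illustrative assumptions, not anything Google has disclosed:

```python
import math

B = 100.0  # liquidity parameter: larger B means prices move more slowly (illustrative value)

def lmsr_price(q_yes, q_no):
    """Implied probability of the 'yes' outcome given outstanding shares of each side."""
    e_yes, e_no = math.exp(q_yes / B), math.exp(q_no / B)
    return e_yes / (e_yes + e_no)

def lmsr_cost(q_yes, q_no):
    """Market maker's cost function; the price of a trade is the change in this value."""
    return B * math.log(math.exp(q_yes / B) + math.exp(q_no / B))

# Start with no trades: the market's estimate is 50/50.
q_yes, q_no = 0.0, 0.0
print(round(lmsr_price(q_yes, q_no), 2))   # 0.5 (no information yet)

# An employee who believes a project will ship on time buys 40 'yes' shares.
cost = lmsr_cost(q_yes + 40.0, q_no) - lmsr_cost(q_yes, q_no)
q_yes += 40.0
print(round(lmsr_price(q_yes, q_no), 2))   # ~0.60: the crowd's estimate has shifted
print(round(cost, 2))                      # what that conviction cost the buyer
```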

This makes Google very interesting to me. The first reason is that I am an advocate of E2.o and want to study any business that is attempting to incorporate social computing to augment business management!

The second reason is that I want to understand Google’s management approach itself: a unique culture that encourages innovation, or adaptation. To this point, I refer to Eric Beinhocker’s book, The Origin of Wealth, in which he expounds on organizational adaptation. In our world of increasing complexity, we must use all of the information available to us to educate our organizations about the evolving dynamics of the marketplace. Only by understanding its changing market does an organization stand a chance of surviving and adapting to the next generation. This organizational learning is fueled by equipping all aspects of a business to “exploit and explore” their interactions within their current marketplace.

Briefly, I see E2.o tools as the catalyst for getting to this fully enabled, next-generation approach to business management. With E2.o tools, our organizations can harness, prioritize, and make sense of all of the information that is bombarding us on a daily basis. By doing so, we can begin to adapt continuously. Google may actually be leading the charge on something quite remarkable.

Join the dialog. What do you think?

(As an aside, Google will be joining us this May at DIG2008. I am excited to hear from Bo Cowgill on the company's use of internal prediction markets. Check out the short case study overview.)