Mind+Machine – Marc Vollenweider

style="font-size:15px;">      My heartfelt thanks to Evalueserve's loyal clients, employees, and partner firms, without whose contributions this book would not have been possible; to our four external contributors and partners: Neil Gardiner of Every Interaction, Michael Müller of Acrea, Alan Day of State of Flux, and Stephen Taylor of Stream Financial; to our brand agency Earnest for their thought leadership in creating our brand; to all the Evalueserve teams and the teams of our partner firms MP Technology, Every Interaction, Infusion, Earnest, and Acrea for creating and positioning InsightBee and other mind+machine platforms; to the creators, owners, and authors of all the use cases in this book and their respective operations teams; to Jean-Paul Ludig, who helped me keep the project on track; to Derek and Seven Victor for their incredible help in editing the book; to Evalueserve's marketing team; to the Evalueserve board and management team for taking a lot of operational responsibilities off my shoulders, allowing me to write this book; to John Wiley & Sons for giving me this opportunity; to Ursula Hueby for keeping my logistics on track during all these years; to Ashish Gupta, our former COO, for being a friend and helping build the company from the very beginning; to Alok Aggarwal for co-founding the company; to his wife Sangeeta Aggarwal for introducing us; and above all to my wonderful wife Gabi for supporting me during all these years, actively participating in all of Evalueserve's key events, being a great partner for both everyday life and grand thought experiments, and for inspiring me to delve into the psychology of those involved at all levels of mind+machine analytics.

– Marc Vollenweider

      PART I

      THE TOP 12 FALLACIES ABOUT MIND+MACHINE

      The number of opportunities with great potential for mind+machine is large and growing. Many companies have already begun successfully leveraging this potential, building whole digital business models around smart minds and effective machines. Despite the potential for remarkable return on investment (ROI), there are pitfalls – particularly if you fall into the trap of believing some pieces of conventional wisdom about analytics, which are exposed as fallacies on closer examination.

      Some vendors might not agree with the view that current approaches have serious limitations, but the world of analytics is showing some clear and indisputable symptoms that all is not well. To ensure you can approach mind+machine successfully, I want to arm you with insights into the traps and falsehoods you will very likely encounter.

      First, let's make sure we all know what successful analytics means: the delivery of the right insight to the right decision makers at the right time and in the right format. Anything else means a lessened impact – which is an unsatisfactory experience for all involved.

      The simplest analogy is to food service. Success in a restaurant means the food is tasty, presented appropriately, and delivered to the table on time. It's not enough to have a great chef if the food doesn't reach the table promptly. And the most efficient service won't save the business if the food is poor quality or served with the wrong utensils.

      The business impact of analytics should be clear and strong. However, many organizations struggle, spending millions or even tens of millions on their analytics infrastructure yet failing to receive high-quality insights in a usable form when they are needed – and thus failing to get the right return on their investments. Why is that?

      Analytics serves the fundamental desire to support decisions with facts and data. In the minds of many managers, it's a case of the more, the better. And there is certainly no issue with finding data! The rapid expansion in the availability of relatively inexpensive computing power and storage has been matched by the unprecedented proliferation of information sources. There is a temptation to see more data combined with more computing power as the sole solution to all analytics problems. But the human element must not be underestimated.

      I vividly remember my first year at McKinsey Zurich. It was 1990, and one of my first projects was a strategy study in the weaving machines market. I was really lucky, discovering around 40 useful data points and some good qualitative descriptions in the 160-page analyst report procured by our very competent library team. We also conducted 15 qualitative interviews and found another useful source.

      In today's terms, that amounted to a combined study-relevant data volume of just 2 to 3 kilobytes. We used this information to create a small but robust model in Lotus 1-2-3 on a standard laptop. Those insights proved accurate: in 2000, I came across the market estimates again and found that we had been only about 5 percent off.

      Granted, this may have been luck, but my point is that deriving valuable insight – finding the “so what?” – required thought, not just the mass of data and raw computing power that many see as the right way to do analytics. Fallacies like this one, and the others I outline in this part of the book, are holding analytics back from achieving its full potential.

      FALLACY #1

      BIG DATA SOLVES EVERYTHING

From Google to start-up analytics firms, many companies have successfully implemented business models around the opportunities offered by big data. The growing list of analytics use cases includes media streaming, business-to-consumer (B2C) marketing, risk and compliance in financial services, surveillance and security in the private sector, social media monitoring, and preventive maintenance strategies (Figure I.1). However, throwing big data at every analytics use case isn't always the way to generate the best ROI.

Figure I.1 Areas of Big Data Impact

      Before we explore the big data fallacy in detail, we need to define analytics use case, a term you'll encounter a lot in this book. Here is a proposed definition:

      “An analytics use case is the end-to-end analytics support solution applied once or repeatedly to a single business issue faced by an end user or homogeneous group of end users who need to make decisions, take actions, or deliver a product or service on time based on the insights delivered.”

      What are the implications of this definition? First and foremost, use cases are really about the end users and their needs, not about data scientists, informaticians, or analytics vendors. Second, the definition does not specify the data as small or big, qualitative or quantitative, static or dynamic – the type, origin, and size of the data input sets are open. Whether humans or machines or a combination thereof deliver the solution is also not defined. However, it is specific about the need for timely insights and about the end-to-end character of the solution, meaning the complete workflow from data creation to delivery of the insights to the decision maker.
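      To make the elements of this definition concrete, here is a minimal sketch – not from the book, with field names and the example values being purely illustrative assumptions – of how an analytics use case could be captured as a simple data structure, deliberately leaving the data type, size, and delivery mechanism open:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class AnalyticsUseCase:
    """Illustrative sketch of the elements named in the definition above.

    Field names are assumptions for illustration only; the definition
    intentionally leaves data type, size, and execution mode open.
    """
    business_issue: str            # the single business issue being addressed
    end_users: List[str]           # the end user or homogeneous group of end users
    decision_or_action: str        # decision, action, or product/service to be delivered
    delivery_deadline: str         # insights must arrive in time to be used
    workflow_steps: List[str] = field(default_factory=list)  # end to end: data creation -> insight delivery
    data_sources: Optional[List[str]] = None  # small or big, qualitative or quantitative, static or dynamic
    executed_by: str = "mind+machine"          # humans, machines, or a combination


# Hypothetical example of a recurring use case
example = AnalyticsUseCase(
    business_issue="Quarterly competitor pricing review",
    end_users=["Regional sales directors"],
    decision_or_action="Adjust list prices for the next quarter",
    delivery_deadline="Five business days before the quarterly pricing meeting",
    workflow_steps=["collect data", "clean and analyze", "derive the 'so what?'", "deliver report"],
    data_sources=["public price lists", "analyst reports", "interviews"],
)
```

      The point of the sketch is that the end user, the business issue, and the delivery deadline are mandatory, while the nature of the data and the mix of minds and machines are left open – exactly the emphasis of the definition.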

      Now, getting back to big data: the list of big data use cases has grown significantly over the past decade and will continue to grow. With the advent of social media and the Internet of Things, we are faced with a vast number of information sources, with more to come. Continuous data streams are becoming increasingly prevalent. As companies offering big data tools spring up like mushrooms, people are dreaming up an increasing number of analytics possibilities.

      One of the issues with talking about big data, or indeed small data, is the lack of a singular understanding of what the term means. It's good hype in action: an attractive name with a
