Minding the Machines. Jeremy Adamson
Using a generalized, systematic, and sequential approach, adapted to the needs of the individual organization, is the best method for standing up a new team or restructuring an existing one. Once a base template has been established, careful reflection on organizational readiness and analytical maturity, combined with regulatory requirements and immediate needs, can help in developing a short-term roadmap in collaboration with executive sponsors. Though no approach will work in every situation, these best practices can hopefully help you see a little further over the horizon.
For the current analytics leader, I hope that some parts of this book will challenge your views, other parts will confirm your experience, and the book as a whole will ultimately help you to build out a successful and engaged team.
Structure of This Book
The main body of this book has been organized within three key pillars: strategy, process, and people.
Strategy How to assess organizational readiness, identify gaps, establish an attainable roadmap, engage stakeholders, ensure sponsorship, and properly articulate a value proposition and case for change
Process How to select and manage projects across their life cycle, including design thinking, risk assessment, governance, and operationalization
People How to structure and engage a team, establish productive and parsimonious conventions, and lead a distinct practice with unique requirements
These pillars loosely follow the chronological and logical ordering of priorities when creating or inheriting an analytics team, with the understanding that this is an iterative and ongoing effort. The procedural requirements flow naturally from the strategy, and similarly, team structure and conventions must be based on the processes that have been created.
Though the book has been ordered to facilitate a front-to-back reading, subsections have been intentionally made self-sufficient for ease of reference, perhaps at the cost of occasional repetition.
Why Is This Book Needed?
It is my personal hope that this book will make creating and leading the function easier and help in some small way to advance the profession. Having been involved with or privy to rebooting these teams in several organizations, I have seen well-intentioned missteps repeated regardless of the maturity and sophistication of the company.
There are several underlying reasons why organizations and individuals struggle to get their arms around analytics.
Communication Gap
The business will rarely, if ever, have the analytics knowledge and vernacular required to clearly articulate its needs and to formulate a problem statement that naturally lends itself to an analytical solution. Whereas Kaggle competitions, hackathons, boot camps, and university assignments present problems with a well-formed data set and a clear desired outcome, business problems are fuzzy, poorly defined, and often posited without a known objective. As practitioners, it is our responsibility to find the underlying issue and present the most situationally appropriate and practical solution.
Advanced analytics and AI practitioners often expect their stakeholder group to provide a solution for them. But just as a doctor cannot expect a patient to diagnose their own health issues and present a treatment plan for the doctor’s approval, an analytics team cannot expect a business unit to suggest an approach, provide a well-formed data set and an objective function, and request a model. What the business unit requests is very often not even what the analytics project lead hears.
Early in the project intake process, an analytics lead will meet with a business lead to discuss an opportunity. The business leader (actuarial, in this example) may say that they want a model that predicts the probability that a policyholder will lapse. The outcome that the leader is hoping for is a way to reduce their lapse rate, but what the analyst hears is, “Ignoring all other considerations, how can I best predict the probability of an individual lapsing?” If the practitioner executes on this misapprehension, the deliverable will have little use for the business; a prediction model of this sort has no operational value. This model would only work on a macro scale, and even if it could be disaggregated, the business would be making expensive concessions in the face of perceived threats.
Empathizing with the underlying needs of the business, understanding what success looks like for the project, and leveraging the domain knowledge of the project sponsor would have highlighted that the value in the analysis was further upstream: in the factors driving lapse behavior, where an operationalizable change in process was possible.
As with the doctor analogy, it is through deep questioning, structured thinking, and the expert application of professional experience that the ideal path forward is uncovered. That path requires collaboration and the union of deep domain knowledge with analytical expertise.
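The distinction the lapse example draws can be sketched in code. The snippet below is a hypothetical illustration, not anything from the book: the feature names, data, and effect sizes are all invented. The same fitted model can answer the narrow ask (a per-policyholder lapse probability) or the upstream one (which factors drive lapse behavior, and in which direction).

```python
# Hypothetical sketch: one model, two very different deliverables.
# Feature names and data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1_000
features = ["premium_increase_pct", "service_complaints", "tenure_years"]
X = rng.normal(size=(n, 3))

# Simulated ground truth: premium increases and complaints drive lapse;
# longer tenure reduces it.
logits = 1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# Narrow ask: a per-policyholder probability (limited operational value).
p_lapse = model.predict_proba(X[:1])[0, 1]

# Upstream ask: which factors move lapse behavior, and in which direction --
# the place where a change in process is actually possible.
for name, coef in zip(features, model.coef_[0]):
    print(f"{name:>22}: {coef:+.2f}")
```

The fitted coefficients recover the simulated drivers (positive for premium increases and complaints, negative for tenure), which is the "further upstream" insight the business could act on; the raw probability alone is not.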
Troubles with Taylorism
For every decision to be made there is a perception that there must be one optimal choice: a single price point that will maximize profit, a single model that will best predict lapse, or a single classification algorithm that will identify opportunities for upselling. In effectively all cases, however, these optima can never be known with certainty and can only be assessed ex post facto against true data. In professional practice, as in university training, the results of a modeling project are typically evaluated against real-world data, giving a concrete measure of performance, whether AUC, R squared, or another statistical metric.
This has created a professional environment where analysts can confidently point to a single score and have an objective measure of their performance. They can point with satisfaction to this measurement as an indicator of their success and evidence of the value they bring to the organization. Certainly, performant algorithms are an expectation, but without viewing the work through a lens of true accretive value creation, these statistical metrics are meaningless.
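The "single score" habit described above can be made concrete with a minimal sketch. Everything here is synthetic and assumed for illustration: a classifier is fit, evaluated on held-out data, and its performance is reduced to one AUC number, stripped of any context about business value.

```python
# Minimal sketch of the "single score" habit: synthetic data, a stock
# classifier, and one held-out AUC number as the sole measure of success.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
# Labels driven by the first two features plus noise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

# The single score an analyst can "point to with satisfaction" --
# objective, concrete, and silent on accretive value.
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC on held-out data: {auc:.3f}")
```

The number is real and objective, which is precisely its seduction; whether the model creates any value for the organization is a separate question the metric cannot answer.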
In the early twentieth century, the practice of scientific management improved the productivity of teams by breaking work into elements that could be optimized. Through a thorough motion study of workers at Bethlehem Steel, Frederick Taylor created and instituted a process that optimized rest patterns and as a result doubled workers’ productive output (Taylor, 1911). He advocated for all workplace processes to be evaluated in terms of their efficiency and for all choice to be removed from the worker. This brutal division of labor and the resulting hyperspecialization reduced engagement and produced suboptimal outcomes at scale when all factors were considered.
Practitioners need to avoid actions and policies that create a form of neo-Taylorism within their organizations. Models that fully automate a process and embed simulated human decision making remove the dynamism and innovation that come from having humans in the loop. Such automation cements a process in place and reduces engagement and stakeholder buy-in. Analytics should support and supplement human endeavor, not supplant it with cold efficiency. It is essential that analytical projects be done within the context of the business and with the goal of maximizing value to the organization.
Model accuracy needs to be secondary to bigger-picture considerations, including these:
Technical Implementation Is the architecture stable? Does it require