“Numbers have many charms, unseen by vulgar eyes, and only discovered to the unwearied and respectful sons of Art. Sweet joy may arise from such contemplations.”
- Charles Babbage, circa 1825, recollecting the sentiments of French mathematician Élie de Joncourt, circa 1735
Last week I was talking to a friend about our company’s recent challenges in charting a course through the profit optimization of our business. While he was clearly mired in what appeared to be an intractable analytical challenge, I was awash with good vibes for some very non-analytical reasons.
First, I was delighted that our team has reached a staffing level that lets us afford research into difficult questions so vital to our continuing growth and leadership. Research is “a maze of twisty passages, all alike.” Mistakes and dead ends are a necessary part of the process. Banging into a few walls as we traverse this maze will make finding the exit all the more delightful!
Second, it gave me the opportunity to change the subject. In looking for solutions to complex, analytically intensive problems, it is understandable to desire the ultimate profit-maximizing solution. But optimal solutions, as formally defined, are tricky in that they are required to be the best of all possible solutions. That is a tall order given the complexity of our systems and the limited time we have to capitalize on our market position. By loosening our objectives we can seek to “tune” rather than “optimize.” We should lean heavily on data and analytics, but also prepare ourselves for insight through intuition, as in the sketch below. This got me thinking over the weekend about the relationship between analytics and heuristics in untangling difficult problems.
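To make the difference between “tuning” and “optimizing” concrete, here is a minimal sketch in Python. The profit function, its parameters, and the search ranges are all made up for illustration; the point is only the gap in effort between exhaustively checking every option and greedily climbing from a reasonable starting guess.

```python
import itertools

# A toy stand-in for our real profit model. Both the function and its
# parameters are hypothetical, purely for illustration.
def profit(price, ad_spend):
    return -(price - 42) ** 2 - (ad_spend - 17) ** 2 + 1000

def optimize(prices, spends):
    """The 'optimal' route: evaluate every combination (exhaustive, slow)."""
    return max(itertools.product(prices, spends),
               key=lambda p: profit(*p))

def tune(start, step=1.0, max_iters=100):
    """The 'tuning' route: greedy hill climbing from a decent starting
    guess. Fast, but only promises a satisfactory local answer."""
    current = start
    for _ in range(max_iters):
        neighbors = [(current[0] + dx, current[1] + dy)
                     for dx in (-step, 0, step)
                     for dy in (-step, 0, step)]
        best = max(neighbors, key=lambda p: profit(*p))
        if profit(*best) <= profit(*current):
            return current  # no neighbor improves on us: good enough
        current = best
    return current

print(optimize(range(0, 101), range(0, 101)))  # 10,201 evaluations
print(tune((50.0, 10.0)))                      # a few dozen evaluations
```

The exhaustive search is guaranteed to find the best grid point, but it does over ten thousand evaluations to get there; the tuner settles for a local answer after a few dozen.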
It is often the complexity of systems, paired with human intellectual and creative limits, that prevents a complete understanding of the problems we face. Likewise, we may be limited in predicting the full scope and side effects of a software problem -- or its solution. Tightly coupled systems border on the incomprehensible in the best of times, and in times of failure we are hobbled by a lack of experience and perspective to perceive their myriad causes and effects. Recognizing our limits, we deploy layered defenses: sniffers, alarms, gauges, visual dashboards, automated governors, shut-offs, and shunts. In essence, these devices are a simplifying, and necessarily imperfect, abstraction of the complete system. As we’ve found, even these defenses are imperfect and require other complex processes and systems. We often find ourselves having to “debug our debugger” before we can dream of finding a solution! The same difficulty creeps in when we approach a purely intellectual problem. First, we have to understand our framework for problem solving.
Not surprisingly, organizational theorists and cognitive psychologists have researched the limits of human comprehension of massive data sets, challenging problems, and complex systems at length. Even less surprisingly, they don’t agree with each other when creating problem-solving frameworks. Ahh, the crucible of debate!
Let’s start with the organizational theorists. I recently read a summary of Normal Accidents: Living with High-Risk Technologies by Charles Perrow, professor emeritus at Yale University. Perrow has made a career of studying failures in complex systems. He has researched the organizations and systems involved in nuclear power plant meltdowns, economic market disasters, terrorist attacks, and technology system crashes. He concludes that many of these situations are “normal accidents” that spiraled out of control through a cascading sequence of very simple errors within tightly coupled systems. He also observes, not surprisingly, that we are better able to understand a failure in hindsight than we are to predict it. Creeping determinism rears its head again! Yet perhaps due to the seriousness of his subject matter, Perrow doesn’t trust heuristic approaches. His is an entirely forensic, number-crunching existence. All brain and no heart!
Enter the cognitive psychologists! Herbert Simon, known as the father of heuristics in decision making and winner of the Turing Award in 1975 for contributions to artificial intelligence, was an interdisciplinary professor at Carnegie Mellon University. He defined heuristics as “methods for arriving at satisfactory solutions with modest amounts of computation.” Simon coined the term bounded rationality to describe heuristic problem solving in complex, time-constrained situations. Bounded rationality describes how we compensate for human cognitive limitations: we employ intellectual short-cuts, such as a conceptual metaphor, to replace the actual problem at hand. Interestingly, we often do this unconsciously when solving difficult problems, in a process described by [the evidently quite prolific] Tversky and Kahneman (read more about them in my Fear of Falling and the Bane of High Expectations blog article) as attribute substitution.
But a conscious recognition of our bounded rationality lets us leverage it as a problem-solving approach. We are warmly swaddled in our humanity -- inextricably bounded by it -- but we can also be unleashed by it. It is our ability to draw upon past experiences and substitute abstracted simplicity in place of complexity -- what I’ll call metaphoric simplification (hey, if Simon can coin a term, why can’t I?) -- that allows us to go beyond the limits of pure computation in discovering new vistas of opportunity. Bounded rationality has been specifically cited as a decision-making framework for situations where mathematically optimal solutions are unavailable for lack of information or time.
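Simon’s name for this conscious stopping rule is satisficing: set an aspiration level, search until an option clears it, and stop. Here is a minimal sketch of the idea (my own illustration, with hypothetical names and a toy scoring function, not anything from Simon’s work):

```python
import random

def satisfice(candidates, score, aspiration, budget=50):
    """Satisficing sketch: examine options until one clears the
    aspiration level or the computation budget runs out, then take
    the best seen so far."""
    best = None
    for i, option in enumerate(candidates):
        if i >= budget:
            break  # bounded: we can't afford to look at everything
        if best is None or score(option) > score(best):
            best = option
        if score(best) >= aspiration:
            return best  # satisfactory: stop searching
    return best

# Hypothetical usage: pick a 'good enough' configuration out of thousands.
random.seed(7)
configs = (random.random() for _ in range(10_000))
print(satisfice(configs, score=lambda x: x, aspiration=0.95))
```

Note the two bounds doing the work: an aspiration level that defines “satisfactory,” and a budget that caps the computation whether or not we get there.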
Indeed, “numbers have many charms.” The way to experience those charms is to blend our number-crunching, analytical talents with our human, heuristic nature. Tuning our systems to greatness will come from a series of wonderful surprises: hard analysis of the data combined with flashes of inspiration.
Further reading
The Information: A History, a Theory, a Flood, by James Gleick, 2011
Normal Accidents: Living with High-Risk Technologies, by Charles Perrow, 1984
Heuristics Made Easy: An Effort-Reduction Framework, by Anuj K. Shah and Daniel M. Oppenheimer, 2008
The Paradox of Choice: Why More Is Less, by Barry Schwartz, 2003