Tags
advertising, Business, creativity, Design, epistemology, innovation, learning, life, pattern language, politics
This is the second in a series of blog posts that present Patterns in a Socio-technical Pattern Language of best practices for collaboration and coordination in complex societies. I intend to organize these in multiple ways (e.g., by type of goal; by where in a typical development process the pattern is most applicable; by how large a collection of people the pattern is most applicable to; etc.). I am entering them in this blog in an order that reflects current events. For example, there seems to be a movement to deny reality outright and insist that everyone simply believe what the leaders promulgate. This, to me, is outright evil. But even when people are acting with the best of intentions, it is natural to take shortcuts. Those shortcuts can make life seem more efficient in the short run, but they can also lead to serious issues in the longer term.
Reality Check
Author, reviewer and revision dates:
Created by John C. Thomas on 4 September, 2001
Revised, JCT, 17 December, 2001
Revised, JCT, 15 January, 2018
Synonyms:
Abstract:
In developing complex systems, it is often tempting to build feedback loops on ersatz measures of what we are really interested in assessing and controlling. While this seems expedient in the short term, it often leads to serious problems and distortions, particularly in times of crisis or transition, when the correlation between ersatz measures and actuality drifts substantially or even suddenly disconnects. Actions may then be based on these measures or models of reality rather than on reality itself (or on more complete measures), with negative consequences. The solution is to perform regular “reality checks” to ensure that measures or indicators of reality continue to reflect that reality.
Problem:
In developing complex systems, it is often tempting to build feedback loops on ersatz measures of what we are really interested in assessing and controlling. While this may seem expedient in the short term, it often leads to serious problems and distortions, particularly in times of crisis or transition, when the correlation between ersatz measures and actuality drifts substantially or even suddenly disconnects. Actions may then be based on these measures or models of reality rather than on reality itself. This can result in negative, even deadly, consequences.
Context:
Many problems were partly responsible for the disaster at Three Mile Island. One crucial problem, in particular, arose from the design of a feedback loop. A switch was supposed to close a valve. Beside the switch was a light that was supposed to show that the valve was closed. In fact, rather than having the light go on as the result of actual feedback from the valve closure itself, the signal light was merely wired to a collateral circuit on the switch. All it actually showed was that the switch had moved position (Wickens, 1984). Under normal operation, that is, when the valve was operating properly, these two events were perfectly correlated. At a critical point in the meltdown, however, the valve was not operating properly. Yet the human operator believed that the valve was closed even though, in reality, it had failed to close. His resulting actions, taken on the assumption that the valve was closed, exacerbated the subsequent problems. My colleague, Scott Robertson, has posted an analysis of the recent error that resulted in the nuclear missile scare in Hawaii. (See link).
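The minimal sketch below (all class and method names are hypothetical, not drawn from any real control system) contrasts the two wirings: an indicator driven by the switch position versus one driven by the valve's own measured state.

```python
class Valve:
    def __init__(self):
        self.is_closed = False
        self.stuck = False                # fault: the valve ignores commands

    def command_close(self):
        if not self.stuck:
            self.is_closed = True


class ControlPanel:
    def __init__(self, valve):
        self.valve = valve
        self.switch_position = "open"

    def close_valve(self):
        self.switch_position = "closed"   # what the operator did
        self.valve.command_close()        # what may (or may not) have happened

    def indicator_from_switch(self):
        # Ersatz feedback: the light only mirrors the switch position.
        return self.switch_position == "closed"

    def indicator_from_valve(self):
        # Reality-based feedback: the light mirrors the valve's measured state.
        return self.valve.is_closed


panel = ControlPanel(Valve())
panel.valve.stuck = True                  # simulate the stuck valve
panel.close_valve()
print(panel.indicator_from_switch())      # True  -- the light says "closed"
print(panel.indicator_from_valve())       # False -- the valve is still open
```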
In running an application program several years ago, I was given a feedback message that a file had been posted. In fact, it hadn't. The application's programming team, rather than checking whether the file had actually been posted, merely relied on the completion of an internal loop.
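As an illustration, here is a minimal sketch in Python (the path, data, and checks are hypothetical) of the difference between reporting success because the write loop completed and reporting success only after verifying that the file actually exists on disk with the expected size.

```python
import os

def post_file_unchecked(path, data):
    with open(path, "wb") as f:
        f.write(data)
    return "File posted."            # ersatz: success reported because the loop finished

def post_file_checked(path, data):
    with open(path, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())         # force the bytes to storage
    # Reality check: does the file exist and hold what we wrote?
    if os.path.exists(path) and os.path.getsize(path) == len(data):
        return "File posted."
    return "Posting failed: file missing or incomplete."

print(post_file_checked("example_output.dat", b"hello"))
```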
In advertising campaigns, it is difficult to measure an ad's impact on sales directly. Instead, companies typically measure the “recall” and “recognition” rates of ads. These rates may often be correlated with changes in sales, but in some cases an ad may be very memorable yet give the customer such a negative impression of the company that it actually decreases the chances of selling the product.
Historically, monarchs and dictators (and even would-be dictators) have often surrounded themselves only with people who gave them good reports and support no matter how their decisions affected the reality of their realm. Eventually, the performance of such rulers tends to deteriorate severely because their behavior is shaped by this ersatz feedback rather than by reality.
During the “oil crisis” of the seventies, oil companies relied on mathematical models of continually increasing demand. Year after year, for seven years, they used these models to predict demand even though, in each of those seven years, demand actually went down. This reliance is reported to have cost them tens of billions of dollars (Van der Heijden, 1996).
In some cases, the known existence of an ersatz measure directly destroys the utility of that very measure. For example, if management decides that the “easy way” to measure programmer productivity is “Lines of Code,” then, once programmers discover this, the code base may grow quickly in terms of that measure but not in terms of actual functionality.
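As an illustration, the two hypothetical functions below compute the same result; the padded version scores several times higher on a Lines-of-Code metric without adding any functionality.

```python
# Two functionally identical implementations of a sum.

def total_concise(values):
    return sum(values)

def total_padded(values):
    # Initialize the running total
    result = 0
    # Loop over every value
    for value in values:
        # Add the current value to the running total
        result = result + value
    # Return the final total
    return result

print(total_concise([1, 2, 3]))   # 6
print(total_padded([1, 2, 3]))    # 6
```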
In America in 2018, many people view money as the only legitimate value of interest for countries, companies, or individuals. Measures such as GDP and stock market indices are taken as adequate and complete measures of the economic well-being of the society. There is a sense that, since we spend the most on weapons and health care, we must perforce be the “safest” and “healthiest” nation on the planet. This is clearly not the case. Similarly, ads talk about a person's “net worth” when what they really mean is merely that person's net financial worth. “Worth” is not the same as “financial worth.”
A large research organization that I am familiar with used to employ many administrative assistants who helped arrange meetings, send in expense reports, and answer telephones. At some point, most of these administrative assistants were laid off, and their tasks fell to the researchers themselves, who were typically far less efficient at them. Accountants looked favorably on all the “money they had saved” because they could easily see that the line item for administrative assistants was far smaller than it had been. Not visible, of course, was the fact that much more highly paid researchers were now doing the same work, more slowly and at a far higher total cost.
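To make the hidden cost concrete, here is a small worked example in Python; the hourly rates, hours, and slowdown factor are purely illustrative assumptions, not figures from the organization in question.

```python
# Hypothetical figures for illustration only.
assistant_rate = 30            # $/hour for an administrative assistant (assumed)
researcher_rate = 90           # $/hour for a researcher (assumed)
admin_hours_per_week = 10      # hours of administrative work per week (assumed)
slowdown_factor = 1.5          # researchers assumed 1.5x slower at these tasks

visible_saving = assistant_rate * admin_hours_per_week                    # what the line item shows
hidden_cost = researcher_rate * admin_hours_per_week * slowdown_factor    # what the ledger does not show

print(f"Visible weekly saving: ${visible_saving}")                 # $300
print(f"Hidden weekly cost:    ${hidden_cost:.0f}")                # $1350
print(f"Net weekly change:     ${hidden_cost - visible_saving:.0f} more spent")
```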
Forces:
* Organizations are often hierarchically decomposed and bureaucratic. It is therefore often simplest to communicate with those close to us in the hierarchy and to build systems whose model of reality relies only on things within the immediate control span of our small part of the organization.
* While it is more comfortable to limit system design and development to things within one's own team or department, it is often precisely the work needed to capture more reality-based measures that reveals additional challenges and opportunities in business process coherence.
* A more direct measure of reality is often more time-consuming, more costly, or more difficult to obtain than a measure of something more proximal that is usually highly correlated with the aspects of reality we actually care about.
* It is likely to be exactly at times of crisis and transition that the correlation between proximal ersatz measures and their referent in reality will be destroyed.
* It is likely to be exactly at times of crisis and transition that people will tend to simplify their cognitive models of the world and, among other things, forget that the proximal measure is only ersatz.
Solution:
Therefore:
Whenever feasible, base feedback on reality checks, not solely on ersatz measures. When this is too costly (as opposed to merely inconvenient or uncomfortable), at least design systems so that the correlation between proximal measures and their referent in reality is double-checked periodically.
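One way to picture the second half of this solution is the minimal sketch below (the functions, cadence, and tolerance are all hypothetical): the cheap proxy is used routinely, but every Nth reading it is compared against a direct, more expensive measurement, and drift beyond a tolerance is flagged.

```python
import random

def proxy_reading():
    """Cheap, indirect measure (e.g., an indicator light or a model estimate)."""
    return 100.0 + random.gauss(0, 1)

def direct_reading():
    """Costly, direct measure of the quantity we actually care about."""
    return 105.0 + random.gauss(0, 1)   # simulated drift between proxy and reality

def reality_check(tolerance=2.0):
    """Return True if the proxy still agrees with the direct measure."""
    return abs(proxy_reading() - direct_reading()) <= tolerance

CHECK_EVERY = 100   # assumed cadence for the periodic reality check

for i in range(1, 501):
    value = proxy_reading()             # routine operation uses only the cheap proxy
    if i % CHECK_EVERY == 0 and not reality_check():
        print(f"Reading {i}: proxy has drifted from reality; recalibrate.")
```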
Examples:
Rather than rely solely on a circle of politically minded advisors, Peter the Great disguised himself and checked out various situations in Russia in person.
As reported by Paula Underwood (who was the designated storyteller for her branch of the Iroquois), her ancestors at one point felled giant trees for longhouses in the Pacific Northwest. Later, when the tribe lived in the “Great Plains,” there were no trees of that size, and the tribe began to doubt that trees as large as those in their oral history had ever existed. To check, one brave spent many years walking back to that area, saw with his own eyes that the trees were indeed as tall as the oral history portrayed, and then returned to the tribe to report what he had found.
Resulting Context:
Ideally, over time, people who actually double-check reality will come to better understand when and how these reality checks are necessary. They may also invent methods that make checking something closer to what is really of interest more convenient or cheaper.
Related Patterns:
System as a Whole
Convergent Measures
Drawing the Line
Who Speaks for Wolf
Known Uses:
Richard Feynman, during the Manhattan Project, noticed that the bureaucracy was worried about the possibility of accidentally stockpiling a critical mass of uranium. To prevent this, each section chief was required to ensure that their section did not have a critical mass. To ensure this, each section chief instructed each subsection chief to ensure that their subsection did not have a critical mass, and so on, down to the smallest level of the bureaucracy. Upon hearing this plan, Feynman observed that neutrons probably didn't much care whose subsection they reported to!
In another incident reported by Feynman, various bureaucrats were each trying to prove that they had better security than their peers. To prove it, they escalated to buying ever bigger and thicker safes; the bigger and thicker the safe, the more the bureaucrats felt they had made their secrets secure. Feynman discovered that more than half of these super-safe safes had been left with the factory-installed combinations of 50-50-50 and were therefore trivially easy to break into!
References:
Wickens, C. (1984). Engineering psychology and human performance. Columbus, OH: Merrill. (p. 1).
Van der Heijden, K. (1996). Scenarios: The art of strategic conversation. Chichester: Wiley.
Hutchings, E., Leighton, R., Feynman, R., and Hibbs, A. (1997). Surely you're joking, Mr. Feynman! New York: Norton.
Underwood, P. (1993). The Walking People: A Native American oral history. San Anselmo, CA: Tribe of Two Press.
————————————
https://petersironwood.wordpress.com/2017/02/25/the-invisibility-cloak-of-habit/