The Spirometer

 
 
 
Through these devices we are tracked, monitored, profiled, and surveilled – and we agreed to it all. Every one of us has checked that box.
— De' Bryant

Strange how the familiar and benign can become scary if we look just below the surface. Writers for the old '80s TV show Tales from the Darkside knew all about adjusting our angle of vision. They took the banal, everyday events of a life to a whole new place. At the start of each episode, the narrator intoned, "Man lives in the sunlit world of what he believes to be reality. But...there is, unseen by most, an underworld, a place that is just as real, but not as brightly lit...a dark side."

What could be more a part of the fabric of the modern world than technology? We are living inside episodes of every science fiction, science-fact story ever told! We have portable communicators that talk back to us. Satellites can pinpoint our location with startling precision no matter where we are on the planet. Devices measure our heart rate, blood pressure, glucose levels, even our stages of sleep. Smart speakers can play our favorite music or tune in to our preferred radio station. The delete key has replaced white-out to correct typed text. Every conceivable modern convenience has a computer onboard: cars, stoves, lawn mowers, vacuum cleaners; if you choose, you can even have a conversation with your house.

Through these devices we are tracked, monitored, profiled, and surveilled – and we agreed to it all. Every one of us has checked that box. You know the one. It is the last barrier to getting access to something you want. You have to check that box, and in doing so you also accept the Terms of Agreement, though few of us have ever read them. In the process we have perpetuated the exponential spread of technology, which has become so ubiquitous that we no longer see it.

In fact, Kevin Kelly's TED® Talk compares its growth to the evolution of the organic world. He calls technology the Seventh Kingdom, following right after Animalia. He proposes that humans have given so much of the stuff of life over to computers that the line between humans and machines has become fragile. The distinction is nearly undetectable when we consider the sophistication of artificial intelligence. New software in psychology can learn to identify your mood, then suggest ways to feel better if you are sad or activities to maintain your happy face. Siri, a virtual personal assistant, can find any obscure fact that sparks our curiosity, anytime, anywhere we have our phone. And don't we always have our phones?

Kelly takes his social analysis further by explaining that computers have one operating system, which he calls The One. Every device, in every hand, of every person runs through that same portal. We turn to The One for guidance. We tell our secrets to The One. Our prized possessions are curated by The One. Our bodies are regulated and monitored by The One, for our own good (as in the case of, say, a pacemaker) and for the sake of society (as with electronic monitoring of individuals on probation). Kelly proposes that we accept this fact: "Go ahead and tell it everything because all of us will benefit." His message resonates out there; as of this writing, Kelly's TED® Talk has had more than 14,000 views.

So what does the Seventh Kingdom have to do with the humble spirometer, a device for measuring lung function? Innovators in the field of medicine have been tinkering with this device since 129 B.C.E. Periodic advances, especially during the 1800s, ultimately led to the design used in 21st-century medicine. I received my very own spirometer following both of my knee replacement surgeries. Like its high-tech cousins, it has become a common device whose beneficence is assumed.

However, medical historian Lundy Braun revealed the dark side of this ordinary medical instrument. In her book Breathing Race into the Machine she discusses the practice of "race correction… the idea of racial difference in lung capacity." Pulmonologists, drawing on the assumptions of scientific racism that embraced categories of "problem people," designed the stigma into the mechanical specifications. A button on the device produced different measures of "normal" depending on the patient's race. This widely accepted practice was used by insurance companies to minimize disability claims. Black workers had a harder time qualifying for workers' compensation than white workers because of a feature built into the machine. Racism was part of the underlying code used to develop a diagnostic tool intended to benefit pulmonary patients.
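
To make that mechanism concrete, here is a minimal sketch, in Python, of how a "race correction" factor works. The formula and the scaling factors are invented for illustration; they are not the coefficients of Braun's spirometers or of any real reference equation. What matters is the shape of the logic: the same measured breath is compared against a different definition of "normal" depending on a race setting.

```python
# Illustrative sketch only: the formula and "correction" factors below are
# invented, not the coefficients of any real spirometer or reference equation.

RACE_CORRECTION = {
    "white": 1.00,  # the unstated baseline for "normal"
    "black": 0.85,  # predicted "normal" quietly lowered
}

def predicted_normal_fvc(height_cm: float, age_years: int, race: str) -> float:
    """Toy prediction of a 'normal' forced vital capacity, in liters."""
    baseline = 0.06 * height_cm - 0.03 * age_years - 4.0  # made-up formula
    return baseline * RACE_CORRECTION[race]

def percent_of_predicted(measured_liters: float, height_cm: float,
                         age_years: int, race: str) -> float:
    """How a clinician or insurer reads the result: percent of 'normal'."""
    return 100 * measured_liters / predicted_normal_fvc(height_cm, age_years, race)

# The identical breath from two patients of the same height and age:
same_breath = 3.2  # liters
print(percent_of_predicted(same_breath, 175, 45, "white"))  # ~62% of predicted
print(percent_of_predicted(same_breath, 175, 45, "black"))  # ~73% of predicted
```

The same breath looks less impaired in the second printout. The bias does not live in the measurement; it lives in the reference the measurement is compared against, which is exactly why a Black worker's disability claim was harder to substantiate.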

Like the spirometer, high-tech tools are built using codes. These codes are articulated through the language of the programmers, who are individuals embedded in a particular social context. The coders' attitudes and assumptions drive technology's design even more powerfully than their technical skills, precisely because their biases are implicit. Witness the story of a glitch in Google Maps software. When announcing the user's arrival at "Malcolm X Boulevard," the software stated, "You have reached your destination, Malcolm Ten Boulevard." Or some of us are old enough to remember a time when cameras used film that had to be sent to a developer. Kodak, the tyrannosaurus of the marketplace, used default settings in the developing process that favored white complexions. The result, for us and hundreds of thousands of families around the world, was a childhood of pictures in which our loved ones and friends were shadows with eyes and teeth.
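
Neither failure required malice. How does a map end up saying "Malcolm Ten"? One plausible mechanism, offered only as a guess at the kind of rule involved and not as Google's actual code, is a text-normalization pass that expands any standalone "X" into the Roman numeral ten before the street name is spoken aloud:

```python
import re

# A guess at the failure mode, not Google's actual pipeline: a hypothetical
# text-normalization step a speech synthesizer might run on street names.
ROMAN_NUMERALS = {"I": "One", "II": "Two", "III": "Three",
                  "IV": "Four", "V": "Five", "X": "Ten"}

def expand_roman_numerals(street_name: str) -> str:
    """Replace standalone Roman-numeral tokens with spoken numbers."""
    pattern = r"\b(" + "|".join(ROMAN_NUMERALS) + r")\b"
    return re.sub(pattern, lambda m: ROMAN_NUMERALS[m.group(0)], street_name)

print(expand_roman_numerals("Henry IV Street"))      # "Henry Four Street" -- sensible
print(expand_roman_numerals("Malcolm X Boulevard"))  # "Malcolm Ten Boulevard" -- the glitch
```

The rule is perfectly reasonable for the street names its author had in mind. Nobody asked what it would do to a name in which "X" is not a number, and that is the point: the programmer's blind spot ships inside the code.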

These are relatively innocuous stories about problems that have since been corrected, but they provide a glimpse into much deeper hidden assumptions with scarier ramifications. Consider facial recognition software used by law enforcement agencies from the local to the federal to the international level. The analyses of contours and contrasts on which the software depends are most accurate with white faces; the software cannot "see" dark faces clearly, increasing the likelihood of false-positive identifications involving people of color.
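
A toy simulation shows how that plays out. The numbers below are invented, and real face-recognition systems are far more complicated, but the logic is the same: if a model separates non-matching faces less cleanly for darker-skinned people, a single global match threshold produces very different false-positive rates.

```python
import random

# Toy simulation with invented distributions, not a real face-recognition model.
# Scores are similarities between pairs of *different* people (non-matches).
random.seed(0)

def nonmatch_score(group: str) -> float:
    # Assumption for illustration: the model is less certain on darker-skinned
    # faces (higher, noisier non-match scores), e.g. from skewed training data.
    if group == "lighter-skinned":
        return random.gauss(0.30, 0.10)
    return random.gauss(0.45, 0.15)

THRESHOLD = 0.60  # one threshold chosen for everyone

for group in ("lighter-skinned", "darker-skinned"):
    flags = [nonmatch_score(group) > THRESHOLD for _ in range(100_000)]
    print(f"{group}: false-positive rate ~ {sum(flags) / len(flags):.2%}")
```

In a law-enforcement database search, a false positive is a real person flagged for investigation, so the group with the higher rate absorbs the consequences of the model's uncertainty.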

Ostensibly neutral job recruiting software divides applicants into categories and weights their files based on names, socioeconomic status (SES), and zip code, because it is unlawful to use race; yet studies have shown each of these factors to be statistically correlated with race. Health care "hot spotting" uses SES to predict where health problems are likely to be concentrated. Computer modeling based on the "broken windows" theory (which predicts that neighborhoods with more broken windows will have higher crime rates) is still being taught in criminal justice curricula. These data inform urban planners deciding where to build free clinics versus private hospitals, public schools versus magnet schools, lofts versus subsidized housing, chain grocery stores versus specialty markets, liquor stores versus malls, jail bond offices versus professional suites.
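
The recruiting example is worth pausing on, because the defense is always that the race field was never used. A small sketch, with made-up zip codes and weights, shows why removing the race column removes nothing when a retained feature stands in for it:

```python
# Toy illustration with invented zip codes and weights: the scoring rule never
# looks at race, yet its output can track race wherever residence does.

ZIP_ADJUSTMENT = {
    "12345": -25,  # hypothetical historically disinvested neighborhood
    "67890": 0,    # hypothetical affluent suburb
}

def resume_score(years_experience: int, zip_code: str) -> int:
    """A 'race-blind' rule used to rank applicant files."""
    return 10 * years_experience + ZIP_ADJUSTMENT[zip_code]

applicants = [
    {"name": "Applicant A", "years": 6, "zip": "12345"},
    {"name": "Applicant B", "years": 6, "zip": "67890"},
]

for a in applicants:
    print(a["name"], resume_score(a["years"], a["zip"]))
# Identical experience, different scores. Where decades of housing policy have
# tied residence to race, the "neutral" rule reproduces the gap it claims to avoid.
```

The same substitution works for names and for SES, which is what the correlation studies are telling us.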

Technology is not neutral.

I will say it again, louder, for the people in the back.

Technology is not neutral.

Technological advances, especially those intended to address inequities, can mask destructive historical fault lines with far-reaching implications for communities of color. On the surface these innovations appear to be benevolent fixes. Proponents exclaim, "It's a machine! Computers cannot be biased!" Therefore, software and hardware are distributed as a means to level the playing field and foster inclusion. In practice, like the spirometer, they accomplish the antithesis, because they are used widely and interrogated rarely. We are caught up in the wow factor of the technology or mystified by the complexity of the code supporting its functions. We are not wondering what stereotypes and social myths lie hidden inside the Terms of Agreement.

If we are, indeed, evolving toward Kelly's Seventh Kingdom, we must ask better questions along the way.

 
De' Bryant