
Then the automobile industry began the process of normalizing traffic fatalities.

Core to their approach was an emphasis on human error. Accidents weren't due to systemic decisions (or corporate greed); they were all "human error". Pedestrians who didn't yield to cars were "jaywalkers" causing accidents.

The industry lobbied against restrictions like "speed governors" that would keep cars from going too fast and funded education campaigns to teach a new generation that roads were for cars.

(BTW, one of the things this book has really affirmed for me is the extent to which blaming individuals for problems is not just ineffective at solving the larger problem, it *actively works against* systemic change. It is a tool wielded by those in power.)

One particularly obnoxious manifestation of industry's focus on individual responsibility is Otto Nobetter.

Otto Nobetter was an education campaign designed by the industry group the National Council for Industrial Safety. The council was formed largely because states started passing workers' compensation laws, so businesses suddenly had an incentive to protect their workers from injury and death.

Industry also tried to argue that some people are just "accident prone".

Psychologists, usually on the corporate payroll, conducted studies attempting to prove that people who got into accidents had something wrong with them: they lacked strong religious values, had trouble with authority, were divorcees or gamblers or had "a psychosexual need to court danger."

Of course this was all bunk, and yet another example of scientists cynically serving those in power. (See merchantsofdoubt.org/)

History Professor Bryant Simon says "what we call accidents are in some ways manufactured vulnerabilities".

He wrote a book about the 1991 Hamlet Fire, which killed 25 workers, mostly black women. Simon refuses to blame the "greedy owners" who violated OSHA regulations.

"Those people did not just end up in that plant that day.
Historical forces brought a particular kind of person to that plant, and the fact that no one cared about them didn’t just begin that day.”

In chapter two (yes, that was only the first chapter, guys 😂) Singer brings us back to the late 1800s and the plague of railway coupling "accidents".

This topic is of particular interest to me, because my great-great-great-grandfather died trying to couple two train cars together. At the time he died, "automatic couplers" existed that would have done his job safely for him, but the railroads didn't want to cut into profit margins.

I wrote more about this story here: rethinkingpower.info/noble-fru

Eventually Congress intervened and forced the railroads to use automatic couplers. In the early 1900s, government intervened again, passing workers' compensation laws.

Previously, injured workers or bereaved relatives had to sue to receive compensation, and seldom won. New laws said that injured workers would *automatically* receive compensation from the companies that injured them.

Companies suddenly had an incentive to prevent injuries and deaths, and workplace accidents plummeted.

We see this pattern again and again: people dying in preventable "accidents" until companies are actually forced to protect people.

In 1953, Hugh DeHaven, inventor of the three-point seat belt, invited automakers to a conference to learn about safety technologies like the collapsible steering wheel. Most were not adopted until the late 1960s, when Ralph Nader and the consumer safety movement started campaigning for them.

Hundreds of thousands died because it was cheaper for the auto industry.

Chapter 3 of Singer's book focuses on scale: "accidents" with low probability and big impact, like nuclear meltdowns and oil spills. I have fewer notes on this chapter, and they're mostly just ugh!!!!!

Like, fun fact: David Rainey, VP of BP, lied to Congress about how bad the Deepwater Horizon spill was. He was acquitted of obstructing Congress: steptoe.com/en/news-publicatio (ugh!!!!)

Also: more than half of the fish species endemic to the Gulf of Mexico could not be found after the spill (ugh!!!!)

(Pausing for the night; will come back and finish up the thread soon. Blogging one's notes takes more time than I predicted!)

Ok we're back. Time for Chapter 4, titled "Risk" but which I might title "Time to get mad about traffic engineering!"

Perhaps you already know that the crash test dummies used in test collisions are modeled after men. The "female" dummies are not modeled after women, they're just male dummies but smaller. *Too* small: at 4'11" & 108 lbs they represent only the smallest 5% of women.

The result? Women are "73% more likely to be injured and up to 28% more likely to be killed in a front-facing car accident".

Singer does a deep dive into the work of civil engineer Eric Dumbaugh.

Most of the US road engineering guidelines were written in the 50s and 60s. This was in the middle of the auto industry's campaign to convince the public it wasn't to blame for the tens of thousands of people who were suddenly being killed by cars.

They blamed "jaywalkers", they blamed individual bad drivers, and they also blamed roads. So road engineers tried to design roads to prevent accidents.

Engineers tried to make a "forgiving roadside". They tried to make suburban and city streets like interstates: straight, wide, and with as few trees and poles and people as possible. But that only encourages cars to treat city streets like interstates.

"Those curves, trees, and benches that engineers removed had actually been making drivers slow down to avoid the risk these hazards presented—without all that, drivers felt less at risk and more in control" so they drove faster & killed more people.

You might think road engineers would update their guidelines, but apparently they haven't!

Moving on. How do road engineers determine the speed limit for a road? Turns out they measure how fast cars are already going on it. Usually about 85% of the cars are going at or below some speed, and 15% are going faster. They set the limit at that threshold: the 85th percentile speed.

"We look at how fast cars are going and we assume that is the safe speed of the roadway" says Dumbaugh

How many people does this methodology for setting speed limits kill? We don't know!

In 2018 9,000 people died in crashes attributed to speeding. But more than 36,000 people were killed in traffic accidents. If any of those accidents were caused by a road that was given too high a speed limit, it wouldn't be labeled as caused by speeding.

And if you thought the process for setting speed limits was bonkers and reckless, wait until you hear about crosswalks!

Guess how they decide where to place crosswalks! Whatever you're thinking, it's not callous enough.

"the rules actually discourage installing crosswalks or pedestrian walk signals at intersections unless the risk of an accident is extreme. According to those rules, a lower-risk way across a street—like a crosswalk and traffic signal—is only warranted if one hundred people cross a street every hour for four hours. **Ninety-nine people running across the highway is not enough of a risk.**"

Given how roads are designed, speed limits set, and crosswalks placed, I have to agree with Dumbaugh when he calls road engineering a "fraud discipline".

Why is it so bad? One cause might be education. The majority of traffic engineering programs in the U.S. "do not have a single course that covers the issue of road safety".

One study found that fewer than 25% of programs said they offered a safety course, and of those, a third didn't actually offer one.

Singer makes a really good point about how technologies can magnify harm:

"Applied to a whole system, the consequences of misperceiving risk get multiplied across the population [...] The risk perception of a person with power can create dangerous conditions for us all."

This is absolutely something we software devs need to grapple with. Silicon Valley is obsessed with scale, but as tools and systems scale, so does the potential harm of what might otherwise be small errors and oversights.

Ch 5 tackles stigma:

"While you may see an accidental overdose as different from a car accident, in many respects it is not," Singer writes.

"An accidental overdose happens when dangerous conditions stack up—addictive drugs marketed as nonaddictive, a lack of access to health care, or the threat of criminal prosecution if you call for help. [...] Stigma is what doctors call a “fundamental cause” of health disparities—an inescapable reason why some people die by accident and others do not."

Stigma gives rise to an absurd situation where it is far easier to prescribe addictive drugs than drugs that treat addiction.

Doctors need no special training to prescribe OxyContin, but for buprenorphine, an opioid substitute which facilitates safe recovery, "doctors must fill out a pile of paperwork, get a special waiver from the Drug Enforcement Administration, and undergo an eight-hour training session. After all that, they’re only permitted to prescribe to a limited number of patients."

Bias and stigma also influence who gets access to buprenorphine. Doctors are 35 times more likely to prescribe it to white people than to black people.

This even though overdoses among black people are rising faster than overdoses among white people.

Singer writes: "In accidents, stigmas stack up, and race trumps them all."

Singer describes a vicious cycle wherein Black people are more likely to be harmed or killed in accidents due to systemic causes ("manufactured vulnerabilities") but are then more likely to be blamed for it, as individuals and as a race.

According to scholars Barbara Fields and Karen Fields, this is a form of "racecraft": when racism itself makes race seem more real.

Fields & Fields wrote a book about Racecraft, which is now on my reading list: penguinrandomhouse.com/books/2

Singer also discusses the White Male Effect. "White men felt the least threatened by every [kind of accidental risk], from nuclear waste to suntanning to plane crashes." This was particularly true of "white men who were wealthier, more educated, and more politically conservative."

This isn't rich white men being reckless. They are accurately perceiving that they're at less risk.

CW for child death.

There are two diagnoses doctors use when babies die in bed. One, SIDS, is a medical condition. The other is "accidental suffocation and strangulation in bed" (ASSB).

Diagnoses of SIDS are generally declining, and diagnoses of ASSB are generally on the rise. But SIDS rates are falling faster and ASSB rates rising faster among people of color.

Researchers hypothesize this is because doctors are more willing to blame parents of color for the death of their children.

Black, Latino and Indigenous pedestrians are more likely to be struck and killed by drivers. Relatedly, research shows that drivers are less likely to yield to black pedestrians.

Research also shows that people asked to play the role of a police officer are more likely to "accidentally" shoot black figures over white figures.

Singer lays out a long and grotesque list of accidental injuries, all of which are more likely to happen to you if you're Black, Latino, Indigenous and sometimes Asian.

Indigenous people are most likely to die of every single kind of accident except exposure to smoke and fire (where they're edged out by Black people) and "death by falling, a subcategory which is fatal mainly for older adults. To die by accidental fall, you need to live a long life, which is less likely if you’re Indigenous."

Chapter 8 is about blame and repeats Singer's mantra that blaming individuals prevents systemic change.

Remember the road engineering profession's bonkers horrifying methodology for deciding where to place crosswalks?

It's back here if you need a refresher elk.zone/social.coop/@shauna/1 but tl;dr at least 100 people an hour need to be crossing the road.

Back in 2010, a woman named Raquel Nelson and her three children tried to cross a street. They were hit by a car, and one of her children died.

The driver was charged with a hit and run. But, shockingly, Nelson too was blamed: she was charged with vehicular homicide. Again: she and her children were hit by *somebody else's car*. An all-white jury convicted Nelson, a Black woman, but after media attention the judge offered a retrial, and she was ultimately sentenced to a year's probation.

Blaming Nelson may seem absurd. Blaming the driver may seem fair. But, Singer warns us, blaming either of them lets the system off the hook.

Nelson's family crossed the street where they did because it was where the bus stop was. The nearest crosswalk was 20 minutes out of the way.

In the aftermath of the "accident", traffic engineers inspected the place where Nelson's child died. They decided there were not enough pedestrians to justify installing a crosswalk or a traffic light.

Meanwhile in the greater Atlanta area, a quarter of all pedestrian fatalities occurred within 100 feet of a bus stop, and over half occurred within 300 feet.

In Chapter 9, Singer quotes Susan P. Baker, who tells us her approach to accident prevention: make the world safe for drunks.

“The bottom line is if you make this world safe for drunks, you make it safe for everybody. If you focus on making the world safe for the average, reasonably smart, sober person, then the drunks, the sleepyheads, the guy who is worried about his child’s operation and trying to get home in time for it, it is not going to be safe for them.”

On to the last chapter, "Accountability".

When Boeing created the 737 Max it had some problems. It was designed to be more fuel-efficient, but ended up less aerodynamic. Rather than redesigning the plane, Boeing created some automated software, MCAS, which made the plane easier to fly.

Singer writes: "To cover for the plane’s poor maneuverability, the software would sometimes push down the plane’s nose without input from the pilot. If this software failed, Boeing expected the pilot to fix it"

"To review: Boeing built a plane that was difficult to fly, then added software that autocorrects these difficulties, but decided that if that software fails, the pilot should save the day by figuring out how to fly the difficult-to-fly plane."

They didn't even tell the pilots about the software.

The pilot of the first crashed 737 Max spent his final minutes "paging through the pilot manual, trying to determine what was going wrong. This was futile; any mention of MCAS had been removed."

I'm struggling to articulate how horrifying I find that previous section. It is some of the most reckless, arrogant, patronizing, and incompetent systems design I have ever heard of.

To think you could *possibly* design software that would always work. To think that pilots didn't even need to know about it. And yet to somehow think that a pilot that *didn't even know the software existed* could somehow compensate in the "unlikely" case that something went wrong?

You might ask: how could such a choice be allowed? Don't we have government regulators to oversee this sort of thing?

Two words: regulatory capture. The head of the FAA at the time was a former Boeing lobbyist. With the encouragement of George W. Bush, who was trying to "deregulate" the airline industry, he worked to make the FAA more corporate-friendly. The decision to remove MCAS from the pilot manuals was supported by the FAA.

Corporations like Boeing make these decisions because they are legally incentivized to.

One way of holding them accountable for these decisions is through torts. Tort law is the branch of civil law under which people who do harm negligently or intentionally can be held liable for damages even if they haven't broken a statute or a contract.

It's a vital part of preventing wrongdoing in a changing world where many kinds of harms can't be identified and legislated ahead of time.

As you might expect, corporations don't like torts.

"Tort reform" has been a Republican calling card for decades. What tort reform actually means is restricting your right to go to court, especially against corporations.

For example, "tort reform" in Michigan has prevented people there from suing Merck for selling a pill that caused strokes or suing Purdue/the Sacklers for pushing opioids.

The conservative legislation-peddling outfit ALEC has been pushing these kinds of laws for years.

(Side note: this book has made me realize I need to learn more about tort law. So if you have any recommendations for things to read, watch, etc, please share.)

Aaaand that's the end of my notes. Before I finish, though, I want to talk about Eric Ng.

Eric Ng was the author Jessie Singer's best friend. He was struck and killed by a car while riding on a bike path in New York City in December 2006.

After Eric died, city planners did nothing to protect future riders on that path. But a decade later, a terrorist drove down that same path, intentionally killing 8 people and injuring more. Within days, the path was protected by steel barricades.

Why is "accidental" death so much more acceptable to us than intentional death? The lives lost are just as precious, the grief just as painful.

"Eric was was kind," Singer writes. "Eric was loved. Eric was very funny. Eric drew his own tattoos. Eric was impossibly cool."

"Eric was killed at age twenty-two."

The tragedy of his death was not an accident. There are no accidents.

Again, you can buy the book here: simonandschuster.com/books/The

@shauna I thought about this once, and my conclusion was that a large part of the unacceptability of murder (compared to, e.g., causing on average one death by increasing pollution) is due to the potential use of death threats (implied or explicit) as an extortion mechanism.
