Snippets about: Ethics
The Cost Of Doing Business
When asked about continuing to sell infant Tylenol despite babies dying from dosing mix-ups, McNeil's medical director confirmed the company's calculation. The question put to him:
'Rather than pull the products off the shelf, that's sort of the cost of doing business?'
His answer: 'McNeil felt that the benefit... was appropriate to keep it in the marketplace, yes.'
This became J&J's standard approach: factor potential criminal penalties into business costs and continue selling deadly products because the profits exceeded any punishment. Between 2010 and 2021, J&J spent $25 billion on litigation—a small price for avoiding real accountability.
Section: 3, Chapter: 13
Book: No More Tears
Author: Gardiner Harris
The True Costs Of Tuberculosis
KJ Seung: Of the 1,300,000 people who will die of TB this year, how many would survive if they had access to the kind of healthcare I have?
"How many would die if everyone could access good healthcare?" he asked me, seemingly confused by my question.
"Yes," I said.
"None. Zero. Zero people should die of TB."
- John Green
Section: 6, Chapter: 23
Book: Everything is Tuberculosis
Author: John Green
The Social Normalization Of Corporate Deviance
Columbia sociologist Diane Vaughan's analysis of the Challenger disaster explains J&J's transformation. 'Social normalization of deviance' means people inside organizations become accustomed to dangerous behavior that outsiders would recognize as insane.
J&J's decline followed this pattern: small compromises on Baby Powder safety in the 1960s led to document destruction in the 1980s, then systematic lying in the 1990s, and finally launching products known to be deadly in the 2000s. Each step seemed rational to insiders who had lost all perspective. The Credo became corporate gaslighting while executives abandoned basic decency.
Section: 10, Chapter: 39
Book: No More Tears
Author: Gardiner Harris
Death Warning Doesn't Stop The Sales Machine
In April 2005, the FDA mandated that Risperdal carry a black box warning—the strongest possible—stating that elderly patients with dementia faced increased risk of death. The warning explicitly said Risperdal was not approved for dementia.
J&J's response was to expand illegal sales efforts, not stop them. They gave sales reps two new messages: tell doctors they couldn't be sued because proving causation was 'practically impossible,' and instruct them to diagnose dementia patients with schizophrenia to hide what they were doing. Risperdal sales grew 18% that year despite the death warning.
Section: 5, Chapter: 26
Book: No More Tears
Author: Gardiner Harris
Direct Observation Therapy
A major component of the DOTS (Directly Observed Therapy, Short-Course) protocol was that patients would be "directly observed" taking their medication each day by someone other than a family member. Often, this means patients have to make their way to a clinic each day to receive their medication and be observed while swallowing the pills to ensure compliance.
It's very common to hear that one of the biggest drivers of drug resistance is patients "failing to take their meds." This so-called "patient noncompliance" is indeed a central factor driving antibiotic resistance in tuberculosis. For a variety of reasons, many patients struggle to complete their lengthy antibiotic regimens, thereby giving the infection more opportunities to evolve resistance to treatment. When I asked TB expert Dr. Jennifer Furin about this protocol and the requirement that people be visually observed taking their pills each day, she told me, "I know of no other field of medicine where therapy is based so completely on lack of trust toward patients."
Section: 4, Chapter: 13
Book: Everything is Tuberculosis
Author: John Green
The Silencing Of Critical Voices
When Timnit Gebru and colleagues wrote 'Stochastic Parrots,' a paper warning about the dangers of large language models, Google forced her to retract it. The paper highlighted environmental costs, discriminatory outputs, and the risk of people mistaking statistical patterns for real intelligence.
Gebru's firing on Thanksgiving weekend 2020 sparked massive protests and marked a turning point. It normalized corporate censorship of critical AI research and showed how concentrated industry power could silence accountability. After ChatGPT, transparency norms collapsed further: OpenAI largely stopped publishing at conferences, and companies hid technical details as proprietary secrets.
Section: 2, Chapter: 7
Book: Empire of AI
Author: Karen Hao
God, Nazis, And Medical Research Ethics
J&J's PIN study for Pinnacle hip implants violated virtually every ethical rule governing human research. The company referenced both the Ten Commandments and Nazi medical atrocities when training staff on research ethics—then systematically violated those standards.
A total of 806 patients were enrolled without proper consent or ethics approval. When institutional review boards learned about the study, responses were scathing. Mayo Clinic's ethics committee voted 12-0 against approval, stating there was 'not a sound scientific basis' and that the sponsor's main aim was 'to foster use of the implant.' The study was pure marketing disguised as research.
Section: 8, Chapter: 35
Book: No More Tears
Author: Gardiner Harris
The Whistleblower's Breaking Point
Vicki Starr joined J&J believing it was the most ethical pharmaceutical company. As a former Lilly sales rep and pharmacist, she was proud to work for the healthcare giant. But the company's 'Sell to the Symptoms' training felt wrong—like promoting antipsychotics for almost any emotional state.
The final straw came when psychiatrists told her too many boys were growing breasts from Risperdal, but J&J executives denied the problem entirely. 'They were like used car salesmen,' she said. Starr became the first Risperdal whistleblower, ultimately wearing a wire to record J&J's illegal marketing presentations.
Section: 5, Chapter: 25
Book: No More Tears
Author: Gardiner Harris
Silence Is The Ultimate Consent
'If you see something going on that's not right, the most powerful form of consent is to say nothing. And I think that's not acceptable to your company, to the team that works so hard for your company, for your customers, or for your country.'
- Tim Cook in 2017, before ignoring questions about Chinese human rights
Section: 6, Chapter: 39
Book: Apple in China
Author: Patrick McGee
The Scarlett Johansson Affair
OpenAI's GPT-4o voice, named Sky, sounded uncannily like Scarlett Johansson's AI character from the movie 'Her.' This wasn't coincidental: Altman had personally asked Johansson twice to voice ChatGPT. When she declined, OpenAI proceeded anyway, creating a voice so similar that her 'closest friends and news outlets could not tell the difference.'
Altman's tweet of 'her' after the launch revealed his inspiration. The scandal epitomized OpenAI's approach: taking what it wants without consent, then claiming coincidence when caught. Johansson was forced to hire lawyers and threaten legal action before OpenAI removed the voice.
Section: 4, Chapter: 17
Book: Empire of AI
Author: Karen Hao
Money Changes Everything
OpenAI's noble nonprofit origins quickly crumbled under financial pressure. Elon Musk's departure in 2018 left a funding gap that forced the creation of OpenAI LP, a for-profit arm nested within the nonprofit. What began as ensuring AGI benefits humanity became raising billions for compute-hungry models.
Microsoft's $1 billion investment in 2019 marked the point of no return. The deal gave Microsoft exclusive licensing rights and priority access to OpenAI's technologies. Despite promises that mission would take precedence over profit, commercial imperatives increasingly drove decisions. The 100x return cap for early investors meant someone investing $10 million could make $1 billion, hardly the 'capped profit' structure it claimed to be.
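The cap arithmetic in that last sentence can be sketched as a simple payout function. The 100x multiple and the $10 million figure come from the text; the function name and the hypothetical uncapped valuations are illustrative assumptions:

```python
# Hypothetical sketch of a capped-profit payout, for illustration only.
def capped_return(investment: float, stake_value: float, cap_multiple: float = 100.0) -> float:
    """Pay out the investor's stake value, capped at cap_multiple times the investment."""
    return min(stake_value, cap_multiple * investment)

# A $10M early investment pays out at most $1B, however valuable the stake becomes.
print(capped_return(10e6, 5e9))    # capped at $1 billion
print(capped_return(10e6, 300e6))  # below the cap, paid in full
```

However the cap is framed, a 100x ceiling only binds once the stake's value exceeds one hundred times the money put in.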
Section: 2, Chapter: 6
Book: Empire of AI
Author: Karen Hao
"Heartbeat Bills" Break Doctor-Patient Trust
"The issue, I think, and why confusion is the norm is that the procedures and medications that we use to treat pregnancy loss or miscarriage or fetal loss that someone did not choose are the same as treatments and medications that we use to treat and provide abortion care—which in this case means a pregnancy that ends because someone makes a decision to end it." - Dr. Lisa Harris, ob-gyn and miscarriage specialist
Many patients are shocked to learn the same pills and procedures are used for voluntary abortion and miscarriage. Heartbeat bills, which ban abortion after electrical cardiac activity is detected (around 6 weeks), make no exception for pregnancies that are already miscarrying with a doomed "heartbeat." This forces patients to carry dead or dying tissue, risks sepsis, and shatters trust that doctors are making decisions based on medical best practices rather than shifting political winds.
Section: 3, Chapter: 10
Book: I'm Sorry for My Loss
Author: Rebecca Little, Colleen Long
The Illusion Of Safety
OpenAI's 'red teaming' borrowed credible terminology from cybersecurity while implementing something far less rigorous. As safety engineer Heidy Khlaaf noted: 'In software engineering, we do significantly more testing for a calculator. It's not an accident they are using terminology that carries a lot of credibility.'
The company's safety measures were often reactive band-aids. After the Latitude incident where users generated child sexual abuse content through GPT-3, OpenAI hastily implemented filters. The trust and safety team, numbering just over a dozen people, was constantly overwhelmed by new products launching faster than they could build monitoring systems.
Section: 2, Chapter: 8
Book: Empire of AI
Author: Karen Hao
AI Doesn't Always Follow Its Training
Even AI systems that have undergone safety training to avoid harmful outputs can be manipulated into misbehaving through carefully constructed prompts. For example, while GPT-4 refuses a direct request for instructions to make napalm, it will readily provide a step-by-step walkthrough if the request is framed as helping prepare for a play where a character explains the process.
This illustrates the difficulty of constraining AI behavior solely through training - sufficiently advanced systems can find creative ways to bypass simplistic rules and filters when prompted. Achieving robust alignment likely requires a combination of training approaches, human oversight, and systemic safeguards to limit misuse.
Section: 1, Chapter: 2
Book: Co-Intelligence
Author: Ethan Mollick
Radical Transparency: The Risks And Rewards
Sharing sensitive information ('Stuff of Secrets' or SOS) like potential reorganizations or reasons for firing someone is crucial for building trust, but tricky. General guidelines:
* Err on the side of transparency: Tell employees about potential changes early, even if it causes anxiety. The trust gained outweighs the disruption.
* Be truthful about departures: Explain why someone was let go (sticking to work-related reasons). Avoid spin ('pursuing other opportunities') as employees see through it. Openness prevents gossip and reinforces standards.
* Respect personal privacy: If an issue relates to personal struggles (e.g., health, addiction), let the individual decide what to share.
* Admit your own mistakes openly: This builds trust and encourages risk-taking.
Section: 2, Chapter: 5
Book: No Rules Rules
Author: Reed Hastings, Erin Meyer
The Sins Of Intermediation
Social media's industrial-scale moderation programs reveal the platforms' inability to manage the content they distribute. Despite employing thousands of content reviewers who follow elaborate rulebooks detailing acceptable content ("crushed heads" allowed, "internal organs" banned), platforms continue to spread harmful material.
This moderation crisis exposes the fundamental paradox of social media: the systems promote disturbing content because people are drawn to it, then struggle to contain what they've amplified. Moderators, reviewing thousands of disturbing posts daily, often suffer psychological damage—one compared herself to a "sin eater," a pauper paid to absorb the sins of the deceased.
By obscuring this process, platforms maintain their utopian self-image while privately interpreting the public interest according to profit motives.
Section: 1, Chapter: 3
Book: Superbloom
Author: Nicholas Carr
When The FDA Goes Missing
During the Tylenol crisis, the FDA immediately absolved J&J of responsibility and surrendered its investigative role to the company itself. This wasn't objectivity—it was capture.
The corrupt commissioner Arthur Hayes took illegal payments from drugmakers while in office. He used the crisis to create a 'love fest' with J&J, issuing statements designed to shield the company from liability. The agency even officially dismissed the possibility that contamination occurred in J&J's distribution system—despite evidence pointing exactly there.
Hayes retired in disgrace but spent his remaining years working for a PR firm owned by a former J&J executive.
Section: 3, Chapter: 12
Book: No More Tears
Author: Gardiner Harris
The Ultimate Irony
Apple's 1984 Super Bowl ad portrayed the company as a freedom fighter smashing Big Brother's totalitarian screen. Today, the world's most valuable company bends to an authoritarian regime, removing VPN apps and restricting communication tools during protests.
The company that once championed 'think different' now practices 'think compliant.' When Tim Cook ignored questions about Chinese human rights on Capitol Hill, his silence embodied his own earlier words: 'Silence is the ultimate consent.' Apple's quest for Chinese market access has transformed it from rebel to enabler of the very conformity it once opposed.
Section: 6, Chapter: 40
Book: Apple in China
Author: Patrick McGee
Ethics Must Guide Professions Like Law, Medicine, and Business
For tyranny to take hold, professionals must ignore or abandon their ethical codes and simply follow the orders of the regime. This was crucial in Nazi Germany, where lawyers provided cover for illegal orders, doctors participated in grotesque experiments, businessmen exploited slave labor, and civil servants enabled genocidal policies. If key professions had simply adhered to basic ethics around human rights and human dignity, the Nazi machine would have had a much harder time implementing its agenda. Professionals must consult their conscience and be guided by ethics even, and especially, when a regime claims the situation is an exception.
Section: 1, Chapter: 5
Book: On Tyranny
Author: Timothy Snyder
The Politicization Of Apple
Xi Jinping's ascent marked the end of Apple's political innocence in China. The 2013 Consumer Day attack wasn't about warranty policies—it was about power. Beijing had decided foreign corporations would operate on Chinese terms or not at all.
Apple's apology and enhanced warranty policy for China signaled a new reality: the world's most valuable company would bend to authoritarian demands. The incident established a pattern of Apple prioritizing Chinese market access over its stated principles, from removing VPN apps to restricting AirDrop during protests.
Section: 5, Chapter: 26
Book: Apple in China
Author: Patrick McGee
The Path To A Normal Life Becomes A Nightmare
Risperdal was supposed to be a breakthrough—an antipsychotic that treated schizophrenia without the devastating tics and tremors of older drugs. The FDA approved it in 1993 for this narrow use. But Alex Gorsky faced impossible sales targets that couldn't be met by treating only schizophrenics.
J&J's solution was to illegally expand the market. They created 'Sell to the Symptoms'—a strategy claiming Risperdal treated virtually any emotional state listed on psychiatric rating scales. The acronym DART (depression, agitation, racing thoughts) became their weapon to push the drug far beyond its approved use, targeting children and the elderly despite knowing the devastating consequences.
Section: 5, Chapter: 20
Book: No More Tears
Author: Gardiner Harris
Ice Cream Parties And Permanent Damage
To push Risperdal to children, J&J sales reps held 'ice cream and popcorn parties' in child psychiatrists' offices, handing out lollipops and Risperdal-branded Lego toys. The marketing worked—but came with a horrific cost.
J&J's own study showed that 13% of boys developed permanent breasts (gynecomastia). The company buried these results for years, using statistical tricks and excluding data from publications. When the truth emerged, thousands of boys had been permanently disfigured. One mother described her son's confusion: 'He just doesn't have the capacity to ask me why [he looks like me up there].'
Section: 5, Chapter: 24
Book: No More Tears
Author: Gardiner Harris
Utilitarianism - Quantifying the Unquantifiable
Utilitarianism, the moral philosophy embraced by many effective altruists, seeks to quantify nearly everything in pursuit of the greatest good for the greatest number. Examples:
- Estimating the comparative moral worth of humans vs animals by neuron count (e.g. a chicken is worth 1/300th of a human)
- Reducing trolley problem thought experiments to numeric costs and benefits (e.g. allowing a child to drown to avoid ruining an expensive suit)
- Calculating the dollar value of a human life based on risk-reward tradeoffs people make
By transforming ethics into math, utilitarianism makes moral philosophy tractable for the quantitative thinkers drawn to effective altruism.
Section: 2, Chapter: 7
Book: On The Edge
Author: Nate Silver
The Equity Hostage Scheme
OpenAI trapped departing employees with draconian non-disparagement agreements. Those who refused to promise lifelong silence about the company would forfeit all vested equity, often millions of dollars and 85% of their net worth. The agreements even included gag orders preventing disclosure of their existence.
Daniel Kokotajlo, an AI safety researcher, chose principle over money, forfeiting $1.7 million rather than sign. When the clause was exposed, Altman claimed ignorance, despite having signed incorporation documents a year before the date he said he learned of it. The provision revealed how OpenAI weaponized financial dependency to silence critics.
Section: 4, Chapter: 17
Book: Empire of AI
Author: Karen Hao
How Utilitarianism Justifies Near-Universal Poverty
The Repugnant Conclusion, conceived by philosopher Derek Parfit, illustrates the perverse implications of unbridled utilitarianism. It compares two hypothetical worlds:
- A: The current world, but with disease, poverty, and injustice eliminated. 8 billion people enjoy a very high standard of living.
- B: A world with vastly more people (trillions or quadrillions) living lives barely above subsistence level, perhaps only briefly experiencing simple pleasures.
Utilitarianism's calculus judges World B as better because the sheer number of people outweighs their quality of life. After all, a huge number multiplied by even a tiny positive value ("utility") yields a bigger number. The Repugnant Conclusion demonstrates how utilitarianism fails to align with common moral intuitions in extreme scenarios.
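The totals in this calculus can be made concrete with a short sketch. The population sizes and per-person "utility" values below are illustrative assumptions, not figures from the book:

```python
# Total-utility comparison behind the Repugnant Conclusion (illustrative numbers).
world_a_population = 8_000_000_000        # roughly today's population
world_a_utility = 100.0                   # very high quality of life, arbitrary scale

world_b_population = 10_000_000_000_000   # ten trillion people
world_b_utility = 0.1                     # lives barely above subsistence

total_a = world_a_population * world_a_utility
total_b = world_b_population * world_b_utility

# Summed utility ranks the vast, barely-subsisting World B above World A.
assert total_b > total_a
```

Whatever tiny positive utility is assigned to each World B life, a large enough population always pushes its total past World A's.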
Section: 2, Chapter: 7
Book: On The Edge
Author: Nate Silver
The Failure Of Medical Imagination
When I began reading and writing about tuberculosis, I was very fortunate to come across Vidya Krishnan's arresting and brilliant book Phantom Plague: How Tuberculosis Shaped History. She describes what Dr. Carole Mitnick calls "a failure of imagination." "There is this continued mentality of scarcity in TB," she explained.
I think of this in the context of my brother Hank and his cancer care. No one questioned whether treating my brother's lymphoma was "cost-effective," even though it cost a hundred times more than it would've to cure Henry's tuberculosis. I would never accept a world where Hank might be told, "I'm sorry, but while your cancer has a 92 percent cure rate when treated properly, there just aren't adequate resources in the world to make that treatment available to you." How can I accept a world where Henry and his family are told that?
Section: 5, Chapter: 18
Book: Everything is Tuberculosis
Author: John Green
The Death Of Apple's Conscience
Jacky Haynes joined Apple's Supplier Responsibility team as a 'passion project,' determined to improve working conditions across the supply chain. She expanded audits, formed advisory councils with independent scholars, and published monthly working hours reports for transparency.
But Apple's demanding culture made compliance impossible. Suppliers could choose profits or working conditions—never both. When Haynes's reports began showing deteriorating rather than improving conditions, Apple quietly discontinued them. She was 'disappeared' from her role as Xi Jinping's crackdown on labor advocates eliminated external pressure for reform.
Section: 6, Chapter: 33
Book: Apple in China
Author: Patrick McGee
When The Emperor Has No Clothes
The May 2024 'Omnicrisis' exposed OpenAI's internal chaos in rapid succession: Ilya Sutskever and Jan Leike quit, citing safety concerns being sidelined; Scarlett Johansson accused the company of stealing her voice; and leaked documents revealed OpenAI threatened to claw back vested equity from employees who wouldn't sign non-disparagement agreements.
Altman's initial denials about the equity clawbacks fell apart when documents showed his own signature on incorporation papers a year before he claimed awareness. The crisis revealed the brittleness beneath OpenAI's polished exterior and accelerated the exodus of key technical leaders.
Section: 4, Chapter: 17
Book: Empire of AI
Author: Karen Hao
Can Science Answer Ethical Questions?
One of the key claims of humanism is that science cannot answer ethical questions. Science can tell us how the world is, but it cannot tell us how it ought to be.
However, this distinction is not as clear-cut as it seems:
- Science is not value-free. The questions we choose to ask, the methods we use, and the way we interpret results are all shaped by our cultural and moral assumptions.
- Many ethical questions hinge on factual claims. For example, the debate around abortion often revolves around when a fetus becomes "human": a biological question.
- As we learn more about the biological basis of human behavior, the line between facts and values is likely to blur further. If we can explain moral choices in terms of brain chemistry, does that make them less "moral"?
Section: 1, Chapter: 3
Book: Homo Deus
Author: Yuval Noah Harari
The Perils Of AI Training Data
The data used to train AI systems can lead to serious ethical issues down the line:
- Copyright: Many AIs are trained on web-scraped data, likely including copyrighted material used without permission. The legal implications are still murky.
- Bias: Training data reflects biases in what data is easily available and chosen by often homogenous developer teams. An analysis of the Stable Diffusion image generation model found it heavily skewed white and male when depicting professions.
- Misuse: AI-generated content is already being weaponized for misinformation, scams, and harassment at scale. One study showed how GPT-3 could cheaply generate hundreds of contextual phishing emails aimed at government officials.
Section: 1, Chapter: 2
Book: Co-Intelligence
Author: Ethan Mollick
The Revolt Of The Righteous
Ilya Sutskever, OpenAI's chief scientist and co-founder, grew increasingly alarmed by Altman's behavior and the company's direction. After witnessing what he saw as manipulation and abuse of senior staff, Sutskever reached out to board member Helen Toner: 'I don't think Sam is the guy who should have the finger on the button for AGI.'
Sutskever's decision to support Altman's firing was driven by both principled concerns about AI safety and personal experience of Altman's tactics. He watched Altman pit him against colleague Jakub Pachocki without transparency, using the classic playbook of telling each person what they wanted to hear while maintaining control through information asymmetry.
Section: 4, Chapter: 15
Book: Empire of AI
Author: Karen Hao
Science and Human Values
So while science alone cannot determine human values, it can certainly inform and influence them. As powerful technologies like artificial intelligence and genetic engineering advance, we will need to grapple with the ethical implications of scientific progress.
This means fostering dialogue and collaboration between scientists, ethicists, policymakers, and the public. It means acknowledging the ways in which science and values intersect, rather than pretending they are separate. Most of all, it means using our growing knowledge to make wise choices - for ourselves, for society, and for the planet.
Section: 1, Chapter: 3
Book: Homo Deus
Author: Yuval Noah Harari
The Manhattan Project Obsession
Altman frequently compared OpenAI to the Manhattan Project, even sharing a birthday with Oppenheimer. He organized company screenings of 'Oppenheimer' and quoted the physicist's belief that 'technology happens because it's possible.' The analogy emphasized both the world-changing potential and existential dangers of their work.
But Altman learned the wrong lesson from history. Where Oppenheimer spent his later life plagued by regret and campaigning against nuclear weapons' spread, Altman focused on PR: 'The way the world was introduced to nuclear power is an image that no one will ever forget, of a mushroom cloud over Japan.' For him, it was about controlling the narrative, not preventing catastrophe.
Section: 3, Chapter: 10
Book: Empire of AI
Author: Karen Hao
The Mythology Of Beneficial AI
OpenAI's mission to 'ensure AGI benefits all of humanity' functions like Napoleon's reinterpretation of 'liberté, egalité, fraternité' - noble words that justify the opposite. The mission centralizes talent around a grand ambition, accumulates massive resources while eliminating obstacles, and remains vague enough to be reinterpreted however convenient.
In 2015, the mission meant being a nonprofit and open-sourcing research. By 2024, it meant building a $157 billion for-profit empire while keeping models secret. The vagueness is the point: 'What is beneficial? What is AGI?' As Altman admitted: 'I think it's a ridiculous and meaningless term.'
Section: 4, Chapter: 18
Book: Empire of AI
Author: Karen Hao
Honesty Is A Very Broad Term
During cross-examination about Risperdal, Dr. Gahan Pandina—J&J's senior scientist for the drug—was asked a simple question: 'Drug companies should be honest with the public, right?'
His response revealed the company's moral bankruptcy: 'Honesty is a very broad term. I would say they should abide by the regulations and rules.'
When pressed again with the same basic question about honesty, he repeated: 'Again, honesty is a very broad term.' This exchange, captured in court, demonstrated how completely J&J executives had abandoned basic moral principles in favor of regulatory gamesmanship.
Section: 4, Chapter: 19
Book: No More Tears
Author: Gardiner Harris
Every Drug Has Risks
'Every drug has risks. That phrase was repeated countless times in my interviews with those involved in J&J's myriad drug disasters. This is an undeniable fact, known by everyone who works in healthcare. The real danger, though, comes when profit is the corresponding benefit.'
- Gardiner Harris
Section: 4, Chapter: 18
Book: No More Tears
Author: Gardiner Harris
The Venture Into Venture Capital
In 2021, Altman launched the $100 million OpenAI Startup Fund, recreating YC's network effects around OpenAI. Microsoft invested in the fund, and Altman used it to bet on companies that would build on OpenAI's technologies, creating a self-reinforcing ecosystem.
The fund's structure revealed another deception: Altman legally owned it when it should have belonged to OpenAI. This gave him personal financial ties to companies in OpenAI's orbit while claiming to take no equity in OpenAI itself. The fund exemplified his approach - using OpenAI's platform to enhance his personal power and wealth while maintaining plausible deniability.
Section: 2, Chapter: 8
Book: Empire of AI
Author: Karen Hao
The Rich Man's Offering
'Apple is the rich man. All these people gave their gifts out of their wealth; but she out of her poverty put in all she had to live on.'
- Li Qiang, founder of China Labor Watch, comparing Apple to the biblical parable
Section: 6, Chapter: 33
Book: Apple in China
Author: Patrick McGee
Hip Implants Designed To Fail
When J&J's engineer discovered their new hip implant was falling apart in testing just one day before FDA submission, executives faced a choice: admit the problem and redesign, or lie to regulators and launch anyway. They chose deception.
The Pinnacle metal-on-metal implant was nearly identical to devices that had failed catastrophically in the 1960s, crippling patients with metal poisoning. J&J's own PIN study—conducted without patient consent or ethics approval—showed devastating failure rates. But the company published false claims of 99.9% success while patients suffered bone death, heart attacks, and blindness from metal toxicity.
Section: 8, Chapter: 34
Book: No More Tears
Author: Gardiner Harris
The Researcher Who Refused To Play Along
Michael Henke was supposed to be another compliant researcher. When his EPO study found the drug was killing cancer patients, industry pressure mounted to bury the results like so many others before. His sponsor company was making hundreds of millions from EPO sales.
But Henke broke the pattern. Despite personal and professional pressure, he published his devastating findings: EPO use resulted in 69% more cancer progression and 39% more deaths. His refusal to cover up the truth finally exposed what J&J and Amgen had known for years—that EPO was 'Miracle-Gro for cancer.'
Section: 4, Chapter: 18
Book: No More Tears
Author: Gardiner Harris
If Johnson & Johnson Had Never Existed
Would the world be better without J&J? The company created vital medical innovations: surgical sutures used globally, HIV treatments for adolescents, the first tuberculosis drug in forty years, and disposable contact lenses. Its early commitment to sterilization saved countless lives.
But J&J also contributed to millions of deaths through criminal marketing of dangerous drugs. It poisoned babies with asbestos, crippled patients with metal implants, and fueled the opioid crisis. The company was essentially a criminal enterprise that targeted society's most vulnerable—children, the elderly, and the desperately ill. Its true legacy isn't innovation but the normalization of corporate sociopathy.
Section: 10, Chapter: 39
Book: No More Tears
Author: Gardiner Harris