AI’s 1929 Moment: Building HART for Healthcare Safety

Nov 11, 2025 | Artificial Intelligence, Healthcare Technology

In the final days of the Roaring Twenties, optimism ruled the markets. Factories were booming, credit was cheap, and new technologies — radio, automobiles, and electrification — were transforming life at a dizzying pace. Everyone, it seemed, wanted a share of the future. Then came the crash of 1929.

Financial journalist Andrew Ross Sorkin, in 1929: Inside the Greatest Crash in Wall Street History — and How It Shattered a Nation (2025), reminds us that speculation itself wasn’t the problem. What destroyed confidence and livelihoods was the absence of guardrails — no disclosure rules, no oversight, no understanding of how human psychology amplifies collective risk. After the collapse, America didn’t outlaw speculation; it re-engineered it. The creation of the SEC, the FDIC, and new disclosure requirements didn’t slow capitalism — it saved it. By institutionalizing accountability, we transformed chaos into the foundation of the American Century.

Innovation Without Restraint Is Not Progress

Today, healthcare stands on its own speculative frontier: artificial intelligence. Venture capital pours in. Start-ups promise miracles. Large systems deploy algorithms faster than they can be validated. Like investors in 1929, we believe our new technology can only rise. But speculation without oversight breeds fragility. A misfiring clinical algorithm or a biased predictive model could spark a crash in healthcare AI — one measured not in dollars, but in lives and trust.

Speculation drives discovery; governance sustains it. The reforms that followed 1929 prove that rules can coexist with growth. Without them, growth consumes itself.

The Psychology of the Boom

Sorkin’s research highlights a truth that transcends eras: markets — and now the adoption of machine learning — are governed by human emotion. In 1929, exuberance blinded investors. In 2025, optimism about AI’s promise can do the same. We reward speed over scrutiny, innovation over introspection. The lesson isn’t to stop building — it’s to recognize that optimism without structure leads to collapse.

Just as President Hoover and Senator Carter Glass, who later co-authored the Glass-Steagall Act to repair the damage, ignored early warnings before the 1929 crash, today’s healthcare leaders risk repeating their mistake. When every vendor claims “FDA-grade AI” without evidence and clinicians adopt systems they barely understand, the line between innovation and recklessness blurs. We must act before the headlines force us to.

From Regulation to Collaboration

Traditional regulation alone cannot keep pace with the velocity of AI. Government processes that once reviewed drugs and devices over years must now assess algorithms that update weekly. We need a governance model that blends public accountability with private-sector agility.

That is why I propose the creation of HART — Healthcare AI Review & Transparency — a national public–private partnership designed to evaluate every form of healthcare AI. HART would operate as a clearinghouse, ensuring safety, fairness, and real-world effectiveness while enabling rapid innovation.

Building HART: A Public–Private Partnership for AI Accountability

Public–private partnerships (P3s) are a proven American mechanism for balancing innovation and oversight. They unite government, academia, and industry under shared goals, combining the speed of enterprise with the stability of public institutions.

Consider the Biomedical Advanced Research and Development Authority (BARDA), which accelerates vaccine and countermeasure development through biotech collaborations; the Fixing America’s Surface Transportation (FAST) Act, which mobilizes private investment to modernize national infrastructure; and the Water Infrastructure Finance and Innovation Act (WIFIA), which pairs federal support with private funding to rebuild essential water systems.

Each demonstrates that collaboration — not control — produces sustainable progress. HART would bring that same model to healthcare AI.

What HART Would Do

HART would serve as a national clearinghouse evaluating AI systems both before and after deployment. Its mandate would extend across clinical, operational, administrative, and population-health applications. It would:

  • Conduct continuous performance reviews using standardized metrics for safety, bias, and transparency.
  • Publish results in a public registry accessible to clinicians, developers, and patients.
  • Coordinate post-deployment monitoring to ensure algorithms perform safely across populations and institutions.
  • Recommend corrective action or suspension when AI tools demonstrate unacceptable bias or drift (a minimal sketch of such a check follows this list).
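
To make these review criteria concrete, the sketch below (in Python) shows one way a clearinghouse-style check might quantify a subgroup performance gap (a simple bias signal) and a shift in a model’s positive-prediction rate (a simple drift signal), then flag the tool for corrective review. The metric choices, thresholds, and report format are illustrative assumptions for this sketch, not HART specifications.

```python
# Illustrative sketch only: one possible post-deployment check a HART-style
# reviewer might run on a binary clinical prediction model. The specific
# metrics and thresholds below are assumptions, not HART requirements.
from dataclasses import dataclass


@dataclass
class GroupStats:
    group: str  # e.g., a demographic or site-level subgroup
    tp: int     # true positives
    fp: int     # false positives
    tn: int     # true negatives
    fn: int     # false negatives


def sensitivity(s: GroupStats) -> float:
    # Share of true cases the model caught within this subgroup.
    return s.tp / (s.tp + s.fn) if (s.tp + s.fn) else 0.0


def review(groups: list[GroupStats],
           max_sensitivity_gap: float = 0.05,
           baseline_positive_rate: float = 0.12,
           max_drift: float = 0.03) -> dict:
    """Flag a deployed model when subgroup performance diverges (bias)
    or when its positive-prediction rate drifts from the rate observed
    at evaluation time (drift). Thresholds are illustrative."""
    sens = {g.group: sensitivity(g) for g in groups}
    gap = max(sens.values()) - min(sens.values())

    predicted_pos = sum(g.tp + g.fp for g in groups)
    total = sum(g.tp + g.fp + g.tn + g.fn for g in groups)
    positive_rate = predicted_pos / total if total else 0.0
    drift = abs(positive_rate - baseline_positive_rate)

    return {
        "sensitivity_by_group": sens,
        "sensitivity_gap": gap,
        "positive_rate_drift": drift,
        "action": ("recommend corrective review"
                   if gap > max_sensitivity_gap or drift > max_drift
                   else "continue monitoring"),
    }


# Example: two subgroups with noticeably different sensitivity trigger a flag.
report = review([GroupStats("A", tp=90, fp=30, tn=850, fn=30),
                 GroupStats("B", tp=60, fp=25, tn=870, fn=45)])
print(report)
```

In practice, HART reviewers would standardize far richer metrics and evidence requirements than this toy example. The point of the sketch is simply that the review criteria can be made explicit, measurable, and publishable in a public registry.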

Who Would Lead It

Diversity of expertise — not bureaucracy — must define this partnership. HART’s governing board and review panels would include:

  • Clinicians and healthcare administrators providing real-world insight.
  • Data scientists, engineers, and AI researchers ensuring technical rigor.
  • Ethicists and patient advocates reinforcing fairness and accountability.
  • Economists and policy experts analyzing cost and access implications.
  • Representatives from the FDA and ONC serving as liaisons to align standards without slowing innovation.

This diversity of knowledge, experience, and interests would prevent groupthink and ensure every AI system is viewed through multiple professional and moral lenses.

Transparency as a Catalyst for Trust

Transparency is the cornerstone of HART’s mission. By publishing evaluations openly, the clearinghouse would empower healthcare organizations to make evidence-based deployment decisions. Developers would gain credibility through verified performance data. Patients would regain confidence that the algorithms influencing their care meet shared ethical and scientific standards.

Speculate Responsibly

Speculation built the American economy — and it drives innovation in medicine today. But unchecked, it can destroy the very systems it seeks to improve. Healthcare cannot afford an AI crash of 1929 proportions. The time to act is now, before failure forces reform upon us.

HART — Healthcare AI Review & Transparency — offers a path forward. It joins the best of public oversight with the creativity of private enterprise. It enables us to innovate quickly while remaining accountable to the people we serve.

The goal is not to stop building. It is to build wisely — together — and ensure that this time, the crash never comes.

Reference

Sorkin, A. R. (2025). 1929: Inside the greatest crash in Wall Street history — and how it shattered a nation. Viking.
