
The modern university–industry partnership was not an accident. It was built.
In the early twentieth century, the Rockefeller Foundation poured millions into medical schools such as Johns Hopkins and the University of Chicago. These schools became the model of modern medicine: laboratory-centered, science-heavy, and focused on measurable outcomes. Standards rose, but something else happened too. Entire schools of thought were cut away.
The Flexner Report of 1910, commissioned by the Carnegie Foundation and implemented with Rockefeller money, declared many medical schools unfit. Dozens of institutions serving women, Black physicians, and practitioners of non-allopathic medicine were closed. The result was a system that elevated laboratory-based science while marginalizing prevention, nutrition, and holistic traditions. A new medical order was created, one aligned with commercial science and built on exclusion.
By the mid-twentieth century, partnerships deepened. Universities became preferred sites for clinical trials. At the University of Pennsylvania and Harvard, industry-funded trials on antibiotics and other drugs became common. Patients in university hospitals sometimes gained access to promising therapies, but negative results were often hidden or delayed. Universities learned to balance their dual roles: teachers of students and contractors for industry.
The Bayh–Dole Act of 1980 shifted the foundation once again. Before the law, title to discoveries funded with taxpayer dollars generally stayed with the federal government. After it, universities could patent and license those discoveries themselves.
Stanford University became a showcase of the new system. The recombinant DNA technique developed by Stanley Cohen and Herbert Boyer was licensed widely under the new regime. By the time the patents expired in 1997, they had generated more than $250 million for the university and helped launch entire industries in biotechnology. Stanford’s campus became not only a place of learning but also the seedbed for Silicon Valley biotech.
The benefits were clear. Innovation accelerated. Start-ups grew around campuses. Patients gained faster access to new therapies. Universities survived periods of declining public support. But costs were also clear. Professors became entrepreneurs. Laboratories were viewed not only as places of discovery but as engines of commercialization. Disputes arose over patents, royalties, and access. In some cases, patients were caught in lawsuits where universities defended profits over availability.
By the 2000s, financial ties were no longer hidden. A national survey published in JAMA in 2007 reported that 60 percent of medical school department chairs personally had financial ties to industry, and that two-thirds of departments maintained institutional partnerships.
At Harvard Medical School, students protested in 2009 after discovering that professors had failed to disclose millions in industry payments. Their complaint was simple: their education was being shaped by financial interests they had not been told about. The administration tightened disclosure rules, but the financial structures remained.
At the Cleveland Clinic, then-CEO Dr. Delos “Toby” Cosgrove sat on the boards of major device companies while overseeing research and education. The institution became a leader in cardiac innovation, but the blurred lines raised questions. Was education being guided by science alone, or by commercial interest?
Students absorbed these lessons. Departments tied to commercial funding grew strong, while those without sponsors struggled. Prevention and public health were underfunded. The hidden curriculum was clear: prestige followed the money.
Continuing medical education, or CME, was meant to be a safeguard, a way to ensure physicians kept up with new science. Instead, it became another channel of influence.
In 2008, a Senate Finance Committee investigation revealed that Harvard, Stanford, and other schools had accepted millions from industry to fund CME programs. The investigation found that companies had sometimes influenced speaker selection and course content. Senators concluded that CME had reached “an unacceptable level of influence.”
Reforms followed, but industry money did not disappear. In 2023, CME providers nationwide reported $858 million in commercial support, nearly one quarter of their revenue, plus $668 million from exhibits and advertising. Courses tied to high-profit therapies were widely available and polished. Courses on prevention or low-cost interventions were rare.
Doctors relied on CME to stay licensed and informed, but much of what they learned came through programs influenced by industry sponsors. Subtle framing mattered: benefits were emphasized, risks minimized, and alternatives overlooked.
The Physician Payments Sunshine Act of 2010 created the Open Payments database, which began collecting data in 2013. For the first time, financial transfers from companies to physicians and teaching hospitals were made public.
The case of Dr. Joseph Biederman at Harvard illustrated both the importance and the limits of transparency. Biederman, a child psychiatrist, received millions from pharmaceutical companies while promoting the use of antipsychotics in children. When his ties were revealed, they caused outrage. Yet the influence had already shaped prescribing practices. Disclosure came too late to undo the effects.
In 2023, companies reported $12.75 billion in payments nationwide. In 2024, the figure climbed to $13.18 billion. Some of these were modest meals or travel reimbursements. Others were multimillion-dollar consulting contracts. Transparency revealed the scale, but it did not end the practice. In many cases, it normalized it. Patients grew accustomed to seeing financial ties, and universities treated disclosure as compliance rather than reform.
Behind every partnership sits politics. Industry lobbying ensured that laws, regulations, and funding streams favored commercial interests. In 2022 alone, pharmaceutical and health product companies spent more than $372 million on lobbying, more than any other industry.
The Prescription Drug User Fee Act of 1992 provides an example. It allowed the FDA to collect fees from industry to fund drug reviews. The benefit was speed. Patients gained faster access to therapies. But dependence followed. By 2019, more than 40 percent of the FDA’s drug review budget came directly from industry. The regulator became reliant on those it was supposed to regulate.
Universities gained too. Stronger patent laws raised the value of their intellectual property. Larger NIH budgets, shaped by lobbying, meant more grants. But lobbying also directed those grants toward commercially promising fields.
The University of Pennsylvania faced lawsuits after the death of Jesse Gelsinger in a 1999 gene therapy trial. Investigators had financial stakes in the therapy being tested. Regulators, universities, and sponsors were all implicated in a system where money blurred judgment. It was a tragedy that revealed how political and financial capture could cost lives.
It is important to acknowledge what was gained. Stanford’s recombinant DNA licensing helped launch biotechnology. Harvard’s research contributed to breakthroughs in cardiology and oncology. Johns Hopkins advanced public health science that saved lives worldwide.
Patients received access to therapies that might otherwise have remained in laboratories. Universities gained stability during decades of declining public support. Entire fields of medicine advanced more quickly than they would have under a purely public system.
But much was lost. Harvard’s CME controversies revealed the vulnerability of continuing education. Stanford’s biotechnology success showed how easily priorities shifted toward profit. Johns Hopkins’ Rockefeller roots revealed how philanthropy could elevate some traditions while silencing others.
Rare diseases, preventive medicine, and low-cost therapies were neglected. Industry-funded trials were more likely to produce favorable outcomes, while negative results disappeared. Students were educated in systems where prestige meant revenue. Patients began to question whether their doctors were learning medicine or marketing.
The credibility of universities, once their greatest strength, was weakened by the very relationships that sustained them.
Universities live in two worlds. They are educators, trusted to train the next generation of doctors. They are also entrepreneurs, expected to commercialize discoveries and sustain themselves through partnerships.
Case studies make the paradox plain. Stanford’s DNA patents fueled biotechnology but tied universities to profit. Harvard’s CME controversies showed how easily education could be bent. The Cleveland Clinic revealed how leadership could be entangled with corporate boards. The University of Pennsylvania’s gene therapy tragedy showed how conflicts of interest could endanger lives.
Industry partnerships brought innovation, but they also entrenched conflicts. Public dollars created discoveries, but private incentives redirected them. Transparency exposed dependence but often normalized it. Politics kept the system funded but also entrenched capture.
The seeds of progress were also the seeds of compromise. They were planted more than a century ago, nurtured by philanthropy, and fed by laws, money, and politics until they grew into a system where independence is no longer assured.
Food for thought: If universities cannot return to pure independence, what safeguards can ensure they remain faithful to their public mission? Could prestige be redefined not by patents and licensing revenue, but by advances in prevention, public trust, and equity? What reforms would make it clear that medicine is taught and practiced as a service to people rather than as an extension of the market?
Our Next Exposing the Beast Blog:
Day 3: When Marketing Wears a Lab Coat.