Health insurance companies have been growing in influence since the professionalization and centralization of American healthcare in the early 20th century.
By the 1970s, health insurance companies had begun challenging other corporate interests for greater economic and political power over the medical industry.
The largely unrestrained influence that health insurance companies now exert over medicine has driven healthcare…
