RightCare
How the American Health Care Business Turned Patients into Consumers
By Philip Caper, MD | March 24, 2015
A clash of cultures is rapidly developing between those of us who see the mission of the health care system as primarily the diagnosis and healing of illness and those who see it primarily as an opportunity to create personal wealth. The concept of health care primarily as a business is uniquely American, and it has gained ascendancy during the last few decades. While there have always been a few greedy doctors, businessmen and wealth-seekers — not doctors — now dominate the medical-industrial complex.