Monday, May 30, 2011

The Politics of Medicine Becoming More Liberal


Medical doctors' politics shift when they become hospital employees. As more women enter the profession and more doctors work for hospitals that provide their malpractice insurance, physicians become less conservative and less oriented toward the attitudes typical of small-business owners.
