I know this is a novel idea, but I really think we should take the politics out of health care. When we are worried about our health, we go to the doctor. Do we ask our doctor which political party they belong to? No, because it is not relevant. Do health ailments discriminate against certain political stances? No. If I could become healthy by switching political parties, I would happily do it.
So why have health and health care become political issues? The recent Komen/Planned Parenthood brouhaha has brought this question to the forefront. No one's access to health care should be compromised based on politics. National health reform is tainted by this as well. The purpose of national health care reform is to make health care accessible and affordable to all. That's it. The end result should be saved lives. I don't have a problem with that. People who are alive are happy people.
But why do politics need to get involved in health care? In the case of the Komen/Planned Parenthood PR disaster, accusations are flying regarding people's stance on abortion, which is a political hot button. From Planned Parenthood's website:
"For more than 90 years, Planned Parenthood has promoted a commonsense
approach to women’s health and well-being, based on respect for each
individual’s right to make informed, independent decisions about health,
sex, and family planning."
Only 3% of their services involve abortion, which is THE issue here; that means 97% of their services are unrelated to abortion. (I personally don't care what you think about abortion, because that is not the issue here.) In my opinion, this is about accessible women's health care. Women of all political beliefs, races, creeds, religions, heights, weights, and ages need access to health care. If they can't afford it, they need resources to help them. When access to health care is politicized, women are denied it. How much sense does that make? It doesn't.
So let's get the politics out of health care.