Getting paid for the job they are not doing
What is wrong with our doctors today? Seems to me they are more concerned about getting paid than making sure their patients are healthy and taken care of.
You are sick and go to the hospital emergency department, and what happens? They make you wait for hours; then, when you finally get into the ER, you wait another hour or so, just to have them tell you nothing is wrong and send you home.
When my son first became ill, we spent the better part of 2 weeks in the hospital emergency waiting room, and they always sent us home. They would take a look at him, take his temperature, look in his ears and throat, say he had a virus, then send us home. We made countless trips to the same hospital, only to be told the same thing over and over again. They told me I was just a paranoid first-time mother.
Two weeks later my son was in the hospital again, but this time they decided to admit him. They did nothing for him that night but look in on him now and then. The next morning they took him for tests. Only when he stopped breathing did they finally do something about it. He was diagnosed with cancer (a brain tumor), and he passed away in 1997.
Out of all the doctors my son saw in the ER, only one came to apologize for not listening.
Reminds me of a movie I once saw about a doctor who treated his patients badly. He ended up diagnosed with cancer himself, and the doctors he saw treated him the very same way. He took a long look at his life to see the mistakes and misjudgements he had made. If I remember the name of the movie, I will post it.
We put the lives of our children in the hands of doctors, expecting that they will find out what's wrong and treat it accordingly, but this is seldom the case. A little understanding and compassion would be nice.
Just once it would be nice to hear, "Everything is going to be okay; we will look after you."
I don't trust doctors anymore, and Heaven forbid my other 2 children get sick in any way. But I'll tell you one thing: if they ever do, someone had better damn well listen to me!