Monday, September 12, 2011

I heard that in Japan doctors don't tell people they have cancer. Is there anywhere else with the same belief?

Question:

I heard that in Japan doctors don't tell people they have cancer. Is there anywhere else with the same belief?

Answer:


Actually, I was interpreting for a Japanese patient in the U.S., and the doctor told her that certain cells might be pre-cancerous. I translated this directly, and she was very shaken up by it. After the appointment, she said that in Japan a doctor would be more likely to run the tests first; then, if you had cancer, he would tell your family, and would only tell you if it got really bad. She said no doctor would come out and say, 'We need to test because this could be cancer.' That leads me to believe they would not tell you outright if you had cancer, either. As for other countries with the same practice, I don't know. Definitely not the U.S.
