I have always been fascinated by, drawn to, and otherwise interested in the healing arts. At one time I thought maybe I'd be a doctor; then I had my twins. When I became a single mom I thought maybe I'd go back to school and become a nurse, specifically a surgical nurse. My plans were changed by a large earthquake, but that's another story. I did finish massage therapy school (as well as a BA in Organizational Leadership), so I could still help people feel better, relieve their tension, and help them understand what causes it so they can try to avoid it if possible.
I am absolutely disgruntled, however, with the way scientific medicine simply wants to treat the symptoms and not the cause. I don't understand it. Why just mask something instead of getting to the reason it's happening? Anyone with me on this? I understand there are many, many things that have no explanation yet, but that doesn't stop people from simply handing out prescriptions and band-aids and saying "all better," when clearly we are not.
We live in a society obsessed with quick fixes. The standard American diet is filled with chemicals and non-food ingredients. Our bodies don't know what to do with that stuff; they can't digest it or easily metabolize it, so they "stick" it somewhere, perhaps in the liver or other tissue, and then we wonder why we are unhealthy.
I will climb off my soapbox for now.
I was simply really frustrated when I paid a lot of money out of pocket, only to have someone in such a hurry give my spots a name that means he doesn't know what they are, along with a cream that won't cure me but will just mask the spots.
And that's my 20 cents. (Inflation.)