(no subject)

Date: 2012-01-09 01:28 am (UTC)
I really have lost my trust in our Western medicine, especially when it comes to mental health problems, because it really seems they treat the symptoms but do not pay enough attention to the cause.

My thoughts exactly.

I just don't understand how it helps anyone to make people even more distant from their true emotions with medicine, instead of offering tools to deal with the hardships of life.

Perhaps it helps those in power, who are very much invested (quite literally) in maintaining the status quo. A society preoccupied with day-to-day hardships and distanced from deeper reflection is easier to control. Sounds creepy, I know - but the longer I live in the US, the more I am convinced it must be true...