I ask, are therapists, psychologists and psychiatrists really necessary?
I mean, I know they are needed in certain types of institutions, but I feel that's where they belong, not let loose on society at large.
Basically, in my day one never heard of a psychologist, let alone had them in schools and the like.
Now they are a dime a dozen. How much good do they do? In my opinion, not much.
They seem to me to create problems, whereas in my day you got a clip around the ear and were told to deal with it. Now you are a victim, with the associated lowering of one's self-esteem.
If you look at it bluntly, the role of a psychologist is to make you dependent and thereby a source of income; it has nothing to do with making you ‘better.’ They prescribe antidepressant drugs that can make you suicidal or turn you into a murderous beast. The difference between murder and suicide is merely who the target is.
Really, the world would be a better place if all these specialists crawled back into their holes.