Why do people lie to make you feel better?
When you're a child, they tell you that you can be anything you want to be.
When you're a teenager, they tell you that you will find The One for you some day, you just have to be patient.
And then when you're an adult, along come those motivational types who say that if you just put in the effort, you will be rewarded. Or "Do your best, and God will do the rest."
Why? What's the point of making people feel better about themselves in the short term, when we all know that these are SWEEPING GENERALI-BLOODY-ZATIONS, and that these things are only true for the lucky ones? Why keep the truth from them, so that they end up feeling like total losers when they realize they've been living a lie?
I'm probably not even looking for true answers to these questions, I just needed to vent. Sorry.
Feel free to ignore the above. I have in the meantime had a nice long talk with a good friend, and I feel much better now. Or at least ready to go back into denial.
What a day this has been! :rolleyes:
Reality is merely an illusion, albeit a very persistent one.
i TOTALLY understand you, klod
and i have absolutely no words of wisdom, so i won't even try
Well, thanks for sympathizing, Zoomanda. And good luck! :)