Does the world just feel weird and wrong/different to anyone else recently?
I'm not sure how to really explain this, but everything has felt weird and kind of meaningless for the last year or so. Society feels like a sham, girls don't feel trustworthy at all, consistent social culture doesn't exist, everyone is just on their own, etc. It feels like Lord of the Flies: I constantly feel like everyone is out to get me and nobody really cares. What are we even doing? What does money actually get you? Is there anything to look forward to? Are societal trends fucking me over, or am I just a pussy? I've heard this feeling echoed by other kids in my demographic (guys 20-25), but idk if that's selection bias or what.