I'm currently in my last semester of college. I have a job lined up starting this summer and am psyched about starting work. I've worked hard these past few years, and I'm ready to get out there, make practical use of my knowledge, and make good money doing it.
However, I feel as though I'm just a naive college kid. I talk to people who are currently working, and 9 out of 10 of them seem to be miserable out there in the working world. It's actually quite depressing to hear things like "College is the best years of your life and everything after that is all downhill."
So I made this thread to hear your views on what the real world is like. How did life change after college? Are you miserable or happy? Any big lifestyle changes?
Also, I'd like to hear your opinions on how I should spend my last semester of "freedom." Thanks.