When talking to ordinary strangers (in Ubers, around town, etc.), I've been asked what I do for a living, and multiple times people have gotten hostile when I tell them I work for a firm that develops real estate. Maybe they'll tell a story about how the new apartments/hotels made traffic in their old neighborhood unbearable and forced them to move, or they'll voice their anger that everything getting built is luxury apartments nobody can afford. And of course there are the people who are outraged by gentrification. I hear their points, but what we do helps communities too -- we're creating jobs, reducing crime in neighborhoods, and a lot of the time we're including affordable units in our projects (though apparently not enough to create goodwill).
Anyone else have these kinds of experiences? Do people see us as the villain? Maybe it's just because I'm in a liberal city known for its NIMBYism. I'm a little alarmed by the hostility and never expected to be seen as the bad guy by so many people.