I'm starting to think that a lot of banks are relying on ChatGPT for their write-ups.
Especially after seeing the differences in their writing over the last few months, I have reason to believe a lot of banks are using it for their write-ups.
For example, I'm pretty good friends with a buyside manager at Goldman. I've been reading his write-ups for years, and I could tell when he wrote something vs. when he didn't. Over the last few weeks or so, his writing has improved dramatically...which makes me pretty suspicious that he's using ChatGPT to re-word his work.
Do you think there's anything ethically wrong here? It's not like it's plagiarism...but it's a little lazy.
Copy and paste it into one of the services that checks for AI-generated text. They're fairly accurate, and GPT-3 output is easy to detect; GPT-4 is too.
If the machine can do it better, use the machine. You have to adapt with the times. Obviously, if he or she isn't adding any nuanced value or interesting, thought-provoking ideas, then yeah, it's a problem and they shouldn't be using it for the research.
As long as the content adds value, I don't really care if it's phrased by a bot.
Who the fuck cares? How is this scenario different from getting a colleague to check it over?