Code of Ethics: ChatGPT’s Moral Implications in PR, Part One

Stan O'Neill | April 11, 2023

It seems we can’t escape the ChatGPT fervor of recent months. Yet for all its buzz, the language model raises pressing ethical concerns. When ChatGPT’s creator says he’s “scared” of AI, it’s easy to imagine a dystopian future governed by unfeeling machines. Beyond those broader worries, ChatGPT presents industry-specific conundrums for PR pros. Emotional resonance and empathy may become afterthoughts in the name of enhanced efficiency, but they remain vital in PR, especially for crisis communications.

One university landed in hot water for using a language model to pen an open letter to its community after a nearby campus shooting. Although ChatGPT may one day dream of electric sheep, this is what we call a PR nightmare. The backlash shows that people still want to feel that an organization values its message and that a human cared enough to invest time in crafting it. Unfortunately, automated writing signals the opposite to audiences and undermines PR pros’ ethical responsibility to communicate with empathy and emotional resonance on their clients’ behalf.

Societal backlash

On a societal level, ChatGPT raises ethical questions about the future of employment in many creative and technical industries. Will machine learning enable a UBI utopia where we can work as much or as little as we like without sacrificing our livelihoods? Or will language models and other AI-enabled tools cause job losses that incite economic and cultural collapse?

Some tech leaders are raising red flags. An open letter signed by the likes of Elon Musk cautions against the rapid development of machine learning tools and calls for an extended pause until better human safety parameters are established. Whether this stems from altruism or a desire to secure a slice of the market share pie is up to you, dear reader.

Still, these societal concerns highlight how ill-prepared we may be for the ethical implications of language models and related tools. As we keep these wider implications in mind, we PR pros must remember that there is no ethical precedent to measure ourselves against when using ChatGPT in our own industry. Thus, we must proceed with caution.

A broken record can’t strike a chord

History offers an optimistic perspective despite these concerns. Past industrial revolutions met similar resistance, from the Luddites to the Weavers’ Revolt against machine-made textiles, yet humans have consistently created new jobs in the face of technological progress. For once, it may be a good thing if history repeats itself.

Some critics claim that ChatGPT’s output is uninspired and vapid, with its popularity reflecting society’s plummeting standards for language. One thing is certain amid ChatGPT’s shortcomings: it should augment human writing, not function as a crutch. That said, ChatGPT can be useful for PR pros who need to kill the blank page, increase efficiency and brainstorm more quickly.

The tool still tends to repeat the same phrasing when writing long-form pieces on certain topics, especially those requiring technical knowledge of developments after its 2021 training cutoff. This can be counterproductive for companies that require highly specific messaging for their cutting-edge technologies. For example, suppose five service providers all feed similar blog prompts into ChatGPT about a new protocol for latency-based segment routing. These companies may end up with nearly identical phrasing across all their content, both syntactically and technically.

PR pros have an ethical responsibility to provide their clients with diverse, accurate content that helps them stand out from the crowd, so this shortcoming can undercut our industry’s value. It even leaves clients open to plagiarism accusations. However, PR pros may have bigger concerns than plagiarism when using ChatGPT.

Check back next week for part two of this blog, where we will explore why plagiarism may be the least of your ethical concerns when using ChatGPT for PR content.
