Yes to human oversight. No to story generation.
How The Brown & White at Lehigh University came up with an AI policy for its newsroom
This article was written by a human, really.
But if The Brown & White, the newspaper at Lehigh University, uses ChatGPT for any reason within its stories, the paper has promised that such usage “will be expressly stated in the story or caption.”
After a semester-long internal discussion, the Pennsylvania-based student news outlet released its policy on artificial intelligence tools, emphasizing human oversight and transparency.
The paper’s internal written policy prohibits AI use in journalistic processes including sourcing, fact-checking, content generation, interview question generation, and photo and art alteration.
The tools may be used for search engine optimization, revising already-published headlines, grammar checking, and interview or meeting transcription, according to the policy, and only with explicit permission from an editor.
At a College Media Association panel in New York City in March, the paper’s editors-in-chief and advisor offered an inside look at how their team came up with the policy.
The conversation was among about 35 top editors, former Editor-in-Chief Sam Barney-Gibbs said.
It went like this: The editors were looking at everything from text editing to multimedia content while considering what’s OK to do, what shouldn’t be allowed and what aspects fall into the gray area.
They discussed different tools like Google, ChatGPT and Otter.ai, and they looked into existing guidelines from professional newsrooms.
From that discussion, Barney-Gibbs said he learned the editorial board was “a lot more polarizing” than he thought.
Some people were open to the idea of using AI. Some were already using it. Some were a hard no. He emphasized the need to at least open up that dialogue.
While the conversation prompted people to think about things philosophically, he said it was also important to ensure it led to a concrete written policy the audience could understand.
“Let people know that this is productive,” he said. “It’s better for our paper, for the content we produce and for transparency with our community.”
After that debate was over in the fall, current Editor-in-Chief Ella Holland took charge.
She said she worked to pull all the ideas together into two documents, one for internal use and one for external use.
One big emphasis in the internal document was human oversight.
The policy states that “neither editors nor reporters are permitted to use generative AI technology, such as ChatGPT, in the story process, which includes reporting, writing, and editing.”
It also says readers can contact editors with any concerns.
With human oversight, Holland said, the team would understand the general workings of those AI tools and be able to explain them to readers when asked.
The newspaper will also produce a survey each semester to “gauge interests and concerns” of the audience, according to the policy.
Matt Veto, a teaching assistant professor of journalism, said the whole process took a little more than a semester, from brainstorming to analyzing different ideas and writing them down on paper.
“Developing a policy was something that we wanted to do thoughtfully and with some kind of research or with some kind of methodology behind that process,” Veto said. “I would advocate, as an advisor, just be done truly at the student level collaboratively.”
Bigger picture
Study shows impact: Nearly three-quarters of student and early-career journalists believe AI will significantly impact journalism, according to a September 2023 report from Greentarget. Around 50% of them consider it a threat to the industry’s workforce.
Student adoption: Student journalists are already incorporating AI into their work, according to the same study. The majority of them use translation tools such as Google Translate, DeepL and Mirai, as well as writing, research and copy-editing tools.
University of Portland: Oregon-based The Beacon asserted its commitment to human-created content, refraining from publishing stories written by AI due to concerns about bias.
The Archer School for Girls: The Oracle, the Los Angeles high school’s paper, has an official policy against using AI chatbots, although its staff uses Otter.ai and Grammarly for specific tasks.
Carlmont High School: The Scot Scoop in California adheres to the Society of Professional Journalists’ ethics guidelines, barring AI from reporting and story writing and requiring transparency about any usage.
Northern Illinois University: The Northern Star will utilize AI in covering campus news. Just kidding; that was an April Fools’ joke. In reality, the paper uses AI for brainstorming, grammar correction and interview transcriptions.
Developing best practices: Northwestern University received a $1 million grant to create AI guidelines for news outlets.
Your newsroom needs an AI policy
Poynter has published a guidebook to assist newsrooms in crafting AI ethics policies.
Klaudia Jaźwińska, a researcher at the Tow Center for Digital Journalism at Columbia University and a former editor-in-chief of The Brown & White, offered some suggestions for creating an effective AI policy:
Tailor responsibilities within the policy to different newsroom roles to reflect their interactions with AI tools.
View the policy as a guiding framework rather than restrictive rules.
Draw a clear line between hard-no practices and experimental areas.
Prioritize transparency by openly communicating with the audience about AI decisions.
Create a two-way communication channel with the audience for feedback.
Correction: Barney-Gibbs’ last name was spelled incorrectly in the earlier version.
Story Spotlight:
Generative AI in journalism: The evolution of newswork and ethics in a generative information ecosystem (The Associated Press): The AP presents original research on the state of generative AI usage and attitudes, based on a survey of journalists around the world. 🤖 TL;DR: Poynter has a summary: Despite ethical concerns, nearly 70% of newsroom staffers recruited for an Associated Press survey say they’re using generative AI to create content.
Featured Opportunities:
Nominate your teacher/professor for SPJ’s Distinguished Teaching in Journalism Award before April 17.
Democracy Now is looking for a digital fellow. Apply before April 22.
Students who are members of the Asian American Journalists Association can apply for broadcast news internship grants before April 22.
The Press Club Institute is hosting a webinar on how journalists can champion news literacy and empower their communities April 24.
The Nation Student Journalism Conference hosts its in-person event at The New School (where I go!) May 31. It’s free. Apply before May 1.
Apply to the six-month Resolve Philly internship program before May 6.
Business Insider has multiple remote openings for fellowships and internships for the summer.
The Pulitzer Center on Crisis Reporting is seeking a remote, full-time, one-year-long editorial intern.
I want to hear from you: What type of stories do you want to see in The Nutgraf? Is your student publication doing something cool that you’d like to share? Reach me at nutgrafnews@substack.com. I will respond!