
We continue to receive more requests from media companies seeking AI training, guidance and workflow implementation for newsrooms and sales departments.
As we start working with newsrooms, reactions to using AI are often mixed. But as soon as teams understand the guardrails we suggest, the conversation shifts to the possibilities.
I thought it would be helpful to answer a few of the common questions I'm getting and share what we suggest editorial teams put in place to ensure their standards are upheld in whatever AI tools and workflows they implement.
1. What will we — and won’t we — use AI for?
Most newsrooms we work with are already using AI in some way, but how it's used varies widely from person to person, and in many cases there are no clear expectations from the organization guiding what people should be doing.
Don't make this overly complicated. Here's a list we've created of things you can and can't use AI for, which we use as a starting point for many conversations with editorial teams:
What we can use AI for
- Transcribing interviews and video-recorded meetings to support reporting
- Helping with grammar, AP style and editing suggestions
- Uploading data or screenshots to help generate visualizations or identify potential story breakouts
- Creating SEO elements (headlines, URLs)
- Drafting social copy and newsletter summaries
- Suggesting story structure or outlines
- Adapting a reported story for a different format or audience (list, newsletter)
What we can’t use AI for
- Original reporting or fact-gathering
- Generating facts, quotes, data, or attribution without verification
- Producing or contributing to breaking news coverage
- Making editorial decisions (what to cover)
- Publishing content without human reporting
- Creating or altering images in ways that could mislead readers
A general set of rules like these, with room for flexibility, is a good place to start and gets everyone on the same page.
2. How do we prevent AI errors?
While AI has gotten better, it still makes mistakes, and this is especially true for news content.
This is why it's critical to spell out what you will and won't use it for. And for the things that do make the list, make sure you have clear checks in place.
This checklist is a good, basic starting point; require that these items be verified with every use of AI:
- Names, titles, and organizations
- Dates, numbers, and locations
- Direct quotes and how facts are attributed
3. How do we stop AI from changing our voice?
This is one of the most common complaints I hear: AI doesn’t sound like me or our brand.
First, be clear with the AI about who you are and what you do. A prompt like this can help:
We write clean, straightforward stories in AP style. We avoid marketing language and our reporting is grounded in facts, balance, and depth. We’re a publisher people trust, and our goal is to be accurate, useful, and clear for readers.
This kind of guidance can then accompany whatever task you're asking for, giving the AI useful background.
There are other things you can do to protect your voice and standards, including:
- Share some of your published work so the AI has examples of the tone you're shooting for.
- Use personalization settings to tell the AI how your newsroom writes, noting your tone and focus.
- If you create custom GPTs, upload your style guide, editing rules and other standards to help the machine know what to follow.
4. Do we need to disclose AI use and how?
Transparency is critical in everything we do, and especially with AI, given the questions readers have and the trust at stake. Here's a short guide on how to think about this:
When disclosure is needed
- AI-generated or heavily AI-edited illustrations or graphics
- AI-created images used in place of original photography
- AI-assisted videos or animation
- AI-generated audio or voice elements
- Data visualizations or charts where AI was leaned on heavily
Here’s a simple editor’s note you could use: This story was produced by our newsroom with the assistance of AI tools. All reporting, writing, and final editorial decisions were handled by our editorial staff.
When disclosure is not needed
- Transcription of interviews or meetings
- Grammar, spelling, or editing
- SEO headlines, URLs, or metadata
- Social media copy
- Idea generation or background work
We can help with your AI strategy
We would love to help you implement AI best practices in your newsroom. Reach out to me today at david@davidarkinconsulting.com so we can talk about how.