
The Chron’s robo-journalism

Bots are now providing simple news stories for the local paper. Nothing to worry about here.

It looked just like so many other short news briefs in the San Francisco Chronicle: straightforward, factual, a little boring. A modest earthquake had struck near Soledad:

The United States Geological Survey detected the quake at 7:54 a.m. with an epicenter 15.1 miles northeast of Soledad. With a magnitude of 3.7 and depth of 6.22 miles, this quake could be felt near the epicenter but damage to structures is unlikely.

Over the last seven days, there have been four other earthquakes above magnitude 3.0 within 100 miles of this area.
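The brief reads like a template filled in from a few fields of the US Geological Survey's public earthquake feed, which the agency publishes as GeoJSON. Here is a minimal sketch of how a bot like this could assemble a draft; the feed URL is real, but the template and all the code are my own guesses, not anything the Chron has shared:

```python
# Sketch of a quake-brief bot: pull the public USGS GeoJSON feed and fill a
# sentence template from a few of its fields. Illustrative only -- this is
# not the Chronicle's actual system.
import json
import urllib.request

FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"

def latest_brief() -> str:
    """Draft a one-sentence brief from the most recent event in the feed."""
    with urllib.request.urlopen(FEED) as resp:
        feed = json.load(resp)
    # Pick the newest event rather than trusting the feed's ordering.
    quake = max(feed["features"], key=lambda f: f["properties"]["time"])
    props = quake["properties"]                                  # magnitude, readable location, etc.
    depth_miles = quake["geometry"]["coordinates"][2] * 0.621371  # USGS reports depth in kilometers
    return (
        f"The United States Geological Survey detected a magnitude {props['mag']} "
        f"earthquake {props['place']}, at a depth of {depth_miles:.2f} miles."
    )

if __name__ == "__main__":
    print(latest_brief())  # a draft for an editor to check, not something to auto-publish
```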

What struck me wasn’t the story but the tagline at the end:

This story was created automatically by an online bot built in The San Francisco Chronicle’s newsroom. 

Yes: The Chron is now running stories written by robots.

This is nothing new, really. Bots have been creating sports and financial reports for some time now. SFist returned a couple years ago as a “software-assisted newsroom.” But it’s new to the Chron’s news section, unless I’ve been missing something.

I asked Audrey Cooper, the Chron’s editor, about it:

Seriously? With all of our agreements about the threats to journalism, what are the Chron’s plans to use bots to keep writing stories? This is scary.

Cooper:

Tim, you are really too much sometimes. A better way to approach this would be to ask why we would do this.

It actually requires more journalists to use this bot. However, it is faster than a human because of how our CMSs (yes, plural) work. It has to do with how the systems ingest stories, how reporters start new files, how we have to tag stories and how the content-creation and web systems talk to each other. Suffice to say, in the middle of the night, to do it properly takes a lot of time — time you don’t have because everyone is checking the website immediately after they feel a rumble. This bot is inside the web cms where web producers work; reporters write in a different cms, which creates a lot of lag time as the systems communicate. So we can have a producer or editor check this story much quicker and post something nearly immediately while a reporter checks on damage. It cannot get published until at least one journalist verifies it. It also gives us a vehicle to attach our quake maps, etc., which tell people how far the quake was felt.

In every other case to date, these stories stay up for only a few minutes before they are updated by reporters (who start new files in their usual CMS so these original bot stories don’t get overwritten, which is why you see some of them in the system).
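The safeguard Cooper describes boils down to a publish gate: the bot drafts, but nothing goes live until at least one journalist signs off. A toy sketch of that rule, with hypothetical names that have nothing to do with the Chron's actual CMS:

```python
# Toy sketch of the verification gate Cooper describes: a bot-drafted story
# sits in a pending state until at least one journalist marks it verified.
# Hypothetical names throughout -- not the Chronicle's CMS.
from dataclasses import dataclass, field

@dataclass
class BotStory:
    body: str
    verified_by: list[str] = field(default_factory=list)

    def verify(self, journalist: str) -> None:
        """Record a journalist's sign-off on the bot's draft."""
        self.verified_by.append(journalist)

    def publish(self) -> str:
        """Refuse to publish until a human has checked the copy."""
        if not self.verified_by:
            raise PermissionError("Bot copy needs at least one journalist's sign-off.")
        return f"PUBLISHED: {self.body}"

story = BotStory("A magnitude 3.7 quake was detected 15.1 miles northeast of Soledad.")
story.verify("overnight web producer")
print(story.publish())
```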

Nothing to worry about here. Just a few robots, working away in the wee hours, of course under proper human supervision.

This system seems to be working so well in baseball; I can’t wait to see how it transforms journalism.