By Samuel Woolley and Phil Howard
In today’s data-saturated world, journalists often struggle to report on many, if not most, of the potential stories that cross their desks. The average reporter is lucky to produce more than a few standard articles a week, while in-depth investigative pieces often take months. The 24-hour news cycle, deadlines, an ambient media system, and overwhelming troves of digital information present challenges for today’s journalists and entrench existing practices. How can those who report the news make massive caches of information, especially those with political or civic importance, available to the public? How can data-driven technologies be used to this end? And what potential is there to create new, creative digital tools for journalists who would otherwise not have access to such technology?
Enter the Bot
Many journalists have turned to bots, computer programs designed to perform automated tasks, to gather, parse, and disseminate important information that might otherwise go unpublished. Bots save a human author time and energy because they parse and organize information at speeds no person can match. Early bots were built by computer scientists to perform simple network maintenance, but they were quickly extended beyond such tasks to social interactions, at least those that could be engineered.
Social bots are particularly prevalent on Twitter. These accounts are run by software and designed to look and act like real human users online. While human users reach social-media sites through their front-end websites, bots connect to those sites directly, code to code, through each site’s application programming interface (API), which allows them to post and parse information in real time. Numerous news outlets, from the New York Times to the Guardian, have covered the rising and evolving use of social bots, attempting to explain how these socially oriented surrogates work in specific contexts, from the world of online dating to that of real-time ad sharing.
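To make that code-to-code connection concrete, here is a minimal sketch of a Twitter bot in Python using the third-party tweepy library (method names as of tweepy 3.x). The credentials, status text, and search query are placeholders, not a working account:

```python
# A minimal sketch of a social bot posting and reading via Twitter's API,
# using the tweepy library. All credential values below are placeholders;
# real keys come from Twitter's developer dashboard.
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Post a status update, just as a human user would from the website.
api.update_status("Automated update: committee vote scheduled for 2 p.m.")

# Parse recent tweets on a topic in real time (query is illustrative).
for tweet in api.search(q="gun control legislation", count=10):
    print(tweet.user.screen_name, tweet.text)
```

The same handful of calls is the skeleton of most news bots: authenticate once, then post or pull tweets on a schedule.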
Bots as Journalistic Scaffolding
Reporters now use different varieties of bots to mine both the Internet and massive data sets, finding nuance in otherwise chaotic information. For instance, bots were used to sift through the massive document sets released by Wikileaks and Edward Snowden. In another case, Nieman Lab offered open-source code for Fuego, a “heat-seeking” bot that keeps journalists up to date on important topics by sifting through Twitter conversations.
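As a rough illustration of how a heat-seeking filter of this kind might work, the sketch below counts how many distinct accounts have shared each link in a batch of tweets and surfaces the hottest ones. It is a toy standing in for the general technique, not Nieman Lab’s actual Fuego code:

```python
# A toy "heat-seeking" filter: given tweets from a curated set of accounts,
# surface the links the most people are sharing right now.
from collections import Counter
import re

URL_PATTERN = re.compile(r"https?://\S+")

def hottest_links(tweets, top_n=5):
    """Count distinct sharers per link; return the most-shared links."""
    sharers = {}  # url -> set of usernames who shared it
    for tweet in tweets:
        for url in URL_PATTERN.findall(tweet["text"]):
            sharers.setdefault(url, set()).add(tweet["user"])
    heat = Counter({url: len(users) for url, users in sharers.items()})
    return heat.most_common(top_n)

# Hand-made tweets standing in for a live Twitter stream.
sample = [
    {"user": "reporter_a", "text": "Must read: https://example.com/story"},
    {"user": "editor_b", "text": "This too https://example.com/story"},
    {"user": "desk_c", "text": "Budget doc https://example.com/budget"},
]
print(hottest_links(sample))  # [('https://example.com/story', 2), ...]
```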
Social bots, on the other hand, can work as proxy accounts, a sort of journalistic scaffolding, to tweet information on particular events. The last two years have seen the emergence of increasingly creative “news” social bots. Reporters, activists, and civil-society groups have built public-facing social bots, programs that mimic human user accounts by sending messages and interacting with other accounts on sites like Reddit, Twitter, and Tumblr, to engender conversation on pressing social issues, reveal the political stances or actions of elected officials, and call attention to protests. Because bots function automatically at a computationally enhanced rate, reading data and reporting information in fractions of a second, they are particularly useful for journalists facing the very real demands of deadlines and traditional story generation.
Bots have even been used to generate stories and to report on pending and unfolding natural disasters and public health emergencies. Journalists and newsroom data teams have set up social bots to tweet updates on gun control legislation, congressional malpractice, and NSA surveillance. Employees of Al Jazeera and the Los Angeles Times have designed social bots that both tweet the news and write simple articles. These programs can cover mundane stories, such as upcoming community events or sports scores, leaving reporters time to focus on more in-depth or investigative work.
The Future of Automation and Journalism
Labor scholars and popular media outlets have suggested that robots will put humans out of jobs in the near future. Programs with social-bot front ends, like the LA Times’ Quake Bot, can actually write articles, so it is unsurprising that fear of robot usurpation is present in journalism too. Bots cannot, however, capture the nuance of an investigative report. At best, programs like Quake Bot handle the simplest article generation: recording the when and where of an event.
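Underneath, that simplest kind of article generation is usually just a template filled in with structured data. A minimal sketch of the technique, with an illustrative template and field names rather than any newsroom’s actual code:

```python
# Template-based story generation: fill the "when and where" of a seismic
# event into boilerplate prose. Template and field names are illustrative.
TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {distance} miles from "
    "{place} at {time}, according to the U.S. Geological Survey."
)

def write_quake_story(event):
    """Render a one-sentence story from a dict of event fields."""
    return TEMPLATE.format(**event)

print(write_quake_story({
    "magnitude": 4.2,
    "distance": 6,
    "place": "Westwood, California",
    "time": "7:42 a.m. Tuesday",
}))
```

Everything beyond the template, including judgment about whether the event matters at all, still falls to a human editor.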
Beneficial bots give grounds for a degree of technological optimism. Bots can supplement the tools already available to journalists, acting as a reporting prosthesis. They can be programmed to search for particular evidence or information, streamlining data analysis. Social bots can automatically tweet news to the public, updating citizens on important legislation, events, and concerns. They can write basic articles and leave space for journalists to investigate, interview, and comment. In short, journalists should not fear bots but rather learn to harness them to their benefit.
Samuel Woolley is the project manager of the Project on Computational Propaganda, a research endeavor based at the University of Washington and the Oxford Internet Institute. Phil Howard is the Principal Investigator of this project and is a professor at the Oxford Internet Institute.