Natural language generation (NLG) is the use of artificial intelligence (AI) programming to produce written or spoken narrative from a dataset. NLG is related to computational linguistics, natural language processing (NLP) and natural language understanding (NLU), the areas of AI concerned with human-to-machine and machine-to-human interaction.
NLG research often focuses on building computer programs that put data points in context. Sophisticated NLG software can mine large quantities of numerical data, identify patterns and express that information in a way that is easy for humans to understand. The speed of NLG software makes it especially useful for producing news and other time-sensitive stories on the internet. At its best, NLG output can be published verbatim as web content.
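In its simplest form, this kind of data-to-text generation can be done with templates: the program computes a few summary statistics from the raw numbers, then slots them into prewritten sentence patterns. The sketch below illustrates the idea with a hypothetical quarterly-revenue dataset; the function and field names are illustrative, not taken from any particular NLG product.

```python
# Minimal sketch of template-based NLG: summarize numeric data,
# then render the summary as a human-readable sentence.
# The dataset and all names here are hypothetical examples.

def describe_sales(quarters):
    """Turn a list of (quarter, revenue) pairs into a short narrative."""
    latest_q, latest_rev = quarters[-1]
    # "Identify a pattern": compare the latest quarter to the prior average.
    prev_avg = sum(rev for _, rev in quarters[:-1]) / (len(quarters) - 1)
    change = (latest_rev - prev_avg) / prev_avg * 100
    direction = "up" if change >= 0 else "down"
    # "Share it in a way humans understand": fill a sentence template.
    return (
        f"Revenue in {latest_q} was ${latest_rev:,.0f}, "
        f"{direction} {abs(change):.1f}% versus the average of the "
        f"preceding {len(quarters) - 1} quarters."
    )

data = [("Q1", 120_000), ("Q2", 135_000), ("Q3", 128_000), ("Q4", 150_000)]
print(describe_sales(data))
# → Revenue in Q4 was $150,000, up 17.5% versus the average of the preceding 3 quarters.
```

Production NLG systems layer far more machinery on top of this (content selection, sentence planning, grammatical realization), but the pipeline is the same: numbers in, statistics and patterns in the middle, fluent text out.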
The Associated Press and other media outlets have used NLG robojournalism programs for many years to give data sets context. For example, when an earthquake struck Los Angeles in 2014, a content generation algorithm created by programmer/journalist Ken Schwencke posted the story to the L.A. Times within eight minutes of the tremor, complete with a map pinpointing the epicenter.
While it is relatively easy for humans to recognize machine-generated speech during an interaction with a mechanical or digital device, it is often difficult to tell when written text has been generated by a computer. As of this writing, it is up to the publisher to decide whether machine-generated content is labeled as such.