OpenAI’s text-generating system GPT-3 is now spewing out 4.5 billion words a day
James Vincent reports in The Verge:
The best-known AI text-generator is OpenAI’s GPT-3, which the company recently announced is now being used in more than 300 different apps, by “tens of thousands” of developers, and producing 4.5 billion words per day. That’s a lot of robot verbiage. This may be an arbitrary milestone for OpenAI to celebrate, but it’s also a useful indicator of the growing scale, impact, and commercial potential of AI text generation.
The obvious industry target for auto-text is journalism. It is already being used there for sports reporting, and could probably be used for local government reporting too. It will also encroach on news-adjacent areas (think financial news or business-book digest services like Blinkist), as well as on education and entertainment.
I’m an auditor, and I think that, eventually, auto-text is going to destroy about 50% of our billable hours. Auditors spend a ton of time writing. We document our procedures. We record our work. We summarize our findings. We write reports for our clients. Making sense of all that work, both for our own understanding and for our clients’, in order to write it down takes a lot of time. Moreover, the simple act of typing it out and, especially, hyperlinking everything together to support our conclusions requires a lot of manual work. Some of this work is communicative, but a lot of it (like the hyperlinking) is mechanical. Consequently, much of it is ripe for automation.
I think, though, that transforming this laborious and time-consuming writing process into something auto-generated by a bot would produce a lot more information but a lot less knowledge than we had before. You gain far more understanding of how a company, business process, or control works by writing about it than by reading about it. Why? Because writing is thinking. To write well is mentally strenuous. It requires you to think about how you would communicate an idea to someone else, notably someone with a different perspective and different knowledge than you have. At the end of the writing process, you should understand your subject backwards and forwards and from all sides; you could probably describe it in a number of different ways; and you will likely remember the gist of it, and specific nuances about it, for far longer than you would if you had just read about it in a report.
The world won’t end, and few will shed tears, when AI writes audit reports and work papers. Audits cost money, and businesses mostly don’t think they benefit from them. If it makes the auditors less knowledgeable, however, side effects will develop, and those will be long-lasting. Knowledgeable auditors help make good control environments possible, which leads to more stable and more solvent companies. Decreasing the understanding of auditors could make systems and processes more unstable and less trustworthy over time. Additionally, knowledgeable auditors often move to the business side and go on to manage well-controlled organizations. While the auditor-to-manager career path will likely continue, such moves would not result in the same level of management quality if the auditors don’t have to think as deeply about what they are auditing as they did before the AI text-bots took over. Neither of these side effects bodes particularly well for the people the auditors and the audited companies (and their fancy AI) serve.