For years now, one innovation after another in technical communication and content strategy seems aimed at preparing the ground for chatbots. Chatbots are “sexy”, and that sells.
Yet chatbots are not the future of technical communication.
Chatbots are “stupid”
The idea is not stupid; the artificial intelligences themselves are. As W. Knight writes, today’s AI performs barely better than chance at resolving the ambiguity of a sentence like: “City councillors refused a permit to the protesters because they feared violence.” (Who feared violence?) Humans resolve this so easily that we rarely notice the ambiguity exists. AIs can’t.
An AI that plays Go or chess has no idea what it is doing. It is only analysing a set of statistical data.
They are far from being able to help you solve your technical problem.
In other words, chatbots can be very effective at ordering something on Amazon (or Google, etc.) or telling you to take a coat because rain is forecast, but they are far from ready to help you solve your technical problem.
But even if they were much smarter than they are, chatbots still wouldn’t be the future of technical communication.
Chatbots are a command line interface
You ask them something. They answer you (often stupidly; who hasn’t tried this with their smartphone?). That is exactly what a command line interface does (that familiar black screen with white text). In fact, we’ve been using chatbots for a long time; they just didn’t talk.
Saying platitudes is not the pinnacle of intelligence
ELIZA, a chatbot created in the 1960s at the MIT Artificial Intelligence Laboratory, could play the role of a rudimentary psychotherapist. Saying comforting platitudes to heartbroken people is not the pinnacle of intelligence; any sympathetic schoolboy can do it.
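ELIZA’s trick was nothing more than keyword matching and canned “reflective” replies. A minimal sketch of that style of rule, in Python, might look like this (the rules here are invented for illustration; they are not Weizenbaum’s original script):

```python
import re

# Illustrative ELIZA-style rules: each pairs a regex with a canned reply
# that echoes back part of what the user said.
RULES = [
    (re.compile(r"\bi am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bi feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\bmy (.*)", re.I), "Why does your {0} matter to you?"),
]
DEFAULT = "Please go on."

def eliza_reply(utterance: str) -> str:
    """Return the first matching canned reply, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            # Strip trailing punctuation from the echoed fragment.
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT
```

Given “I am heartbroken.”, this returns “Why do you say you are heartbroken?” — comforting-sounding, yet there is no understanding anywhere in it, only pattern substitution.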
Solving complex technical problems is much harder, because the domain is inevitably far more varied. Putting a voice interface on the AI will not change that.
Command line interfaces, whether visual or verbal, always have the same problem: they do not allow for discovery or exploration.