This is spot on from Kane Simms and Kore.ai - #ChatGPT has sent customer expectations sky high, and businesses are struggling to understand what is possible, what is sensible, and what is achievable for them.

This is a game changer for CX, and for some industries - but it is not the technological leap forward some people are claiming. The differences with #ChatGPT are the scale of the LLMs behind it, and the interface it provides for more general use.

It's important not to confuse #NLG, which is what LLMs do, with #NLU, which is what most mass consumer applications need. #NLG has been very successful in areas such as paraphrasing and machine translation: give it some text and it will generate some related text. I think of it as an infinite number of monkeys and a smart filter: the monkeys generate text, and the filter decides which text fits the current use case. There is no understanding involved - and that's the danger.

If you are a business, your customers expect you to understand their queries and their needs. Something else has to understand the input and trigger appropriate output. Otherwise we get the euphemistically named "hallucinations" for which ChatGPT is justly famous, and which OpenAI has rightly said will not go away. "Hallucination" is just a fancy word for error, and these errors are unavoidable in current NLG.

Without an NLU front-end, LLMs are just a way to generate bedtime stories. Bedtime stories are great, if that's what you want, but customers who expect understanding and accurate answers will soon become impatient with them.