We are excited to announce the general availability of QnAIntent on Amazon Lex. This allows developers to securely connect foundation models (FMs) to enterprise data for Retrieval Augmented Generation (RAG). Introduced in preview at re:Invent in November 2023, QnAIntent leverages enterprise data and foundation models on Amazon Bedrock to generate relevant, accurate, and contextual responses. You can use QnAIntent with new or existing Lex bots to automate frequently asked questions (FAQs) through text and voice channels, such as Amazon Connect.
QnAIntent helps bot developers automate responses to customer questions and avoid unnecessary transfers to human agents. Developers no longer need to create variations of intents, sample utterances, slots, and prompts to anticipate and handle a wide range of FAQs. Simply connect the new QnAIntent to your company's knowledge sources, and the bot will use your authorized content to instantly answer questions such as "What documents do I need to submit for an accident claim?" QnAIntent currently supports Amazon Bedrock, Amazon OpenSearch, and Amazon Kendra knowledge bases. Developers can also choose between a summary of generated responses or an exact response match, giving them control over the responses their bot provides. QnAIntent is generally available in English in the US East (N. Virginia) and US West (Oregon) Regions. For more information, see the Amazon Lex documentation page.
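As a rough sketch of what connecting QnAIntent to a knowledge source can look like programmatically, the snippet below assembles a Lex V2 `CreateIntent` request that attaches the built-in `AMAZON.QnAIntent` to an Amazon Kendra index. The exact field names, the intent name, and the ARNs are illustrative assumptions, not a verified reproduction of the API; consult the Amazon Lex V2 API reference before use.

```python
def build_qna_intent_request(bot_id, bot_version, locale_id, kendra_index_arn):
    """Assemble a CreateIntent-style request dict that attaches the built-in
    AMAZON.QnAIntent to a Kendra index as its knowledge source.

    NOTE: field names below are assumptions for illustration only.
    """
    return {
        "botId": bot_id,
        "botVersion": bot_version,
        "localeId": locale_id,
        "intentName": "FaqQnAIntent",  # hypothetical intent name
        # The parent intent signature selects the built-in QnAIntent.
        "parentIntentSignature": "AMAZON.QnAIntent",
        "qnAIntentConfiguration": {
            "dataSourceConfiguration": {
                "kendraConfiguration": {
                    "kendraIndex": kendra_index_arn,
                    # False -> return a generated summary;
                    # True -> return the exact matched response.
                    "exactResponse": False,
                }
            },
            # The Bedrock model used to generate the summarized answer
            # (placeholder ARN for illustration).
            "bedrockModelConfiguration": {
                "modelArn": (
                    "arn:aws:bedrock:us-east-1::foundation-model/"
                    "anthropic.claude-v2"
                )
            },
        },
    }


# Example: build the request for a draft bot's English (US) locale.
request = build_qna_intent_request(
    bot_id="MYBOTID123",  # placeholder bot ID
    bot_version="DRAFT",
    locale_id="en_US",
    kendra_index_arn=(
        "arn:aws:kendra:us-east-1:123456789012:index/example-index"
    ),
)
# The dict would then be passed to the Lex V2 models client, e.g.:
#   boto3.client("lexv2-models").create_intent(**request)
```

Switching `exactResponse` to `True` corresponds to the exact-response option described above, returning the matched content directly instead of a model-generated summary.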