Seamless integration of customer experience, collaboration tools, and relevant data is the foundation for delivering knowledge-based productivity gains. In this post, we show you how to integrate the popular Slack messaging service with AWS generative AI services to build a natural language assistant where business users can ask questions of an unstructured dataset.
To demonstrate, we create a generative AI-enabled Slack assistant with an integration to Amazon Bedrock Knowledge Bases that can expose the combined knowledge of the AWS Well-Architected Framework while implementing safeguards and responsible AI using Amazon Bedrock Guardrails.
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API.
Amazon Bedrock Knowledge Bases provides a fully managed Retrieval Augmented Generation (RAG) workflow, a technique that fetches data from company data sources and enriches the prompt to provide more relevant and accurate responses to natural language queries. This makes Amazon Bedrock Knowledge Bases an attractive option to incorporate advanced generative AI capabilities into products and services without the need for extensive machine learning expertise.
Amazon Bedrock Guardrails enables you to implement safeguards to build and customize safety, privacy, and truthfulness protections for your generative AI applications, aligned with responsible AI policies. Guardrails can help prevent undesirable content, block prompt injections, and remove sensitive information for privacy, protecting your company's brand and reputation.
This content builds on posts such as Deploy a Slack gateway for Amazon Bedrock by adding integrations to Amazon Bedrock Knowledge Bases and Amazon Bedrock Guardrails, and the Bolt for Python library to simplify Slack message acknowledgement and authentication requirements.
Solution overview
The code in the accompanying GitHub repo provided with this solution enables an automated deployment of Amazon Bedrock Knowledge Bases, Amazon Bedrock Guardrails, and the required resources to integrate the Amazon Bedrock Knowledge Bases API with a Slack slash command assistant using the Bolt for Python library.
In this example, we ingest the documentation of the AWS Well-Architected Framework into the knowledge base. Then we use the integration with the Amazon Bedrock Knowledge Bases API to provide a Slack assistant that can answer user questions on AWS architecture best practices. You can substitute the example documentation for your enterprise dataset, such as your corporate, HR, IT, or security policies, or equipment user or maintenance guides.
The following diagram illustrates the high-level solution architecture.
In the following sections, we discuss the key components in more detail.
Slack integration
The Slack integration is provided through the Slack Bolt Library for Python running in the Request Processor AWS Lambda function. The Slack Bolt Library handles authentication and permissions for the Slack application we build, and comes with built-in support for asynchronous request handling. Slack Bolt provides a dedicated user guide for deploying and running the library in a Lambda function.
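As a sketch only (not the repo's actual code), the following shows a typical way Bolt for Python wires a slash command into a Lambda handler. The `slack_bolt` wiring is shown in comments because it depends on your deployment; the acknowledgement helper and the `query_knowledge_base` name are hypothetical.

```python
# Sketch: Bolt for Python slash command handling inside Lambda.
# The slack_bolt calls are illustrative comments; only the small
# acknowledgement helper below is executable as-is.

def ack_message(command_text: str) -> str:
    """Immediate acknowledgement Slack shows while the RAG query runs."""
    return f"Processing Request: {command_text}"

# from slack_bolt import App
# from slack_bolt.adapter.aws_lambda import SlackRequestHandler
#
# # process_before_response=True is required when running Bolt in Lambda,
# # so listeners complete before the HTTP response is returned.
# app = App(process_before_response=True)
#
# def respond_to_command(ack, command):
#     ack(ack_message(command["text"]))        # ack within Slack's 3-second limit
#
# def run_rag_query(respond, command):
#     answer = query_knowledge_base(command["text"])  # hypothetical helper
#     respond(answer)
#
# # Lazy listener pattern: acknowledge fast, then do the slow Bedrock call
# app.command("/ask-aws")(ack=respond_to_command, lazy=[run_rag_query])
#
# def handler(event, context):
#     return SlackRequestHandler(app).handle(event, context)

print(ack_message("Tell me about the AWS Well-Architected Framework"))
```

The lazy listener pattern matters here because Slack requires slash commands to be acknowledged within 3 seconds, while a RAG query can take considerably longer.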
Retrieval Augmented Generation
Amazon Bedrock Knowledge Bases gives FMs contextual information from your private data sources for RAG to deliver more relevant, accurate, and customized responses.
The RAG workflow consists of two key components: data ingestion and text generation.

- Data ingestion workflow – During data ingestion, unstructured data from the data source is separated into chunks. Chunks are short series of text from each source document, separated by a set word count, paragraphs, or a single thought. Chunks are vectorized and stored in a vector database. Amazon Bedrock Knowledge Bases supports a number of vector databases, such as Amazon OpenSearch Serverless, Amazon Aurora, Pinecone, Redis Enterprise Cloud, and MongoDB Atlas. In this example, we use the default option of OpenSearch Serverless.
- Text generation workflow – After the source data is ingested into the vector database, we can perform a semantic search to find chunks of data that are relevant to the user query based on contextualized meaning instead of just literal string matching. To complete the process, both the user query and the relevant data chunks are presented to the selected large language model (LLM) to create a natural language response.
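The retrieval step of the text generation workflow can be illustrated with a toy example. Real systems use learned dense embeddings (such as Amazon Titan Text Embeddings); here a simple bag-of-words vector stands in so the idea stays self-contained.

```python
# Toy illustration of semantic retrieval: rank ingested chunks by
# similarity to the query. A word-count "embedding" stands in for a
# real dense embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in embedding: punctuation-stripped word counts."""
    return Counter(w.strip(".,?!") for w in text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k chunks most similar to the query, best first."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:top_k]

chunks = [
    "The reliability pillar covers workload recovery and fault isolation.",
    "The cost optimization pillar focuses on avoiding unnecessary spend.",
]
best = retrieve("How do I recover a workload after a fault?", chunks)
print(best[0])  # the reliability chunk ranks highest
```

In the full workflow, the retrieved chunks and the user query are then passed together to the LLM, which is the step Amazon Bedrock Knowledge Bases manages for you.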
Amazon Bedrock Knowledge Bases APIs
Amazon Bedrock Knowledge Bases provides a fully managed RAG workflow that is exposed through two main APIs:
- Retrieve – This API retrieves the relevant data chunks using semantic search, which you can then process further in application logic
- RetrieveAndGenerate – This API completes a full RAG text generation workflow to return a natural language response to a human query of the given dataset
The solution in this post calls the RetrieveAndGenerate API to return the natural language response to the Slack Bolt integration library.
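A minimal sketch of a RetrieveAndGenerate call with boto3 follows. The payload builder is plain Python; the knowledge base ID and model ARN are placeholders you would substitute from your own deployment, and the boto3 call itself is shown in comments.

```python
# Sketch: assembling a RetrieveAndGenerate request for the
# bedrock-agent-runtime client. IDs and ARNs are placeholders.

def build_rag_request(query: str, kb_id: str, model_arn: str) -> dict:
    """Assemble the RetrieveAndGenerate request body."""
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

# import boto3
# client = boto3.client("bedrock-agent-runtime")
# response = client.retrieve_and_generate(
#     **build_rag_request(
#         "Tell me about the AWS Well-Architected Framework",
#         kb_id="XXXXXXXXXX",  # placeholder knowledge base ID
#         model_arn=(
#             "arn:aws:bedrock:us-east-1::foundation-model/"
#             "anthropic.claude-3-sonnet-20240229-v1:0"
#         ),
#     )
# )
# answer = response["output"]["text"]  # the natural language response
```

The `output.text` field of the response is what the Lambda function would hand back to the Slack Bolt library to post into the channel.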
Amazon Bedrock Guardrails
Amazon Bedrock Guardrails provides additional customizable safeguards on top of the built-in protections offered by FMs, delivering safety features that are among the best in the industry.
In this solution, we configure Amazon Bedrock Guardrails with content filters, sensitive information filters, and word filters.
Content filters help detect and filter harmful user inputs and model-generated outputs across six categories: prompt injections, misconduct, insults, hate, violence, and sexually explicit content. In this solution, we use all six content filter categories.
Sensitive information filters detect sensitive information such as personally identifiable information (PII) in a prompt or model responses. To align with your specific case, you can use custom sensitive information filters by defining them with regular expressions (regex).
In this solution, we configure sensitive information filters as follows:
- Email with an action of Anonymize
- Phone with an action of Anonymize
- Name with an action of Anonymize
- Credit_Debit_Card_Number with an action of Block
Word filters are used to block words and phrases in input prompts and model responses. In this solution, we have enabled the AWS-provided profanity filter. To align with your use case, you can create custom word filters.
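The guardrail configuration described above can be sketched as a CreateGuardrail API payload. Field names follow the Bedrock API; the guardrail name, messaging text, and filter strengths are placeholder choices, and the boto3 call is shown in comments.

```python
# Sketch: the guardrail settings from this solution, shaped as a
# CreateGuardrail payload. Name, messages, and strengths are placeholders.

def build_guardrail_config() -> dict:
    categories = ["HATE", "INSULTS", "SEXUAL", "VIOLENCE",
                  "MISCONDUCT", "PROMPT_ATTACK"]
    return {
        "name": "slack-assistant-guardrail",  # placeholder name
        "blockedInputMessaging": "Sorry, I cannot answer that.",
        "blockedOutputsMessaging": "Sorry, I cannot answer that.",
        # All six content filter categories
        "contentPolicyConfig": {
            "filtersConfig": [
                {"type": t,
                 "inputStrength": "HIGH",
                 # prompt-attack filtering applies to inputs only
                 "outputStrength": "NONE" if t == "PROMPT_ATTACK" else "HIGH"}
                for t in categories
            ]
        },
        # Sensitive information filters listed above
        "sensitiveInformationPolicyConfig": {
            "piiEntitiesConfig": [
                {"type": "EMAIL", "action": "ANONYMIZE"},
                {"type": "PHONE", "action": "ANONYMIZE"},
                {"type": "NAME", "action": "ANONYMIZE"},
                {"type": "CREDIT_DEBIT_CARD_NUMBER", "action": "BLOCK"},
            ]
        },
        # AWS managed profanity word filter
        "wordPolicyConfig": {"managedWordListsConfig": [{"type": "PROFANITY"}]},
    }

# import boto3
# bedrock = boto3.client("bedrock")
# guardrail = bedrock.create_guardrail(**build_guardrail_config())
```

In this solution the guardrail is created for you by the AWS CDK stack; the sketch is only to show how the filter choices map onto the API.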
Solution walkthrough
Slack interfaces with a simple REST API, configured with Lambda proxy integration, that in turn interacts with the Amazon Bedrock Knowledge Bases APIs.
The solution is deployed with the following high-level steps:
- Create a new Slack application.
- Enable third-party model access in Amazon Bedrock.
- Deploy the Slack to Amazon Bedrock integration using the AWS Cloud Development Kit (AWS CDK).
- Ingest the AWS Well-Architected Framework documents into the knowledge base.
Prerequisites
To implement this solution, you need the following prerequisites:
- A Slack workspace where you have permissions to create a Slack app (if you don't have a Slack workspace, sign up for a workspace on Slack.com)
- An AWS account
- Access to the following AWS services:
This post assumes a working knowledge of the listed AWS services. Some understanding of vector databases, vectorization, and RAG would be advantageous, but is not necessary.
Create a new Slack application
After you have logged in to your Slack workspace, complete the following steps:
- Navigate to your Slack apps and create a new application.

- Choose From scratch when prompted.

- Provide an application name. For this post, we use the name `aws-war-bot`.
- Choose your workspace and choose Create App.

- To provide permissions for your Slack application, choose OAuth & Permissions in your Slack application navigation pane.

- In the Scopes section, under Bot Token Scopes, add the following permissions:
  - `calls:write`
  - `commands`
  - `incoming-webhook`

- Under OAuth Tokens for Your Workspace, choose Install to [workspace name].
- Choose a channel that the Slack application will be accessed from. You may want to first create a dedicated channel in Slack for this purpose.
- Choose Allow.

- When the Slack application installation is complete, copy the token value generated for Bot User OAuth Token to use in a later step.

- Under Settings in the navigation pane, choose Basic Information.
- In the App Credentials section, copy the value for Signing Secret and save it to use later.

Enable model access in Amazon Bedrock
Complete the following steps to enable model access in Amazon Bedrock:
- On the Amazon Bedrock console, choose Model access in the navigation pane.
- Choose Modify model access or Enable specific models (if this is the first time using Amazon Bedrock in your account).
- Select the models you want to use for the embeddings and RAG query response models. In this post, we use Amazon Titan Text Embeddings V2 as the embeddings model and Anthropic's Claude 3 Sonnet as the RAG query model in the `us-east-1` AWS Region.
- Choose Next.
- Review the model selection and choose Submit.

If you're not using the `us-east-1` Region, the models available to request may differ.
When the access request is complete, you will see the model's status shown as Access granted for the selected models.
Deploy the Slack to Amazon Bedrock integration
In this section, you deploy the companion code for this post to your AWS account, which deploys an API on API Gateway, a Lambda function, and an Amazon Bedrock knowledge base with OpenSearch Serverless as the vector database.
This section requires AWS CDK and TypeScript to be installed in your local integrated development environment (IDE) and for your AWS account to be bootstrapped. If this has not been done, refer to Getting started with the AWS CDK.
- Clone the code from the GitHub repository:
- Open the `amazon-bedrock-knowledgebase-slackbot` directory in your preferred IDE and open the `lib/amazon-bedrock-knowledgebase-slackbot-stack.ts` file.
- Update the variables if needed (depending on model access and Regional support) for the RAG query and embeddings models:
- Save the changes after all updates are complete.
- From the root of your repository, run the command `npm install`.
- Run the command `cdk synth` to perform basic validation of the AWS CDK code. This generates a CloudFormation template from the AWS CDK stack, which can be reviewed in the `cdk.out` directory created in the root of the repository.
- To deploy the application stack, run the following command, replacing the values with the token and the signing secret you created earlier:
The AWS CDK will deploy the stack as a CloudFormation template. You can monitor the progress of the deployment on the AWS CloudFormation console.
Additionally, the AWS CDK will attempt to deploy the application stack to the default account and Region using the default credentials file profile. To change profiles, add the profile flag. For example:
When the deployment is complete, you will see an output similar to the following screenshot, which details the API endpoint that has just been deployed.
- Copy the API endpoint URL for later use.

You can also retrieve this URL on the Outputs tab of the CloudFormation stack AmazonBedrockKnowledgebaseSlackbotStack that was run to deploy this solution.
- Switch back to the Slack API page.
- Under the Slack application you created, choose Slash Commands in the navigation pane, then choose Create New Command.

- Provide the following information (be sure to include the Region and API ID that has been deployed):
  - For Command, enter `/ask-aws`.
  - For Request URL, enter `https://[AWS-URL]/slack/[command]`. For example, `https://ab12cd3efg.execute-api.us-east-1.amazonaws.com/prod/slack/ask-aws`.
  - For Short Description, enter a description (for example, `AWS WAR Bot`).

- Choose Save.
- Reinstall the Slack application to your workspace in the Install App section by choosing Reinstall next to the workspace name.

- Choose the channel where the Slack app will be deployed and choose Allow.

In the Slack channel, you will see a message similar to the one in the following screenshot, indicating that an integration with the channel has been added.
Populate the Amazon Bedrock knowledge base
Complete the following steps to populate the Amazon Bedrock knowledge base with the combined knowledge of the AWS Well-Architected Framework:
- Download the following AWS Well-Architected Framework documents:
You can also include any Well-Architected Lenses that are relevant to your organization by downloading them from AWS Whitepapers and Guides.
- On the Amazon Bedrock console, choose Knowledge bases in the navigation pane.
- Choose the knowledge base you deployed (`slack-bedrock-kb`).
- In the Data source section, under Source link, choose the S3 bucket link that is displayed.
This will open the S3 bucket that the Amazon Bedrock knowledge base is using as the data source.

- In the S3 bucket, choose Upload, then Upload files, and select all of the downloaded AWS Well-Architected documents from the previous step.
- When the documents have finished uploading, switch back to the Knowledge bases page on the Amazon Bedrock console.
- Select the data source name and choose Sync.
This will sync the documents from the S3 bucket to the OpenSearch Serverless vector database. The process can take over 10 minutes.
When the sync is complete, the data source will show a Status of Available.
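If you prefer to script the sync instead of using the console's Sync button, the ingestion job can be started and polled with boto3. The IDs are placeholders from your deployment; the boto3 calls are shown in comments and only the small status helper is executable as-is.

```python
# Sketch: starting and polling a knowledge base ingestion job with the
# bedrock-agent client. knowledgeBaseId and dataSourceId are placeholders.

def sync_finished(job_status: str) -> bool:
    """Ingestion job statuses that mean polling can stop."""
    return job_status in ("COMPLETE", "FAILED")

# import time
# import boto3
# agent = boto3.client("bedrock-agent")
# job = agent.start_ingestion_job(
#     knowledgeBaseId="XXXXXXXXXX",  # placeholder
#     dataSourceId="YYYYYYYYYY",     # placeholder
# )["ingestionJob"]
# while not sync_finished(job["status"]):
#     time.sleep(30)                 # the sync can take over 10 minutes
#     job = agent.get_ingestion_job(
#         knowledgeBaseId=job["knowledgeBaseId"],
#         dataSourceId=job["dataSourceId"],
#         ingestionJobId=job["ingestionJobId"],
#     )["ingestionJob"]
```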

Test the Slack application integration with Amazon Bedrock
Complete the following steps to test the integration:
- Open the Slack channel selected in the previous steps and enter `/ask-aws`.
The Slack application will be displayed.
- Choose the Slack application and enter your prompt. For this test, we use the prompt "Tell me about the AWS Well-Architected Framework."
The Slack application will respond with Processing Request and a copy of the entered prompt. The application will then provide a response to the prompt.

- To test that the guardrails are working as required, write a prompt that will invoke a guardrail intervention.
When an intervention occurs, you will receive the following predefined message as your response.

Clean up
Complete the following steps to clean up your resources:
- From your terminal, run the following command, replacing the values with the token and the signing secret created earlier:
- When prompted, enter y to confirm the deletion of the deployed stack.

Conclusion
In this post, we implemented a solution that integrates an Amazon Bedrock knowledge base with a Slack chat channel to let business users ask natural language questions of an unstructured dataset from a familiar interface. You can use this solution for multiple use cases by configuring it with different Slack applications and populating the knowledge base with the relevant dataset.
To get started, clone the GitHub repo and enhance your customers' interactions with Amazon Bedrock. For more information about Amazon Bedrock, see Getting started with Amazon Bedrock.
About the Authors

Barry Conway is an Enterprise Solutions Architect at AWS with 20 years of experience in the technology industry, bridging the gap between business and technology. Barry has helped banking, manufacturing, logistics, and retail organizations realize their business goals.
Dean Colcott is an AWS Senior GenAI/ML Specialist Solutions Architect and SME for Amazon Bedrock. He has areas of depth in integrating generative AI outcomes into enterprise applications, full stack development, video analytics and computer vision, and enterprise data platforms.

