Sunday, April 19, 2026

Managing cloud costs and understanding resource usage can be a daunting task, especially for organizations with complex AWS deployments. While the AWS Cost and Usage Report (AWS CUR) provides valuable data insights, interpreting and querying the raw data can be difficult.

This post describes a solution that uses generative artificial intelligence (AI) to generate SQL queries from a user's natural language questions. The solution simplifies the process of querying CUR data stored in an Amazon Athena database: it generates the SQL, runs the queries in Athena, and displays the results in a web portal for easy understanding.

The solution uses Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI.

Challenges

The following challenges can prevent organizations from effectively analyzing CUR data, leading to inefficiencies, overspending, and missed cost optimization opportunities. Using generative AI with Amazon Bedrock, we aim to address and simplify these challenges:

  • SQL query complexity – Writing SQL queries to derive insights from CUR data can be complicated, especially for non-technical users or those unfamiliar with CUR data structures (unless you're an experienced database administrator).
  • Data accessibility – To gain insights from the structured data in the database, users need access to the database, which can be a potential threat to overall data security.
  • Ease of use – Traditional methods of analyzing CUR data often lack user-friendly interfaces, making it difficult for non-technical users to tap into the valuable insights hidden in the data.

Solution overview

The solution described here is a web application (chatbot) that lets users ask questions about their AWS costs and usage in natural language. The application generates SQL queries based on user input, runs them against an Athena database containing CUR data, and displays the results in a user-friendly format. The solution combines the power of generative AI, SQL generation, database querying, and an intuitive web interface to provide a seamless experience for analyzing CUR data.

This solution uses the following AWS services:

The following diagram shows the solution architecture:

Figure 1. Solution architecture

The data flow consists of the following steps:

  1. CUR data is stored in Amazon S3.
  2. Athena is configured to access and query the CUR data stored in Amazon S3.
  3. Users interact with the Streamlit web application and submit natural language questions about their AWS costs and usage.
Figure 2. The chatbot dashboard for asking questions

  4. The Streamlit application sends the user input to Amazon Bedrock, and the LangChain application facilitates the overall orchestration.
  5. The LangChain code uses LangChain's BedrockChat class to call the FM, which interacts with Amazon Bedrock to generate SQL queries based on the user's input.
Figure 3. Initialization of the SQL chain

  6. The SQL queries generated by the FM on Amazon Bedrock are run against the Athena database to query the CUR data stored in Amazon S3.
  7. The results of the query are returned to the LangChain application.
Figure 4. Application output log showing generated queries

  8. LangChain sends the SQL query and the query results back to the Streamlit application.
  9. The Streamlit application displays the SQL queries and query results to the user in a formatted and user-friendly manner.
Figure 5. The SQL query and the final output, containing the query results, displayed in the chatbot web app

Prerequisites

To set up this solution, you need the following prerequisites:

Configure the solution

To set up the solution, follow these steps:

  1. Create an Athena database and table to store your CUR data. Make sure the appropriate permissions and settings are in place for Athena to access the CUR data stored in Amazon S3.
  2. Configure your compute environment to call the Amazon Bedrock API. Make sure to associate an IAM role with this environment whose IAM policy grants access to Amazon Bedrock.
  3. Once the instance is up and running, install the following libraries that will be used to work within the environment:
pip install langchain==0.2.0 langchain-experimental==0.0.59 langchain-community==0.2.0 langchain-aws==0.1.4 pyathena==3.8.2 sqlalchemy==2.0.30 streamlit==1.34.0
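
For step 1, the exact DDL depends on how your CUR export is configured. The following is a minimal sketch only: the column names are a small hypothetical subset of the real CUR schema, and the database and table names (mycur, my_c_u_r) and S3 paths match the placeholders used later in this post.

```python
# Hypothetical, minimal DDL for the CUR table used in this post. The real CUR
# schema has many more columns; adjust the columns and the S3 LOCATION to
# match your own CUR export before running these statements in Athena.
CREATE_DATABASE = "CREATE DATABASE IF NOT EXISTS mycur"

CREATE_TABLE = """
CREATE EXTERNAL TABLE IF NOT EXISTS mycur.my_c_u_r (
    line_item_usage_start_date timestamp,
    line_item_product_code string,
    line_item_usage_account_id string,
    line_item_unblended_cost double
)
STORED AS PARQUET
LOCATION 's3://cur-data-test01/cur/'
"""

# The statements could be submitted through a pyathena cursor, for example:
#   from pyathena import connect
#   cursor = connect(s3_staging_dir="s3://cur-data-test01/athena-query-result/",
#                    region_name="us-west-2").cursor()
#   cursor.execute(CREATE_DATABASE)
#   cursor.execute(CREATE_TABLE)
```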

  4. Use the following code to establish a connection to the Athena database using the langchain library and pyathena, and to configure the language model to generate SQL queries from user input using Amazon Bedrock. You can save this file as cur_lib.py.
from langchain_experimental.sql import SQLDatabaseChain
from langchain_community.utilities import SQLDatabase
from sqlalchemy import create_engine, URL
from langchain_aws import ChatBedrock as BedrockChat
from pyathena.sqlalchemy.rest import AthenaRestDialect

class CustomAthenaRestDialect(AthenaRestDialect):
    def import_dbapi(self):
        import pyathena
        return pyathena

# DB variables
connathena = "athena.us-west-2.amazonaws.com"
portathena = "443"
schemaathena = "mycur"
s3stagingathena = "s3://cur-data-test01/athena-query-result/"
wkgrpathena = "primary"
connection_string = f"awsathena+rest://@{connathena}:{portathena}/{schemaathena}?s3_staging_dir={s3stagingathena}/&work_group={wkgrpathena}"
url = URL.create("awsathena+rest", host=connathena, port=int(portathena), database=schemaathena, query={"s3_staging_dir": s3stagingathena, "work_group": wkgrpathena})
engine_athena = create_engine(url, dialect=CustomAthenaRestDialect(), echo=False)
db = SQLDatabase(engine_athena)

# Set up the LLM
model_kwargs = {"temperature": 0, "top_k": 250, "top_p": 1, "stop_sequences": ["\n\nHuman:"]}
llm = BedrockChat(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs=model_kwargs)

# Create the prompt
QUERY = """
Create a syntactically correct Athena query for the AWS Cost and Usage Report to run on the my_c_u_r table in the mycur database based on the question, then look at the results of the query and return the answer as SQLResult like a human
{question}
"""
db_chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)

def get_response(user_input):
    question = QUERY.format(question=user_input)
    result = db_chain.invoke(question)
    query = result["result"].split("SQLQuery:")[1].strip()
    rows = db.run(query)
    return f"SQLQuery: {query}\nSQLResult: {rows}"
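
The get_response function above relies on the chain's answer containing a literal SQLQuery: marker; everything after the marker is treated as the SQL statement to run. A standalone sketch of that parsing step (the chain output below is fabricated for illustration):

```python
# Sketch of the marker-based parsing used in get_response. The sample string
# below is a made-up stand-in for the text the SQLDatabaseChain returns.
sample_output = (
    "Question: total cost per service\n"
    "SQLQuery: SELECT line_item_product_code, SUM(line_item_unblended_cost) "
    "FROM my_c_u_r GROUP BY line_item_product_code"
)

# Everything after the marker is the SQL statement to run against Athena.
query = sample_output.split("SQLQuery:")[1].strip()
print(query)  # -> SELECT line_item_product_code, ...
```

Note that this parsing raises an IndexError if the model's answer lacks the SQLQuery: marker, which is why the Streamlit code later wraps the call in a try/except block.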

  5. Create a Streamlit web application to provide a UI for interacting with your LangChain application. Include an input field for users to enter natural language questions and view the generated SQL queries and query results. You can name this file cur_app.py.
import streamlit as st
from cur_lib import get_response
import os

st.set_page_config(page_title="AWS Cost and Usage Chatbot", page_icon="chart_with_upwards_trend", layout="centered", initial_sidebar_state="auto",
menu_items={
        'Get Help': 'https://docs.aws.amazon.com/cur/latest/userguide/cur-create.html',
        #'Report a bug':,
        'About': "# The purpose of this app is to help you get a better understanding of your AWS Cost and Usage report!"
    })#HTML title
st.title("_:orange[Simplify] CUR data_ :sunglasses:")

def format_result(result):
    parts = result.split("\nSQLResult: ")
    if len(parts) > 1:
        sql_query = parts[0].replace("SQLQuery: ", "")
        sql_result = parts[1].strip("[]").split("), (")
        formatted_result = []
        for row in sql_result:
            formatted_result.append(tuple(item.strip("(),'") for item in row.split(", ")))
        return sql_query, formatted_result
    else:
        return result, []

def main():
    # Get the current directory
    current_dir = os.path.dirname(os.path.abspath(__file__))
    st.markdown('<div class="main">', unsafe_allow_html=True)
    st.title("AWS Cost and Usage chatbot")
    st.write("Ask a question about your AWS Cost and Usage Report:")
  6. Call the get_response function to connect the LangChain application with the Streamlit web application and display the SQL query and results in the Streamlit web application. Add the following code to the application code above.
# Create a session state variable to store the chat history
    if "chat_history" not in st.session_state:
        st.session_state.chat_history = []

    user_input = st.text_input("You:", key="user_input")

    if user_input:
        try:
            result = get_response(user_input)
            sql_query, sql_result = format_result(result)
            st.code(sql_query, language="sql")
            if sql_result:
                st.write("SQLResult:")
                st.table(sql_result)
            else:
                st.write(result)
            st.session_state.chat_history.append({"user": user_input, "bot": result})
            st.text_area("Conversation:", value="\n".join([f"You: {chat['user']}\nBot: {chat['bot']}" for chat in st.session_state.chat_history]), height=300)
        except Exception as e:
            st.error(str(e))

    st.markdown("</div>", unsafe_allow_html=True)

if __name__ == "__main__":
    main()
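
format_result recovers the query text and row tuples from the single string that get_response returns, in which db.run() has already stringified the rows. A self-contained sketch of the same parsing (the input string is fabricated):

```python
# Self-contained version of the parsing idea in format_result: split off the
# "SQLResult:" section and rebuild row tuples from the stringified list that
# db.run() returns. The raw string below is a made-up example.
raw = ("SQLQuery: SELECT product, cost FROM my_c_u_r\n"
       "SQLResult: [('AmazonEC2', '12.5'), ('AmazonS3', '3.1')]")

header, result_part = raw.split("\nSQLResult: ")
sql_query = header.replace("SQLQuery: ", "")
rows = [tuple(item.strip("(),'") for item in chunk.split(", "))
        for chunk in result_part.strip("[]").split("), (")]

print(sql_query)  # -> SELECT product, cost FROM my_c_u_r
print(rows)       # -> [('AmazonEC2', '12.5'), ('AmazonS3', '3.1')]
```

This string-based parsing is deliberately simple and therefore brittle: values that themselves contain commas or parentheses will split incorrectly, which is acceptable for a demo but worth hardening for production use.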

  7. Deploy the Streamlit and LangChain applications to a hosting environment, such as an Amazon EC2 instance or a Lambda function.
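
Before deploying, you can run the app locally to check that the two files work together (this assumes both files sit in the current directory and your AWS credentials are configured):

```shell
# Launch the chatbot locally; Streamlit serves it on http://localhost:8501 by default.
streamlit run cur_app.py
```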

Clean up

As long as this solution doesn't call Amazon Bedrock, there are no charges. To avoid ongoing Amazon S3 storage charges for your CUR reports, delete your CUR data and S3 bucket. If you used Amazon EC2 to set up the solution, make sure to stop or delete the instance when you're finished.

Benefits

This solution provides the following benefits:

  • Simplified data analysis – Use generative AI to analyze CUR data in natural language, eliminating the need for advanced SQL knowledge.
  • Improved accessibility – The web-based interface lets non-technical users access and gain insights into CUR data without needing database credentials.
  • Time savings – Get instant answers to your cost and usage questions without having to manually write complex SQL queries.
  • Improved visibility – The solution provides visibility into AWS costs and usage, enabling better cost optimization and resource management decisions.

Summary

The AWS CUR chatbot solution uses Anthropic Claude on Amazon Bedrock to generate SQL queries, runs them against the database, and presents a user-friendly web interface to simplify the analysis of CUR data. The solution lets you ask questions in natural language, removing barriers and enabling both technical and non-technical users to gain valuable insights into their AWS costs and resource usage. This helps organizations make more informed decisions, optimize their cloud spend, and improve overall resource utilization. We recommend that you exercise due diligence when setting this up, especially in a production environment. You can choose other programming languages and frameworks to set it up depending on your preferences and needs.

Amazon Bedrock makes it easy to build powerful generative AI applications. Follow our quick start guide on GitHub to accelerate your development. You can also use Amazon Bedrock knowledge bases to rapidly develop state-of-the-art Retrieval Augmented Generation (RAG) solutions, or use Amazon Bedrock agents to enable your generative AI applications to execute multi-step tasks across enterprise systems and data sources.


About the Author

Anuthosh
I'm a Solutions Architect at AWS India. I dive deep into customer use cases and help them smooth their journey on AWS. I enjoy helping customers by building solutions in the cloud. I'm passionate about migration & modernization, data analytics, resiliency, cybersecurity, and machine learning.
