AI agents continue to gain momentum as businesses use the power of generative AI to reinvent customer experiences and automate complex workflows. We’re seeing Amazon Bedrock Agents applied in investment research, insurance claims processing, root cause analysis, advertising campaigns, and much more. Agents use the reasoning capability of foundation models (FMs) to break down user-requested tasks into multiple steps. They use developer-provided instructions to create an orchestration plan, then carry out that plan by securely invoking company APIs and accessing knowledge bases using Retrieval Augmented Generation (RAG) to accurately handle the user’s request.
Although organizations see the benefit of agents that are defined, configured, and tested as managed resources, we have increasingly seen the need for an additional, more dynamic way to invoke agents. Organizations need solutions that adjust on the fly, whether to test new approaches, respond to changing business rules, or customize solutions for different clients. This is where the new inline agents capability in Amazon Bedrock Agents becomes transformative. It allows you to dynamically adjust your agent’s behavior at runtime by changing its instructions, tools, guardrails, knowledge bases, prompts, and even the FMs it uses, all without redeploying your application.
In this post, we explore how to build an application using Amazon Bedrock inline agents, demonstrating how a single AI assistant can adapt its capabilities dynamically based on user roles.
Inline agents in Amazon Bedrock Agents
The runtime flexibility enabled by inline agents opens powerful new possibilities, such as:
- Rapid prototyping – Inline agents minimize the time-consuming create/update/prepare cycles traditionally required for agent configuration changes. Developers can instantly test different combinations of models, tools, and knowledge bases, dramatically accelerating the development process.
- A/B testing and experimentation – Data science teams can systematically evaluate different model-tool combinations, measure performance metrics, and analyze response patterns in controlled environments. This empirical approach enables quantitative comparison of configurations before production deployment.
- Subscription-based personalization – Software companies can adapt features based on each customer’s subscription level, providing more advanced tools for premium users.
- Persona-based data source integration – Institutions can adjust content complexity and tone based on the user’s profile, providing persona-appropriate explanations and sources by changing the knowledge bases associated with the agent on the fly.
- Dynamic tool selection – Developers can create applications with hundreds of APIs, and quickly and accurately carry out tasks by dynamically choosing a small subset of APIs for the agent to consider for a given request. This is particularly useful for large software as a service (SaaS) platforms that need multi-tenant scaling.
Inline agents expand your options for building and deploying agentic solutions with Amazon Bedrock Agents. For workloads that need managed and versioned agent resources with a predetermined and tested configuration (specific model, instructions, tools, and so on), developers can continue to use InvokeAgent on resources created with CreateAgent. For workloads that need dynamic runtime behavior changes on each agent invocation, you can use the new InvokeInlineAgent API. With either approach, your agents remain secure and scalable, with configurable guardrails, a flexible set of model inference options, native access to knowledge bases, code interpretation, session memory, and more.
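At the API level, the difference is that every aspect of the agent travels in the request itself. The following minimal sketch assembles the keyword arguments for the `invoke_inline_agent` operation of the `bedrock-agent-runtime` client; the helper function, session values, and model ID are illustrative and not part of the service.

```python
# Sketch: assembling an InvokeInlineAgent request. The agent's model,
# instructions, and tools are plain request parameters, so they can change
# on every call without updating a managed agent resource.

def build_inline_agent_request(session_id, user_input, tools, model_id, instruction):
    """Build keyword arguments for bedrock-agent-runtime's invoke_inline_agent."""
    return {
        "sessionId": session_id,       # conversation state is keyed by session
        "foundationModel": model_id,   # swappable per request
        "instruction": instruction,    # swappable per request
        "actionGroups": tools,         # only the tools this caller should see
        "inputText": user_input,
        "enableTrace": True,           # surface the agent's reasoning steps
    }

# The invocation itself requires AWS credentials; shown for illustration only:
# client = boto3.client("bedrock-agent-runtime")
# response = client.invoke_inline_agent(**build_inline_agent_request(...))
# completion = "".join(
#     event["chunk"]["bytes"].decode("utf-8")
#     for event in response["completion"] if "chunk" in event
# )
```

Because the request is just data, two calls in the same session can use different models or tool sets, which is the mechanism the rest of this post relies on.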
Solution overview
Our HR assistant example shows how to build a single AI assistant that adapts to different user roles using the new inline agent capabilities in Amazon Bedrock Agents. When users interact with the assistant, the assistant dynamically configures agent capabilities (such as model, instructions, knowledge bases, action groups, and guardrails) based on the user’s role and their specific selections. This approach creates a flexible system that adjusts its functionality in real time, making it more efficient than creating separate agents for each user role or tool combination. The complete code for this HR assistant example is available in our GitHub repo.
This dynamic tool selection enables a personalized experience. When an employee without direct reports logs in, they see the set of tools they have access to based on their role. They can select from options like requesting vacation time, checking company policies using the knowledge base, using a code interpreter for data analysis, or submitting expense reports. The inline agent assistant is then configured with only these selected tools, allowing it to assist the employee with their chosen tasks. In a real-world application, the user would not need to make this selection, because the application would make that decision and automatically configure the agent invocation at runtime. We make it explicit in this application in order to demonstrate the impact.
Similarly, when a manager logs in to the same system, they see an extended set of tools reflecting their additional permissions. In addition to the employee-level tools, managers have access to capabilities like running performance evaluations. They can select which tools they want to use for their current session, instantly configuring the inline agent with their choices.
The inclusion of knowledge bases is also adjusted based on the user’s role. Employees and managers see different levels of company policy information, with managers getting additional access to confidential data like performance review and compensation details. For this demo, we implemented metadata filtering to retrieve only the appropriate level of documents based on the user’s access level, further enhancing efficiency and security.
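A sketch of that filtering, assuming a hypothetical `access_level` metadata key on each ingested document: the `knowledgeBases` parameter of an inline agent invocation carries a vector search filter, so each role retrieves only its tier of documents. The knowledge base ID is a placeholder.

```python
# Sketch: role-based metadata filtering for the knowledgeBases parameter of an
# inline agent call. The filter shape follows the Knowledge Bases vector search
# configuration; the "access_level" key is this demo's own metadata convention.

def knowledge_base_config(kb_id, role):
    # Managers may retrieve both document tiers; employees only the general tier.
    allowed = ["employee", "manager"] if role == "manager" else ["employee"]
    return [{
        "knowledgeBaseId": kb_id,
        "description": "Company policies and guidelines",
        "retrievalConfiguration": {
            "vectorSearchConfiguration": {
                "numberOfResults": 5,
                # Only documents whose access_level metadata matches the
                # caller's role tier are candidates for retrieval.
                "filter": {"in": {"key": "access_level", "value": allowed}},
            }
        },
    }]
```

Because the filter is applied at retrieval time, confidential documents never reach the model’s context for an employee session, rather than relying on the model to withhold them.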
Let’s look at how the interface adapts to different user roles.
The employee view provides access to essential HR functions like vacation requests, expense submissions, and company policy lookups. Users can select which of these tools they want to use for their current session.
The manager view extends these options to include supervisory functions like compensation management, demonstrating how the inline agent can be configured with a broader set of tools based on user permissions.

Without inline agents, we would need to build and maintain two separate agents to offer this differentiation.
As shown in the preceding screenshots, the same HR assistant offers different tool selections based on the user’s role. An employee sees options like Knowledge Base, Apply Vacation Tool, and Submit Expense, while a manager has additional options like Performance Evaluation. Users can select which tools they want to add to the agent for their current interaction.
This flexibility allows for quick adaptation to user needs and preferences. For instance, if the company introduces a new policy for creating business travel requests, the tool catalog can be quickly updated to include a Create Business Travel Reservation tool. Employees can then choose to add this new tool to their agent configuration when they need to plan a business trip, or the application could do so automatically based on their role.
With Amazon Bedrock inline agents, you can create a catalog of actions that is dynamically selected by the application or by users of the application. This increases the flexibility and adaptability of your solutions, making them a great fit for navigating the complex, ever-changing landscape of modern business operations. Users have more control over their AI assistant’s capabilities, and the system stays efficient by loading only the necessary tools for each interaction.
Technical foundation: Dynamic configuration and action selection
Inline agents allow dynamic configuration at runtime, enabling a single agent to effectively perform the work of many. By specifying action groups and modifying instructions on the fly, even within the same session, you can create versatile AI applications that adapt to various scenarios without multiple agent deployments.
The following are key points about inline agents:
- Runtime configuration – Change the agent’s configuration, including its FM, at runtime. This enables rapid experimentation and adaptation without redeploying the application, reducing development cycles.
- Governance at tool level – Apply governance and access control at the tool level. With agents changing dynamically at runtime, tool-level governance helps maintain security and compliance regardless of the agent’s configuration.
- Agent efficiency – Provide only the necessary tools and instructions at runtime to reduce token usage and improve agent accuracy. With fewer tools to choose from, it is simpler for the agent to select the right one, reducing hallucinations in the tool selection process. This approach can also lead to lower costs and improved latency compared to static agents, because removing unnecessary tools, knowledge bases, and instructions reduces the number of input and output tokens processed by the agent’s large language model (LLM).
- Flexible action catalog – Create reusable actions for dynamic selection based on specific needs. This modular approach simplifies maintenance, updates, and scaling of your AI applications.
The following are examples of reusable actions:
- Enterprise system integration – Connect with systems like Salesforce, GitHub, or databases
- Utility tools – Perform common tasks such as sending emails or managing calendars
- Team-specific API access – Interact with specialized internal tools and services
- Data processing – Analyze text, structured data, or other information
- External services – Fetch weather updates, stock prices, or perform web searches
- Specialized ML models – Use specific machine learning (ML) models for targeted tasks
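Such a catalog can be kept as plain data: each entry is an action group definition in the shape accepted by the inline agent’s `actionGroups` parameter, and the application picks a subset per request. The action names, Lambda ARNs, and function schemas below are hypothetical examples.

```python
# Sketch: a reusable action catalog. Each entry follows the action group
# structure of the InvokeInlineAgent request; only the selected subset is
# sent with a given invocation.

ACTION_CATALOG = {
    "send_email": {
        "actionGroupName": "SendEmail",
        "actionGroupExecutor": {
            "lambda": "arn:aws:lambda:us-east-1:111122223333:function:send-email"},
        "functionSchema": {"functions": [{
            "name": "send_email",
            "description": "Send an email to a colleague",
            "parameters": {
                "recipient": {"type": "string", "description": "Email address", "required": True},
                "body": {"type": "string", "description": "Message body", "required": True},
            },
        }]},
    },
    "get_weather": {
        "actionGroupName": "GetWeather",
        "actionGroupExecutor": {
            "lambda": "arn:aws:lambda:us-east-1:111122223333:function:get-weather"},
        "functionSchema": {"functions": [{
            "name": "get_weather",
            "description": "Fetch the current weather for a city",
            "parameters": {
                "city": {"type": "string", "description": "City name", "required": True}},
        }]},
    },
}

def select_actions(names):
    """Pick a small subset of the catalog for one agent invocation."""
    return [ACTION_CATALOG[n] for n in names if n in ACTION_CATALOG]
```

Keeping the catalog as data means adding a new tool is a catalog entry plus its Lambda function, with no change to the agent invocation code.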
When using inline agents, you configure parameters for the following:
- Contextual tool selection based on user intent or conversation flow
- Adaptation to different user roles and permissions
- Switching between communication styles or personas
- Model selection based on task complexity
The inline agent uses the configuration you provide at runtime, allowing for highly flexible AI assistants that efficiently handle diverse tasks across different business contexts.
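The role- and persona-driven configuration above is ordinary application code that runs before each invocation. A minimal sketch, using tool and persona names from this post’s demo (the exact identifiers and instruction strings are illustrative):

```python
# Sketch: map roles to permitted tools and personas to instruction text, then
# validate the user's selections before configuring the inline agent.

EMPLOYEE_TOOLS = {"knowledge_base", "apply_vacation", "expense_report", "code_interpreter"}
ROLE_TOOLS = {
    "employee": EMPLOYEE_TOOLS,
    "manager": EMPLOYEE_TOOLS | {"compensation_management", "performance_evaluation"},
}
PERSONA_INSTRUCTIONS = {
    "professional": "You are a formal, precise HR assistant.",
    "casual": "You are a friendly, approachable HR assistant.",
    "enthusiastic": "You are an upbeat, encouraging HR assistant.",
}

def configure_session(role, requested_tools, persona):
    allowed = ROLE_TOOLS[role]
    # Drop any tool the role is not entitled to, rather than erroring out:
    # the agent is only ever configured with permitted tools.
    granted = [t for t in requested_tools if t in allowed]
    return {"tools": granted, "instruction": PERSONA_INSTRUCTIONS[persona]}
```

Because enforcement happens in the backend before the request is built, a user cannot grant themselves a tool by manipulating the UI.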
Building an HR assistant using inline agents
Let’s look at how we built our HR assistant using Amazon Bedrock inline agents:
- Create a tool catalog – We developed a demo catalog of HR-related tools, including:
- Knowledge Base – Uses Amazon Bedrock Knowledge Bases to access company policies and guidelines based on the role of the application user. To filter knowledge base content by the user’s role, you also need to provide a metadata file specifying which employee roles can access each file.
- Apply Vacation – For requesting and tracking time off.
- Expense Report – For submitting and managing expense reports.
- Code Interpreter – For performing calculations and data analysis.
- Compensation Management – For conducting and reviewing employee compensation assessments (manager-only access).
- Set conversation tone – We defined multiple conversation tones to suit different interaction styles:
- Professional – For formal, business-like interactions.
- Casual – For friendly, everyday support.
- Enthusiastic – For upbeat, encouraging assistance.
- Implement access control – We implemented role-based access control. The application backend checks the user’s role (employee or manager), grants access to the appropriate tools and information, and passes this information to the inline agent. The role information is also used to configure metadata filtering in the knowledge bases to generate relevant responses. The system allows for dynamic tool use at runtime: users can change personas or add and remove tools during their session, allowing the agent to adapt to different conversation needs in real time.
- Integrate the agent with other services and tools – We connected the inline agent to:
- Amazon Bedrock Knowledge Bases for company policies, with metadata filtering for role-based access.
- AWS Lambda functions for executing specific actions (such as submitting vacation requests or expense reports).
- A code interpreter tool for performing calculations and data analysis.
- Create the UI – We created a Flask-based UI that performs the following actions:
- Displays available tools based on the user’s role.
- Allows users to select different personas.
- Provides a chat window for interacting with the HR assistant.
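The integrations in the steps above become entries in the `actionGroups` list passed to the inline agent. In the following sketch, a Lambda-backed action group declares the function the agent may call, while the managed code interpreter is enabled through the `AMAZON.CodeInterpreter` parent action group signature; the Lambda ARN and function schema are placeholders.

```python
# Sketch: wiring a Lambda-backed tool and the built-in code interpreter into
# one actionGroups list for an inline agent invocation.

def hr_action_groups(vacation_lambda_arn):
    return [
        {
            "actionGroupName": "ApplyVacation",
            # The agent invokes this Lambda function when it decides to
            # call apply_vacation.
            "actionGroupExecutor": {"lambda": vacation_lambda_arn},
            "functionSchema": {"functions": [{
                "name": "apply_vacation",
                "description": "Submit a vacation request",
                "parameters": {
                    "start_date": {"type": "string",
                                   "description": "First day off (YYYY-MM-DD)",
                                   "required": True},
                    "days": {"type": "integer",
                             "description": "Number of days requested",
                             "required": True},
                },
            }]},
        },
        {
            # Built-in tool: no executor or schema, only the parent signature.
            "actionGroupName": "CodeInterpreter",
            "parentActionGroupSignature": "AMAZON.CodeInterpreter",
        },
    ]
```

The same list shape accepts as many Lambda-backed groups as the session needs, which is how the manager-only compensation tool is appended for manager sessions.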
To understand how this dynamic role-based functionality works under the hood, let’s examine the following system architecture diagram.

As shown in the preceding architecture diagram, the system works as follows:
- The end user logs in and is identified as either a manager or an employee.
- The user selects the tools that they have access to and makes a request to the HR assistant.
- The agent breaks the problem down and uses the available tools to resolve the query in steps, which may include:
- Amazon Bedrock Knowledge Bases (with metadata filtering for role-based access).
- Lambda functions for specific actions.
- A code interpreter tool for calculations.
- A compensation tool (accessible only to managers, for submitting base pay raise requests).
- The application uses the Amazon Bedrock inline agent to dynamically pass in the appropriate tools based on the user’s role and request.
- The agent uses the selected tools to process the request and provide a response to the user.
This approach provides a flexible, scalable solution that can quickly adapt to different user roles and changing business needs.
Conclusion
In this post, we introduced the Amazon Bedrock inline agent functionality and highlighted its application to an HR use case. We dynamically selected tools based on the user’s role and permissions, adapted instructions to set a conversation tone, and selected different models at runtime. With inline agents, you can transform how you build and deploy AI assistants. By dynamically adapting tools, instructions, and models at runtime, you can:
- Create personalized experiences for different user roles
- Optimize costs by matching model capabilities to task complexity
- Streamline development and maintenance
- Scale efficiently without managing multiple agent configurations
For organizations that demand highly dynamic behavior, whether you’re an AI startup, a SaaS provider, or an enterprise solution team, inline agents offer a scalable approach to building intelligent assistants that grow with your needs. To get started, explore our GitHub repo and HR assistant demo application, which demonstrate key implementation patterns and best practices.
To learn more about how to be most successful in your agent journey, read our two-part blog series:
To get started with Amazon Bedrock Agents, check out the following GitHub repository with example code.
About the authors
Ishan Singh is a Generative AI Data Scientist at Amazon Web Services, where he helps customers build innovative and responsible generative AI solutions and products. With a strong background in AI/ML, Ishan specializes in building generative AI solutions that drive business value. Outside of work, he enjoys playing volleyball, exploring local bike trails, and spending time with his wife and dog, Beau.
Maira Ladeira Tanke is a Senior Generative AI Data Scientist at AWS. With a background in machine learning, she has over 10 years of experience architecting and building AI applications with customers across industries. As a technical lead, she helps customers accelerate their achievement of business value through generative AI solutions on Amazon Bedrock. In her free time, Maira enjoys traveling, playing with her cat, and spending time with her family somewhere warm.
Mark Roy is a Principal Machine Learning Architect for AWS, helping customers design and build generative AI solutions. His focus since early 2023 has been leading solution architecture efforts for the launch of Amazon Bedrock, the flagship generative AI offering from AWS for builders. Mark’s work covers a wide range of use cases, with a primary interest in generative AI, agents, and scaling ML across the enterprise. He has helped companies in insurance, financial services, media and entertainment, healthcare, utilities, and manufacturing. Prior to joining AWS, Mark was an architect, developer, and technology leader for over 25 years, including 19 years in financial services. Mark holds six AWS certifications, including the ML Specialty Certification.
Nitin Eusebius is a Sr. Enterprise Solutions Architect at AWS, experienced in software engineering, enterprise architecture, and AI/ML. He is deeply passionate about exploring the possibilities of generative AI. He collaborates with customers to help them build well-architected applications on the AWS platform, and is dedicated to solving technology challenges and assisting with their cloud journey.
Ashrith Chirutani is a Software Development Engineer at Amazon Web Services (AWS). He specializes in backend system design, distributed architectures, and scalable solutions, contributing to the development and launch of high-impact systems at Amazon. Outside of work, he spends his time playing ping pong and hiking through Cascade trails, enjoying the outdoors as much as he enjoys building systems.
Shubham Divekar is a Software Development Engineer at Amazon Web Services (AWS), working on Agents for Amazon Bedrock. He focuses on developing scalable systems on the cloud that enable AI application frameworks and orchestrations. Shubham also has a background in building distributed, scalable, high-volume, high-throughput systems in IoT architectures.
Vivek Bhadauria is a Principal Engineer for Amazon Bedrock. He focuses on building deep learning-based AI and computer vision solutions for AWS customers. Outside of work, Vivek enjoys trekking and following cricket.

