This post was cowritten by Rishi Srivastava and Scott Reynolds from Clarus Care.
Many healthcare practices today struggle to manage high volumes of patient calls efficiently. From appointment scheduling and prescription refills to billing inquiries and urgent medical concerns, practices face the challenge of providing timely responses while maintaining quality patient care. Traditional phone systems often lead to long hold times, frustrated patients, and overwhelmed staff who manually process and prioritize hundreds of calls daily. These communication bottlenecks not only impact patient satisfaction but can also delay critical care coordination.
In this post, we illustrate how Clarus Care, a healthcare contact center solutions provider, worked with the AWS Generative AI Innovation Center (GenAIIC) team to develop a generative AI-powered contact center prototype. This solution enables conversational interaction and multi-intent resolution through an automated voicebot and chat interface. It also incorporates a scalable service model to support growth, human transfer capabilities (when requested or for urgent cases), and an analytics pipeline for performance insights.
Clarus Care is a healthcare technology company that helps medical practices manage patient communication through an AI-powered call management system. By automatically transcribing, prioritizing, and routing patient messages, Clarus improves response times, reduces staff workload, and minimizes hold times. Clarus is the fastest growing healthcare call management company, serving over 16,000 users across 40+ specialties. The company handles 15 million patient calls annually and maintains a 99% client retention rate.
Use case overview
Clarus is embarking on an innovative journey to transform their patient communication system from a traditional menu-driven Interactive Voice Response (IVR) experience to a more natural, conversational one. The company aims to revolutionize how patients interact with healthcare providers by creating a generative AI-powered contact center capable of understanding and addressing multiple patient intents in a single interaction. Previously, patients navigated through rigid menu options to leave messages, which were then transcribed and processed. This approach, while functional, limits the system's ability to handle complex patient needs efficiently. Recognizing the need for a more intuitive and flexible solution, Clarus collaborated with the GenAIIC to develop an AI-powered contact center that can comprehend natural language conversation, manage multiple intents, and provide a seamless experience across both voice and web chat interfaces. Key success criteria for the project were:
- A natural language voice interface capable of understanding and processing multiple patient intents, such as billing questions, scheduling, and prescription refills, in a single call
- Less than 3 seconds of latency for backend processing and response to the user
- The ability to transcribe, record, and analyze call information
- Smart transfer capabilities for urgent calls or when patients request to speak directly with providers
- Support for both voice calls and web chat interfaces to accommodate varied patient preferences
- A scalable foundation to support Clarus's growing customer base and expanding network of healthcare facilities
- High availability, with a 99.99% SLA requirement, to ensure reliable patient communication
Solution overview and architecture
The GenAIIC team collaborated with Clarus to create a generative AI-powered contact center using Amazon Connect and Amazon Lex, integrated with Amazon Nova and Anthropic's Claude 3.5 Sonnet foundation models through Amazon Bedrock. Amazon Connect was chosen as the core system because of its ability to maintain 99.99% availability while providing comprehensive contact center capabilities across voice and chat channels.
The model flexibility of Amazon Bedrock is central to the system, allowing task-specific model selection based on accuracy and latency. Claude 3.5 Sonnet was used for its high-quality natural language understanding, and the Nova models offered low latency with comparable natural language understanding and generation capabilities. The following diagram illustrates the architecture of the main contact center solution:
The workflow consists of the following high-level steps:
- A patient initiates contact through either a phone call or the web chat interface.
- Amazon Connect processes the initial contact and routes it through a configured contact flow.
- Amazon Lex handles transcription and maintains conversation state.
- An AWS Lambda fulfillment function processes the conversation using the Claude 3.5 Sonnet and Nova models through Amazon Bedrock to:
- Classify urgency and intents
- Extract required information
- Generate natural responses
- Manage appointment scheduling when applicable
The models used for each specific function are described in the solution detail sections.
- Smart transfers to staff are initiated when urgent cases are detected or when patients request to speak with providers.
- Conversation data is processed through an analytics pipeline for monitoring and reporting (described later in this post).
Some of the challenges the team tackled during development included:
- Formatting the contact center call flow and service model so they are interchangeable across customers, with minimal code and configuration changes
- Managing latency requirements for a natural conversation experience
- Transcribing and understanding patient names
In addition to voice calls, the team developed a web interface using Amazon CloudFront and Amazon S3 static website hosting that demonstrates the system's multichannel capabilities. This interface shows how patients can engage in AI-powered conversations through a chat widget, providing the same level of service and functionality as voice calls. While the web interface demo uses the same contact flow as the voice call, it can be further customized with chat-specific language.

The team also built an analytics pipeline that processes conversation logs to provide valuable insights into system performance and patient interactions. A customizable dashboard presents a user-friendly interface for visualizing this data, allowing both technical and non-technical staff to gain actionable insights from patient communications. The analytics pipeline and dashboard were built using a previously published reusable generative AI contact center asset.

Conversation handling details
The solution employs a sophisticated conversation management system that orchestrates natural patient interactions through the multi-model capabilities of Amazon Bedrock and carefully designed prompt layering. At the heart of this system is Bedrock's ability to provide access to multiple foundation models, enabling the team to select the optimal model for each task based on accuracy, cost, and latency requirements. The flow of the conversation management system is shown in the following image (NLU stands for natural language understanding).

The conversation flow begins with a greeting and an urgency assessment. When a patient calls, the system immediately evaluates whether the situation requires urgent attention using Amazon Bedrock APIs. This first step makes sure that emergency cases are quickly identified and routed appropriately. The system uses a focused prompt that analyzes the patient's initial statement against a predefined list of urgent intent categories, returning either "urgent" or "non_urgent" to guide subsequent handling.
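As a rough sketch of this triage step, the prompt can be assembled and the model's reply normalized as follows. The category list, prompt wording, and the safety default are assumptions for illustration, not the production prompt:

```python
# Hypothetical sketch of the urgency triage step; categories and wording are
# illustrative assumptions, not Clarus's actual prompt.

URGENT_CATEGORIES = ["chest pain", "difficulty breathing", "severe bleeding"]

def build_urgency_prompt(patient_statement: str) -> str:
    """Compose the focused classification prompt described above."""
    categories = ", ".join(URGENT_CATEGORIES)
    return (
        "You are a triage classifier for a medical call center.\n"
        f"Urgent intent categories: {categories}.\n"
        f'Patient statement: "{patient_statement}"\n'
        "Respond with exactly one word: urgent or non_urgent."
    )

def parse_urgency(model_text: str) -> str:
    """Normalize the model reply; default to urgent on ambiguity for safety."""
    reply = model_text.strip().lower()
    return "non_urgent" if reply == "non_urgent" else "urgent"

# The prompt would be sent through Amazon Bedrock, e.g. with boto3
# (requires AWS credentials, so not executed here):
#   client = boto3.client("bedrock-runtime")
#   resp = client.converse(modelId=model_id, messages=[{"role": "user",
#       "content": [{"text": build_urgency_prompt(statement)}]}])
```

Defaulting any ambiguous reply to "urgent" is a deliberately conservative choice for a medical setting.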
Following this, the system moves on to intent detection. A key innovation here is the system's ability to process multiple intents within a single interaction. Rather than forcing patients through rigid menu trees, the system leverages powerful language models to understand when a patient mentions both a prescription refill and a billing question, queuing these intents for sequential processing while maintaining a natural conversation flow. During this extraction, we make sure that both the intent and the supporting quote from the user input are extracted. This produces two benefits:
- Built-in model reasoning that helps the correct intent be extracted
- A conversation history reference for each extraction, so the same intent is not extracted twice unless explicitly requested
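A minimal sketch of this quote-backed deduplication might look like the following; the JSON response shape and field names are assumptions for the example:

```python
import json

# Illustrative post-processing of the intent-extraction model output. The
# {"intent": ..., "quote": ...} shape is an assumed response format.

def parse_new_intents(model_json: str, already_extracted: set) -> list:
    """Keep only intents backed by a quote and not already in the history."""
    candidates = json.loads(model_json)
    new_intents = []
    for item in candidates:
        intent, quote = item.get("intent"), item.get("quote")
        if not intent or not quote:
            continue  # reject extractions with no supporting quote
        if intent in already_extracted:
            continue  # the same intent is not extracted twice
        new_intents.append({"intent": intent, "quote": quote})
        already_extracted.add(intent)
    return new_intents
```

Carrying the quote alongside the intent is what lets a later turn mentioning "my refill" again be recognized as the same, already-queued need rather than a new one.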
Once the system begins processing intents sequentially, it prompts the user for the data required to service the intent at hand. This happens in two interdependent stages:
- Checking for missing information fields and generating a natural language prompt to ask the user for them
- Parsing user utterances to extract the collected fields and identify the fields that are still missing
These two steps repeat in a loop until the required information is collected. The system also considers provider-specific services at this stage, collecting the fields required per provider. The solution automatically matches provider names mentioned by patients to the correct provider in the system. This handles variations like "Dr. Smith" matching "Dr. Jennifer Smith" or "Jenny Smith," removing the rigid name matching or extension requirements of traditional IVR systems.

The solution also includes smart handoff capabilities. When the system needs to determine whether a patient should speak with a specific provider, it analyzes the conversation context to consider urgency and routing needs for the expressed intent. This process preserves the conversation context and collected information, facilitating a seamless experience when human intervention is requested.

Throughout the conversation, the system maintains comprehensive state tracking through Amazon Lex session attributes, while the natural language processing occurs through Amazon Bedrock model invocations. These attributes serve as the conversation's memory, storing everything from the user's collected information and conversation history to detected intents. This state management enables the system to maintain context across multiple Bedrock API calls, creating a more natural dialogue flow.
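To illustrate the provider-name-matching idea, a simple similarity-based version could look like the sketch below. The provider roster and threshold are invented for the example, and the deployed system resolves names through model calls rather than plain string similarity:

```python
from difflib import SequenceMatcher

# A minimal sketch of provider-name matching with an assumed roster; the real
# solution uses model calls, this shows the same idea with string similarity.

PROVIDERS = ["Dr. Jennifer Smith", "Dr. Raj Patel", "Dr. Maria Gonzalez"]
TITLES = {"dr", "doctor"}

def _tokens(name: str) -> set:
    """Lowercase name tokens with punctuation and titles stripped."""
    words = name.lower().replace(".", "").split()
    return {w for w in words if w not in TITLES}

def match_provider(spoken_name, providers=PROVIDERS, threshold=0.5):
    """Return the best-matching provider, or None if nothing is close enough."""
    def score(candidate):
        spoken, cand = _tokens(spoken_name), _tokens(candidate)
        token_overlap = len(spoken & cand) / max(len(spoken), 1)
        fuzzy = SequenceMatcher(None, spoken_name.lower(), candidate.lower()).ratio()
        return max(token_overlap, fuzzy)
    best = max(providers, key=score)
    return best if score(best) >= threshold else None
```

For example, "Dr. Smith" and "Jenny Smith" both resolve to "Dr. Jennifer Smith", while a name that matches no one falls through to None so the system can ask the caller to clarify.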
Intent management
The intent management system is built around a hierarchical service model structure that reflects how patients naturally express their needs. To traverse this hierarchical service model, user inputs are parsed using natural language understanding, handled through Amazon Bedrock API calls.
The hierarchical service model organizes intents into three primary levels:
- Urgency level: Separating urgent from non-urgent services enables appropriate handling and routing
- Service level: Grouping related services like appointments, prescriptions, and billing creates logical categories
- Provider-specific level: Further granularity accommodates provider-specific requirements and sub-services
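An illustrative slice of such a hierarchical service model, with service names and required fields invented for the example, could be encoded and traversed like this:

```python
# An assumed slice of the hierarchical service model; names and required
# fields are illustrative, not Clarus's actual schema.

SERVICE_MODEL = {
    "urgent": {
        "chest_pain": {"route_to": "on_call_provider"},
    },
    "non_urgent": {
        "appointments": {
            "schedule": {"required_fields": ["patient_name", "dob", "preferred_time"]},
        },
        "prescriptions": {
            "refill": {"required_fields": ["patient_name", "dob", "medication"]},
        },
        "billing": {
            "question": {"required_fields": ["patient_name", "invoice_number"]},
        },
    },
}

def lookup_intent(path: list) -> dict:
    """Walk the levels (urgency -> service -> sub-service) down to an intent."""
    node = SERVICE_MODEL
    for key in path:
        node = node[key]
    return node
```

Because each leaf carries its own required fields and routing hints, per-facility customization reduces to editing this configuration rather than changing code.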
This structure enables the system to efficiently navigate the possible intents while remaining flexible for customization across different healthcare facilities. Each intent in the model includes custom instructions that can be dynamically injected into Bedrock prompts, allowing for highly configurable behavior without code changes.

The intent extraction process leverages the advanced language understanding capabilities of Bedrock through a prompt that instructs the model to identify the intents present in a patient's natural language input. The prompt includes comprehensive instructions about what constitutes a new intent, the complete list of possible intents, and formatting requirements for the response. Rather than forcing classification into a single intent, we detect multiple needs expressed simultaneously.

Once intents are identified, they are added to a processing queue. The system then works through each intent sequentially, making additional model calls in multiple layers to collect the required information through natural conversation. To optimize for both quality and latency, the solution leverages the model selection flexibility of Bedrock across the conversation tasks:
- Intent extraction uses Anthropic's Claude 3.5 Sonnet through Bedrock for detailed analysis that can identify multiple intents from natural language, making sure patients don't have to repeat information.
- Information collection employs a faster model, Amazon Nova Pro, through Bedrock for structured data extraction while maintaining a conversational tone.
- Response generation uses a smaller model, Amazon Nova Lite, through Bedrock to create low-latency, natural, and empathetic responses based on the conversation state.
This approach helps make sure that the solution can:
- Maintain a conversational tone and empathy
- Ask only for the specific missing information
- Acknowledge information already provided
- Handle special cases like spelling out names
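The task-to-model routing above can be captured in a small lookup, sketched below. The Bedrock model IDs shown are the publicly documented identifiers and may vary by region or version:

```python
# Task-to-model routing as described above; model IDs are the documented
# Bedrock identifiers and may differ by region or version.

MODEL_FOR_TASK = {
    "intent_extraction": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "information_collection": "amazon.nova-pro-v1:0",
    "response_generation": "amazon.nova-lite-v1:0",
}

def model_for(task: str) -> str:
    """Pick the model for a conversation task, defaulting to the strongest one."""
    return MODEL_FOR_TASK.get(task, MODEL_FOR_TASK["intent_extraction"])
```

Keeping the routing in one mapping means a latency-vs-accuracy tradeoff for any task can be changed in a single line as new models arrive.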
The entire intent management pipeline benefits from the unified Bedrock Converse API, which provides:
- A consistent interface across model calls, simplifying development and maintenance
- Model version control that keeps behavior stable across deployments
- A future-proof architecture allowing seamless adoption of new models as they become available
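That consistent interface shows up as a single request shape that serves every model. The sketch below builds the kwargs for the `bedrock-runtime` client's `converse()` call; the inference settings are illustrative defaults:

```python
# Sketch of the unified Converse API request shape; only the modelId changes
# between tasks, so swapping models never changes the calling code.

def build_converse_request(model_id: str, system_prompt: str, user_text: str) -> dict:
    """Assemble the kwargs for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "system": [{"text": system_prompt}],
        "messages": [{"role": "user", "content": [{"text": user_text}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

# With boto3 this would run as (requires AWS credentials, so not executed here):
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**build_converse_request(model_id, sys_p, text))
#   reply = response["output"]["message"]["content"][0]["text"]
```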
By implementing this hierarchical intent management system, Clarus can offer patients a more natural and efficient communication experience while maintaining the structure needed for accurate routing and information collection. Combining the multi-model capabilities of Bedrock with a configurable service model allows simple customization per healthcare facility while keeping the core conversation logic consistent and maintainable. As new models become available in Bedrock, the system can be updated to leverage improved capabilities without major architectural changes, supporting long-term scalability and performance optimization.
Scheduling
The scheduling component of the solution is handled in a separate, purpose-built module. If an 'appointment' intent is detected in the main handler, processing is handed off to the scheduling module. The module operates as a state machine consisting of conversation states and next steps. The overall flow of the scheduling system is shown below:
There are three main LLM prompts used in the scheduling flow:
- Extract time preferences (Nova Lite is used for low latency and user preference understanding)
- Determine whether the user is confirming or denying a time (Nova Micro is used for low latency on a simple task)
- Generate a natural response based on the next step (Nova Lite is used for low latency and response generation)
The possible next steps are:
- Let the user know the system will escalate to the office
- End the conversation with a booking confirmation
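A compact sketch of such a state machine, with state and event names assumed to mirror the flow described above, could look like:

```python
# Illustrative scheduling state machine; state/event names are assumptions.

TRANSITIONS = {
    ("collect_time", "time_extracted"): "confirm_time",
    ("collect_time", "no_time_understood"): "collect_time",
    ("confirm_time", "confirmed"): "book_and_close",       # booking confirmation
    ("confirm_time", "denied"): "collect_time",
    ("confirm_time", "unavailable"): "escalate_to_office",  # hand off to the office
}

def next_state(state: str, event: str) -> str:
    """Advance the scheduling conversation; unknown events keep the state."""
    return TRANSITIONS.get((state, event), state)
```

Modeling the flow as an explicit transition table keeps the LLM prompts responsible only for classifying the caller's utterance into an event, while the deterministic table decides what happens next.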
System extensions
In the future, Clarus can integrate the contact center's voicebot with Amazon Nova Sonic. Nova Sonic is a speech-to-speech model that delivers real-time, human-like voice conversations with leading price performance and low latency. Nova Sonic is now directly integrated with Amazon Connect.
Amazon Bedrock offers several additional services that help with scaling the solution and deploying it to production.
Conclusion
In this post, we demonstrated how the GenAIIC team collaborated with Clarus Care to develop a generative AI-powered healthcare contact center using Amazon Connect, Amazon Lex, and Amazon Bedrock. The solution showcases a conversational voice interface capable of handling multiple patient intents, managing appointment scheduling, and providing smart transfer capabilities. By leveraging the Amazon Nova and Anthropic Claude 3.5 Sonnet language models and AWS services, the system achieves high availability while offering a more intuitive and efficient patient communication experience. The solution also incorporates an analytics pipeline for monitoring call quality and metrics, as well as a web interface demonstrating multichannel support. The solution's architecture provides a scalable foundation that can adapt to Clarus Care's growing customer base and future service offerings.

The transition from a traditional menu-driven IVR to an AI-powered conversational interface enables Clarus to help improve the patient experience, increase automation capabilities, and streamline healthcare communications. As they move toward implementation, this solution will empower Clarus Care to meet the evolving needs of both patients and healthcare providers in an increasingly digital healthcare landscape.
If you want to implement a similar solution for your use case, consider the blog post Deploy generative AI agents in your contact center for voice and chat using Amazon Connect, Amazon Lex, and Amazon Bedrock Knowledge Bases for the infrastructure setup.
About the authors
Rishi Srivastava is the VP of Engineering at Clarus Care. He is a seasoned industry leader with over 20 years in enterprise software engineering, specializing in the design of multi-tenant cloud-based SaaS architectures and conversational AI agentic solutions for patient engagement. Previously, he worked in financial services and quantitative finance, building latent factor models for sophisticated portfolio analytics to drive data-informed investment strategies.
Scott Reynolds is the VP of Product at Clarus Care, a healthcare SaaS communications and AI-powered patient engagement platform. He has spent over 25 years in the technology and software market creating secure, interoperable platforms that streamline clinical and operational workflows. He has founded multiple startups and holds a U.S. patent for patient-centric communication technology.
Brian Halperin joined AWS in 2024 as a GenAI Strategist in the Generative AI Innovation Center, where he helps enterprise customers unlock transformative business value through artificial intelligence. With over 9 years of experience spanning enterprise AI implementation and digital technology transformation, he brings a proven track record of translating complex AI capabilities into measurable business outcomes. Brian previously served as a Vice President on an operating group at a global alternative investment firm, leading AI initiatives across portfolio companies.
Brian Yost is a Principal Deep Learning Architect in the AWS Generative AI Innovation Center. He specializes in applying agentic AI capabilities to customer support scenarios, including contact center solutions.
Parth Patwa is a Data Scientist in the Generative AI Innovation Center at Amazon Web Services. He has co-authored research papers at top AI/ML venues and has over 1,500 citations.
Smita Bailur is a Senior Applied Scientist at the AWS Generative AI Innovation Center, where she brings over 10 years of expertise in traditional AI/ML, deep learning, and generative AI to help customers unlock transformative solutions. She holds a master's degree in Electrical Engineering from the University of Pennsylvania.
Shreya Mohanty is a Strategist in the AWS Generative AI Innovation Center, where she specializes in model customization and optimization. Previously, she was a Deep Learning Architect focused on building generative AI solutions for customers. She uses her cross-functional background to translate customer goals into tangible outcomes and measurable impact.
Yingwei Yu is an Applied Science Manager at the Generative AI Innovation Center (GenAIIC) at Amazon Web Services (AWS), based in Houston, Texas. With expertise in applied machine learning and generative AI, Yu leads the development of innovative solutions across various industries. He has multiple patents and peer-reviewed publications at professional conferences. Yingwei earned his PhD in Computer Science from Texas A&M University, College Station.

