
Function calling lets an LLM act as a bridge between natural-language prompts and real-world code or APIs. Instead of simply generating text, the model decides when to invoke a predefined function, emits a structured JSON call containing the function name and arguments, and waits for the application to execute that call and return the result. This exchange can loop, with multiple functions called in sequence, enabling rich multi-step interactions under conversational control. This tutorial shows how to implement a weather assistant with Gemini 2.0 Flash, how to set up and manage the function-calling cycle, and how to implement several variants of function calling. By integrating function calls, you can turn a chat interface into a dynamic tool for real-time tasks such as retrieving live weather data, checking order status, scheduling appointments, and updating databases. Users no longer fill out complex forms or navigate multiple screens; they simply describe what they need, and the LLM orchestrates the underlying actions. This natural-language automation makes it easy to build AI agents that can access external data sources, execute transactions, and trigger workflows.
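To make the round trip concrete, here is a minimal sketch of the message shapes involved. The field names below are illustrative only (they mimic, but are not, the SDK's exact wire format): the model emits a structured call, the application executes it, and the result is packaged back as a function-response turn.

```python
import json

# Hypothetical illustration of one function-calling round trip.
# The model's turn: a structured call instead of free text.
model_turn = {
    "function_call": {
        "name": "get_weather_forecast",
        "args": {"location": "Berlin, Germany", "date": "2025-03-04"},
    }
}

# The application executes the named function; this result is made up.
tool_result = {"2025-03-04T12:00": 7.2}

# The application's turn: the tool result sent back to the model.
app_turn = {
    "function_response": {
        "name": model_turn["function_call"]["name"],
        "response": {"result": tool_result},
    }
}

print(json.dumps(app_turn, indent=2))
```

The model then reads the function response and produces the final natural-language answer, which is exactly the cycle the loop later in this tutorial automates.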

Function Calling in Google Gemini 2.0 Flash

!pip install "google-genai>=1.0.0" geopy requests

Install the Gemini Python SDK (google-genai ≥ 1.0.0), along with geopy for converting location names to coordinates and requests for HTTP calls, covering all the core dependencies of our Colab weather assistant.

import os
from google import genai


GEMINI_API_KEY = "Use_Your_API_Key"


client = genai.Client(api_key=GEMINI_API_KEY)


model_id = "gemini-2.0-flash"

Import the Gemini SDK, set the API key, and create a genai.Client instance configured to use the "gemini-2.0-flash" model, establishing the foundation for all subsequent function-calling requests.

res = client.models.generate_content(
    model=model_id,
    contents=["Tell me 1 good fact about Nuremberg."]
)
print(res.text)

Send a user prompt ("Tell me 1 good fact about Nuremberg.") to the Gemini 2.0 Flash model via generate_content and print the model's text reply, demonstrating a basic end-to-end text generation call with the SDK.

Function call with JSON schema

weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            },
            "date": {
                "type": "string",
                "description": "The forecasting date for when to get the weather, format (yyyy-mm-dd)"
            }
        },
        "required": ["location", "date"]
    }
}

Here we define the JSON schema for the get_weather_forecast tool, specifying its name, a description that tells Gemini when to use it, and the exact input parameters (location and date) with their types, descriptions, and required fields.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
)

Create a GenerateContentConfig that instructs Gemini to act as a weather assistant and registers the weather function under tools. The model therefore knows to emit a structured call when it is asked for forecast data.

response = client.models.generate_content(
    model=model_id,
    contents="Whats the weather in Berlin today?"
)
print(response.text)

This call sends the bare prompt ("Whats the weather in Berlin today?") without any configuration (and therefore no function definitions). So instead of invoking the forecast tool, Gemini replies with generic advice on how to check the weather.

response = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)


for part in response.candidates[0].content.parts:
    print(part.function_call)

By passing in config (which includes the JSON-schema tool), Gemini recognizes that it should call get_weather_forecast rather than reply in plain text. The loop over response.candidates[0].content.parts prints each part's .function_call object, showing exactly which function the model decided to invoke (with its name and arguments).

from google.genai import types
from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")

def get_weather_forecast(location, date):
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}


functions = {
    "get_weather_forecast": get_weather_forecast
}


def call_function(function_name, **kwargs):
    return functions[function_name](**kwargs)


def function_call_loop(prompt):
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    response = client.models.generate_content(
        model=model_id,
        config=config,
        contents=contents
    )
    for part in response.candidates[0].content.parts:
        contents.append(types.Content(role="model", parts=[part]))
        if part.function_call:
            print("Tool call detected")
            function_call = part.function_call
            print(f"Calling tool: {function_call.name} with args: {function_call.args}")
            tool_result = call_function(function_call.name, **function_call.args)
            function_response_part = types.Part.from_function_response(
                name=function_call.name,
                response={"result": tool_result},
            )
            contents.append(types.Content(role="user", parts=[function_response_part]))
            print("Calling LLM with tool results")
            func_gen_response = client.models.generate_content(
                model=model_id, config=config, contents=contents
            )
            # Append the model's final answer to the conversation history.
            contents.append(func_gen_response.candidates[0].content)
    return contents[-1].parts[0].text.strip()


result = function_call_loop("Whats the weather in Berlin today?")
print(result)

This implements a complete "agent" loop: it sends a prompt to Gemini, inspects the response for a function call, runs get_weather_forecast (using geopy and an Open-Meteo HTTP request), feeds the tool's result back to the model, and returns the model's final reply from the conversation.
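The dispatch half of that loop can be exercised offline. The sketch below replaces the Gemini call with a canned function-call object (all names and values here are illustrative stand-ins, not part of the SDK) so the registry-lookup mechanics are visible without an API key.

```python
# Stand-in for get_weather_forecast; returns made-up data, no network.
def fake_weather_tool(location, date):
    return {f"{date}T12:00": 4.5}

# Same shape as the `functions` registry above: tool name -> callable.
registry = {"get_weather_forecast": fake_weather_tool}

def dispatch(function_name, **kwargs):
    # Mirrors call_function: look the tool up by name, forward the args.
    return registry[function_name](**kwargs)

# Pretend the model emitted this structured call:
fake_call = {"name": "get_weather_forecast",
             "args": {"location": "Berlin", "date": "2025-03-04"}}

tool_result = dispatch(fake_call["name"], **fake_call["args"])
print(tool_result)  # {'2025-03-04T12:00': 4.5}
```

Because the model only ever names a tool and supplies arguments, a plain dictionary of callables is all the routing the application side needs.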

Function call using Python functions

from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")


def get_weather_forecast(location: str, date: str) -> dict:
    """
    Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.

    Args:
        location (str): The city and state, e.g., San Francisco, CA
        date (str): The forecasting date for when to get the weather, format (yyyy-mm-dd)
    Returns:
        Dict[str, float]: A dictionary with the time as key and the temperature as value
    """
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}

The get_weather_forecast function first converts the city-and-state string to coordinates using geopy's Nominatim geocoder, then sends an HTTP request to the Open-Meteo API to fetch hourly temperature data for the given date, returning a dictionary that maps each timestamp to the corresponding temperature. It also handles errors gracefully, returning an error message if the location is not found or if the API call fails.
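For reference, the request URL the function builds can be constructed without any network access; the sketch below uses urlencode in place of the f-string, with illustrative coordinates for Berlin standing in for the geocoder's output.

```python
from urllib.parse import urlencode

# Query parameters matching the f-string in get_weather_forecast;
# the latitude/longitude are example values for Berlin.
params = {
    "latitude": 52.52,
    "longitude": 13.405,
    "hourly": "temperature_2m",
    "start_date": "2025-03-04",
    "end_date": "2025-03-04",
}
url = "https://api.open-meteo.com/v1/forecast?" + urlencode(params)
print(url)
```

Identical start_date and end_date restrict the response to a single day of hourly readings, which is why the function can zip the time and temperature_2m arrays into one flat dictionary.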

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that can help with weather related questions. Today is 2025-03-04.", # gives the LLM context on the current date
    tools=[get_weather_forecast],
    automatic_function_calling={"disable": True}
)

This configuration registers the Python get_weather_forecast function itself as a tool. It disables automatic_function_calling while setting a clear system prompt (including the date) for context, so Gemini emits the function-call payload rather than executing the call internally.

r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in r.candidates[0].content.parts:
    print(part.function_call)

This snippet captures Gemini's raw function-call decision by sending the request with the custom configuration (the Python tool registered, automatic calling disabled). It then loops over each response part and prints its .function_call object, revealing which function the model wants to call and with what arguments.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.", # gives the LLM context on the current date
    tools=[get_weather_forecast],
)


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)


print(r.text)

With this configuration (the get_weather_forecast function registered and automatic calling enabled by default), calling generate_content makes Gemini invoke the weather tool behind the scenes before returning a natural-language reply. Printing r.text outputs that final answer, including the actual temperature forecast for Berlin on the specified date.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API.",
    tools=[get_weather_forecast],
)


prompt = f"""
Today is 2025-03-04. You are chatting with Andrew, you have access to additional information about him.

User Context:
- name: Andrew
- location: Nuremberg

User: Can I wear a T-shirt later today?"""


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents=prompt
)


print(r.text)

Here we extend the assistant with personal context, telling Gemini the user's name and location (Andrew in Nuremberg) and letting it use the get_weather_forecast tool under the hood to answer whether a T-shirt will be suitable. We then print the model's natural-language recommendation, grounded in the actual forecast for that day.

In conclusion, we now know how to define functions (via JSON schema or Python signatures) and how to configure Gemini 2.0 Flash to detect and emit function calls. These building blocks let you extend any LLM into a capable tool-enabled assistant that automates workflows, retrieves live data, and interacts with code and APIs as easily as it chats with colleagues.


Here is the Colab Notebook for this tutorial.



Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of MarkTechPost, an artificial intelligence media platform distinguished by its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.
