FunctionInvokingChatClient for Tool calling in Microsoft Agent Framework

My previous article explored how to leverage Dependency Injection (DI) for function calling. However, that approach was limited: it could execute only a single function per request, with no built-in mechanism to handle multiple function calls from a single prompt.
The use case in my previous article was an agent that listed the user's access level after login. But imagine a scenario where, during the conversation, a user's prompt contains multiple requests.
Example:
What level of access do I have & give me the weather information?
Key points to note from the above prompt:
The prompt contains multiple requests, asking for both access details and weather information at the same time.
The prompt contains neither the user details nor the location/city for which the weather details are requested.
Plugins over FunctionInvokingChatClient?
You might argue that we could use plugins instead of FunctionInvokingChatClient by adding multiple functions (in this case GetAccessDetails and GetWeatherDetails) to the method that exposes the AIFunctions as an IEnumerable of AITool for the Tools option of the ChatClientAgent, similar to what is done in my other article.
public IEnumerable<AITool> AsAITools()
{
    yield return AIFunctionFactory.Create(GetAccessDetails);
    yield return AIFunctionFactory.Create(GetWeatherDetails);
}
But check out the prompt that was used in that article:
My name is Sachin and I am from Mumbai.
Now compare the above prompt with the prompt that we are using:
What level of access do I have and give me weather information?
The prompt above lacks context such as the user's identity (who is asking?) and the location for the weather (city, region, etc.). In such cases, plugin-based automatic tool calling may still occur, but it becomes unreliable due to missing or ambiguous inputs. The function might fail or return arbitrary or garbage values.
This is where FunctionInvokingChatClient shines. It acts as an execution orchestrator: it intercepts function calls, validates arguments based on the context, and feeds the function results back to the model. You can pass arguments through, validate them, or override them.
For example, in the following prompt
What level of access do I have and give me weather information?
the "I" is the logged-in user, and the "City" for the requested weather information is the city the user resides in or is associated with.
The technical definition of FunctionInvokingChatClient is as follows:
When the client receives a FunctionCallContent in a chat response from its inner IChatClient, it responds by invoking the corresponding AIFunction defined in Tools (or in AdditionalTools), producing a FunctionResultContent that it sends back to the inner client. This loop is repeated until there are no more function calls to make, or until another stop condition is met, such as hitting MaximumIterationsPerRequest.
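As a hedged illustration of that loop, the iteration cap mentioned above can be set on the client itself. This is a minimal sketch assuming the Microsoft.Extensions.AI package; innerChatClient stands in for whatever IChatClient you have already constructed.

```csharp
using Microsoft.Extensions.AI;

// Sketch: wrap an existing IChatClient (innerChatClient is assumed to exist)
// and cap the tool-invocation loop so a misbehaving model cannot loop forever.
IChatClient client = new FunctionInvokingChatClient(innerChatClient)
{
    // Stop after at most three rounds of function calls per request.
    MaximumIterationsPerRequest = 3
};
```

Once the cap is hit, the client stops invoking functions and returns whatever response the model has produced so far.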
Another key aspect of plugin-based function calling is that the model typically generates the arguments for you after deciphering the user prompt. With FunctionInvokingChatClient, you control the point at which arguments are injected or overridden before the function runs.
So, with FunctionInvokingChatClient, you send the tools to the LLM; the LLM decides which function among the tools needs to be called based on the prompt, and the client orchestrates the entire flow in a way that is not possible through plugins alone.
To demonstrate how FunctionInvokingChatClient works, let's create a function that returns weather information and reuse the access function created in my previous article.
public static Func<string, string> GetWeatherDetails = (city) =>
{
    return $"Weather in {city} is 35°C";
};
The above function is highly simplified. In real-life production scenarios, such data should come from a database or an external API.
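For instance, a production version might call a weather service over HTTP. The sketch below is purely illustrative: the endpoint URL is hypothetical, and a real implementation would add error handling, retries, and a typed response model.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class WeatherApi
{
    private static readonly HttpClient Http = new();

    // Hypothetical endpoint; substitute your real weather provider's API.
    public static async Task<string> GetWeatherFromApiAsync(string city)
    {
        var url = $"https://api.example.com/weather?city={Uri.EscapeDataString(city)}";
        return await Http.GetStringAsync(url);
    }
}
```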
Next, we read the Azure OpenAI details from appsettings, create a DI container, and register a ChatClient and an AIAgent.
var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: false)
    .Build();

ServiceCollection servicecollection = new();

var credential = new AzureKeyCredential(configuration["AppSettings:ApiKey"]);

servicecollection.AddKeyedChatClient("ChatClient", sp => new AzureOpenAIClient(
        new Uri(configuration["AppSettings:EndPoint"]), credential)
    .GetChatClient(configuration["AppSettings:Chat_DeploymentName"])
    .AsIChatClient());

servicecollection.AddSingleton<AIAgent>(sp =>
{
    var options = new ChatClientAgentOptions
    {
        ChatOptions = new ChatOptions
        {
            Instructions = "You're a helpful assistant.",
        },
    };

    return new ChatClientAgent(
        chatClient: sp.GetRequiredKeyedService<IChatClient>("ChatClient"),
        options: options);
});
We have the basic set up in place.
In the next step we register a FunctionInvokingChatClient in the DI container.
servicecollection.AddSingleton<FunctionInvokingChatClient>(sp =>
{
    return new FunctionInvokingChatClient(sp.GetRequiredKeyedService<IChatClient>("ChatClient"));
});
Build the service provider and resolve an instance named fn_invoking_client of type FunctionInvokingChatClient.
var serviceProvider = servicecollection.BuildServiceProvider();
var fn_invoking_client = serviceProvider.GetRequiredService<FunctionInvokingChatClient>();
Recall that the user prompt may or may not mention the city. So the registration should be able to handle both scenarios.
We register the GetAccessDetails and GetWeatherDetails functions. (Note that these registrations must be added to the ServiceCollection before BuildServiceProvider is called.)
GetAccessDetails >>
servicecollection.AddSingleton<AIFunction>(sp =>
{
    Func<AIFunctionArguments, string[]> func = args =>
    {
        // Prefer the argument supplied directly (by the model or a manual invocation).
        string? userlogin = args.TryGetValue("userlogin", out var u) ? u?.ToString() : "Unknown";
        if (userlogin != "Unknown")
        {
            return GetAccessDetails(userlogin!);
        }

        // Fall back to the value carried in AdditionalProperties via CurrentContext.
        var ctx = FunctionInvokingChatClient.CurrentContext;
        return GetAccessDetails(ctx == null ? null : ctx.Options.AdditionalProperties["userlogin"].ToString()!);
    };

    return AIFunctionFactory.Create(func, new AIFunctionFactoryOptions
    {
        Name = "GetAccessDetails",
        Description = "Returns access details for the current user. Requires parameter: userlogin (string)."
    });
});
GetWeatherDetails >>
servicecollection.AddSingleton<AIFunction>(sp =>
{
    Func<AIFunctionArguments, string> func = args =>
    {
        // Prefer the argument supplied directly (by the model or a manual invocation).
        string? city = args.TryGetValue("city", out var c) ? c?.ToString() : "Unknown";
        if (city != "Unknown")
        {
            return GetWeatherDetails(city!);
        }

        // Fall back to the value carried in AdditionalProperties via CurrentContext.
        var ctx = FunctionInvokingChatClient.CurrentContext;
        return GetWeatherDetails(ctx == null ? null : ctx.Options.AdditionalProperties["city"].ToString()!);
    };

    return AIFunctionFactory.Create(func, new AIFunctionFactoryOptions
    {
        Name = "GetWeatherDetails",
        Description = "Returns weather details for a city. Requires parameter: city (string). Use ONLY for weather-related queries."
    });
});
DO NOT combine the two functions GetWeatherDetails and GetAccessDetails into a single registration in the DI container; when combined, the model hallucinates.
Check out the description in the GetWeatherDetails function:
Description = "Returns weather details for a city. Requires parameter: city (string)"
Our logic relies on the city value arriving either through the user prompt (as interpreted by the model) or through the CurrentContext of FunctionInvokingChatClient via AdditionalProperties.
return GetWeatherDetails(ctx.Options.AdditionalProperties["city"].ToString()!);
You might also wonder why there are checks on args in the functions.
This is because the arguments for these two functions will not always arrive through the FunctionInvokingChatClient.CurrentContext path.
For example, one of the functions may be invoked manually, something like this:
var arg = new AIFunctionArguments
{
    ["userlogin"] = loggedinuser
};

var fn = serviceProvider.GetServices<AIFunction>();
var f = fn.FirstOrDefault(a => a.Name == "GetAccessDetails");
Console.WriteLine(await agent.RunAsync($"Access Details: Explain in markdown format {await f!.InvokeAsync(arg)}: ", session));
So, to cover that path, we check the args value first.
AdditionalProperties
Next, how do we set AdditionalProperties for FunctionInvokingChatClient?
We can do that through an AdditionalPropertiesDictionary:
string loggedinuser = "superadmin@azureguru.net";
string userCity = "Mumbai";

var AdditionalProperties_dict = new AdditionalPropertiesDictionary
{
    ["userlogin"] = loggedinuser,
    ["city"] = userCity
};
You might also be wondering how to make the underlying function(s) available to FunctionInvokingChatClient.
Remember that we can resolve the AIFunction instances from the DI container.
var functionlist = serviceProvider.GetServices<AIFunction>();
functionlist holds the list of all the functions registered in the DI container (ServiceCollection).
ChatOptions
We create an instance of ChatOptions .
var chatOptions = new ChatOptions();
We need to set the Tools property of ChatOptions.
For that, we get the list of all the functions from the DI container and initialize a new List<AITool> named tools with all the functions from functionlist.
var functionlist = serviceProvider.GetServices<AIFunction>();
List<AITool> tools = new(functionlist);
All the AIFunction(s) are now available in the tools collection as AITool.
Now that we have the values for AdditionalProperties and Tools, we can assign them to the corresponding ChatOptions properties.
chatOptions.AdditionalProperties = AdditionalProperties_dict;
chatOptions.Tools = tools;
In the next step, we create the required chat history and set the relevant instructions:
List<ChatMessage> chatHistory = [
    new(ChatRole.System, "You're a helpful assistant. You provide users with their access details, weather information, or both.")];
Below is the code snippet after registering the AIFunctions in the DI container:
string loggedinuser = "superadmin@azureguru.net";
string userCity = "Mumbai";

var chatOptions = new ChatOptions();

AdditionalPropertiesDictionary AdditionalProperties_dict = new()
{
    ["userlogin"] = loggedinuser,
    ["city"] = userCity
};

var functionlist = serviceProvider.GetServices<AIFunction>();
List<AITool> tools = new(functionlist);

chatOptions.AdditionalProperties = AdditionalProperties_dict;
chatOptions.Tools = tools;

List<ChatMessage> chatHistory = [
    new(ChatRole.System, "You're a helpful assistant. You provide users with their access details along with weather information.")];
In the final step, we invoke the prompt through fn_invoking_client, the FunctionInvokingChatClient instance we created earlier.
chatHistory.Add(new ChatMessage(ChatRole.User, "What level of access do I have and also give me weather information?"));
ChatResponse response = await fn_invoking_client.GetResponseAsync(chatHistory, chatOptions);
Console.WriteLine(await agent.RunAsync("Explain in markdown: " + response, session));
Go ahead and execute the code.
In the screengrab below, we send three prompts separately:
What level of access do I have ?
What level of access do I have and how is the weather ?
How is the weather ?
Conclusion
Plugins only expose capabilities; FunctionInvokingChatClient governs how those capabilities are used. Instead of you handling intent deciphering, argument parsing, and execution logic yourself, it lets the LLM make the decisions while giving you the flexibility to customise the details before execution.
Thanks for reading !!!



