Channel: Developer Express Inc.

DevExpress Blazor AI Chat — Implement Function Calling


Modern AI-powered applications require seamless interaction with external systems or internal app components. Many AI service providers now support function calling (also known as tool calling), which allows AI models to trigger functions at runtime. This capability is of particular value for agentic workflows/applications wherein AI needs to execute actions such as fetching data, invoking APIs, or initiating tasks within an application (from scheduling appointments and modifying database info to updating the app’s appearance).

The overall flow in this instance looks like this: instead of replying to a user message, the model requests a function invocation with specified arguments. The chat client then invokes the function and supplies the result back to the LLM. At this point, the LLM constructs a response considering the value returned from the function.
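The `UseFunctionInvocation` middleware registered later in this article automates this loop. The sketch below shows roughly what happens under the hood, using `Microsoft.Extensions.AI` types; `chatClient` and `weatherTool` (an `AIFunction`) are assumed to exist, and exact signatures vary between preview versions of the library:

```csharp
// Rough sketch of one tool-calling round trip (normally automated by
// the UseFunctionInvocation middleware shown later in this article).
List<ChatMessage> messages = [new(ChatRole.User, "What's the weather in LA?")];
var options = new ChatOptions { Tools = [weatherTool] };

var response = await chatClient.GetResponseAsync(messages, options);

// Instead of text, the reply may carry FunctionCallContent items:
// the model's request to run a tool with specific arguments.
foreach (var call in response.Messages
             .SelectMany(m => m.Contents)
             .OfType<FunctionCallContent>())
{
    // Invoke the requested function with the model-supplied arguments...
    object? result = await weatherTool.InvokeAsync(new AIFunctionArguments(call.Arguments));
    // ...keep the model's call request in history and supply the result.
    messages.AddRange(response.Messages);
    messages.Add(new(ChatRole.Tool, [new FunctionResultContent(call.CallId, result)]));
}

// The model now composes its final, user-facing answer from the tool result.
var finalResponse = await chatClient.GetResponseAsync(messages, options);
```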

In this guide, we'll explore how to enable function calling in the DevExpress Blazor DxAiChat component using:

  • The IChatClient interface from the Microsoft.Extensions.AI library.
  • Plugins from Microsoft's Semantic Kernel.

Getting Started

To get started, you must first integrate the DxAiChat component into your application (see our official guide for additional information): Add AI Chat to a Project.

Next, register your AI service. In this example, we’ll use Azure OpenAI. Here’s a sample Program.cs setup:



using Azure;  // AzureKeyCredential
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;
...
var builder = WebApplication.CreateBuilder(args);
...
// Replace with your endpoint, API key, and deployed AI model name
string azureOpenAIEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
string azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string deploymentName = string.Empty;
...
var azureChatClient = new AzureOpenAIClient(
    new Uri(azureOpenAIEndpoint),
    new AzureKeyCredential(azureOpenAIKey));

IChatClient chatClient = azureChatClient.AsChatClient(deploymentName);

builder.Services.AddDevExpressBlazor();
builder.Services.AddChatClient(chatClient);
builder.Services.AddDevExpressAI();
...

Run the project to confirm that you can send messages and receive AI responses.
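As a quick sanity check, you can also resolve the registered `IChatClient` and send a message programmatically. This is a sketch, not part of the article's setup; the string overload of `GetResponseAsync` comes from the `Microsoft.Extensions.AI` extension methods:

```csharp
// Hypothetical smoke test: resolve the registered chat client from DI
// and confirm the round trip to the AI service works.
var app = builder.Build();
var client = app.Services.GetRequiredService<IChatClient>();
var reply = await client.GetResponseAsync("Reply with the single word: pong");
Console.WriteLine(reply.Text); // a model-generated acknowledgment
```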

Tool Calling with IChatClient

First, define a simple function that retrieves weather information for a specified city — in this example, GetWeatherTool. To help the AI model understand how to invoke the GetWeather function, apply the System.ComponentModel.Description attribute to both the method and its parameters. The LLM uses these descriptions to select the most suitable method and plan a call sequence:

using System.ComponentModel;
using Microsoft.Extensions.AI;

public class CustomAIFunctions
{
    public static AIFunction GetWeatherTool => AIFunctionFactory.Create(GetWeather); 
    [Description("Gets the current weather in the city")]
    public static string GetWeather([Description("The name of the city")] string city)
    {
        switch (city)
        {
            case "Los Angeles":
            case "LA":
                return GetTemperatureValue(20);
            case "London":
                return GetTemperatureValue(15);
            default:
                return $"The information about the weather in {city} is not available.";
        }
    }
    static string GetTemperatureValue(int value)
    {
        var valueInFahrenheits = value * 9 / 5 + 32;
        return $"{valueInFahrenheits}\u00b0F ({value}\u00b0C)";
    }
}

Modify the chat client registration as shown below to provide a list of available functions and allow the client to invoke them when responding to user questions. Note that the builder call order matters: configure the chat client options (ConfigureOptions) before you add the function-invocation middleware (UseFunctionInvocation):

using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;
...
IChatClient chatClient = new ChatClientBuilder(azureChatClient)
    .ConfigureOptions(opt =>
    {
        opt.Tools = [CustomAIFunctions.GetWeatherTool];
    })
    .UseFunctionInvocation()
    .Build();

builder.Services.AddChatClient(chatClient);

At this point, when a user asks the AI service about the weather, the service automatically invokes the GetWeatherTool function and incorporates the result into its response.
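For example, with the registration above in place, a single call is enough — the function-invocation middleware runs GetWeather behind the scenes and feeds its return value back to the model. A sketch, assuming an injected `IChatClient`:

```csharp
// The middleware intercepts the model's tool-call request, invokes
// GetWeather("London"), and hands the result back to the model automatically.
var response = await chatClient.GetResponseAsync("What is the weather like in London?");
Console.WriteLine(response.Text);
// The answer should mention the value GetWeather returned, e.g. 59°F (15°C).
```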

Integrating Semantic Kernel Plugins

Microsoft Semantic Kernel allows developers to incorporate advanced AI capabilities into their apps (including reasoning, workflow orchestration, and dynamic prompt engineering). Microsoft's framework enhances AI-powered solutions by allowing apps to interact with plugins and manage memory more effectively.

To begin, add the Semantic Kernel NuGet packages to your project (the code below uses Microsoft.SemanticKernel and Microsoft.SemanticKernel.Plugins.Core).

If you are already using the Semantic Kernel in your app and are familiar with the Plugin concept, you can easily connect it to our Blazor DxAiChat control.

Since DevExpress AI-powered APIs operate with LLMs using the IChatClient interface, you need to implement the interface manually and invoke IChatCompletionService methods from the Semantic Kernel:

using System.Runtime.CompilerServices; // EnumeratorCancellation
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
...
public class SemanticKernelPluginCallingChatClient : IChatClient {
    private IChatCompletionService _chatCompletionService;
    private Kernel _kernel;
    private OpenAIPromptExecutionSettings _executionSettings;
    public SemanticKernelPluginCallingChatClient(Kernel kernel)
    {
        _kernel = kernel;
        _chatCompletionService = _kernel.GetRequiredService<IChatCompletionService>();
        _executionSettings = new OpenAIPromptExecutionSettings() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };
    }

    public async Task<ChatResponse> GetResponseAsync(IEnumerable<ChatMessage> chatMessages, ChatOptions? options = null, CancellationToken cancellationToken = default)
    {
        var history = GetChatHistory(chatMessages);
        ChatMessageContent message = await _chatCompletionService.GetChatMessageContentAsync(history, _executionSettings, _kernel, cancellationToken);
        return new ChatResponse(new ChatMessage(ChatRole.Assistant, message.Content));
    }

    public async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(IEnumerable<ChatMessage> chatMessages, ChatOptions? options = null, [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        var history = GetChatHistory(chatMessages);
        await foreach(var item in _chatCompletionService.GetStreamingChatMessageContentsAsync(history, _executionSettings, _kernel, cancellationToken)) {
            yield return new ChatResponseUpdate(ChatRole.Assistant, item.Content);
        }
    }

    AuthorRole GetRole(ChatRole chatRole) {
        if(chatRole == ChatRole.User) return AuthorRole.User;
        if(chatRole == ChatRole.System) return AuthorRole.System;
        if(chatRole == ChatRole.Assistant) return AuthorRole.Assistant;
        if(chatRole == ChatRole.Tool) return AuthorRole.Tool;
        throw new NotSupportedException($"Unsupported chat role: {chatRole}");
    }

    private ChatHistory GetChatHistory(IEnumerable<ChatMessage> chatMessages)
    {
        var history = new ChatHistory(chatMessages.Select(x => new ChatMessageContent(GetRole(x.Role), x.Text)));
        return history;
    }
    // Implement the remaining IChatClient members (GetService, Dispose) as needed.
    ...
}

Implement a Semantic Kernel plugin similar to the previous function, but decorate the main function method with the Microsoft.SemanticKernel.KernelFunction attribute:

using Microsoft.SemanticKernel;
using System.ComponentModel;
...

public class WeatherPlugin {
    [KernelFunction]
    [Description("Gets the current weather in the city")]
    public static string GetWeather([Description("The name of the city")] string city) {
        switch(city) {
            case "Los Angeles":
            case "LA":
                return GetTemperatureValue(20);
            case "London":
                return GetTemperatureValue(15);
            default:
                return $"The information about the weather in {city} is not available.";
        }
    }
    static string GetTemperatureValue(int value)
    {
        var valueInFahrenheits = value * 9 / 5 + 32;
        return $"{valueInFahrenheits}\u00b0F ({value}\u00b0C)";
    }
}

Finally, register the Semantic Kernel and the chat client during app startup:

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.Core;
...
    
var semanticKernelBuilder = Kernel.CreateBuilder();
semanticKernelBuilder.AddAzureOpenAIChatCompletion(
    deploymentName,
    azureOpenAIEndpoint,
    azureOpenAIKey);

// Add plugins from Microsoft.SemanticKernel.Plugins.Core
#pragma warning disable SKEXP0050
semanticKernelBuilder.Plugins.AddFromType<TimePlugin>(); // this is a built-in plugin
semanticKernelBuilder.Plugins.AddFromType<WeatherPlugin>(); // this is our custom plugin
#pragma warning restore SKEXP0050

var globalKernel = semanticKernelBuilder.Build();
builder.Services.AddChatClient(new SemanticKernelPluginCallingChatClient(globalKernel));

builder.Services.AddDevExpressAI();

Once configured, your application will use the Semantic Kernel plugins to handle matching requests — for example, a question about the weather in London triggers WeatherPlugin.GetWeather.
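To verify the plugin wiring without involving the model, you can also invoke the kernel function directly. A sketch; the plugin and function names follow from the AddFromType&lt;WeatherPlugin&gt;() registration above:

```csharp
// Direct invocation bypasses the LLM entirely: a quick check that the
// plugin is registered under the expected name.
var weather = await globalKernel.InvokeAsync<string>(
    "WeatherPlugin", "GetWeather",
    new KernelArguments { ["city"] = "London" });
Console.WriteLine(weather); // 59°F (15°C)
```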

Additional Resources

For an in-depth understanding of Semantic Kernel tool calling, please review the following YouTube video: Core GenAI Techniques - Function Calling.

You can also explore the following DevExpress GitHub sample: DevExpress Blazor AI Chat — Implement Function/Tool Calling.

