Microsoft Agent Framework with Background Service and Azure Service Bus

From Synapse Analytics, Power BI, Spark, Microsoft Fabric, ASP.NET Core and, most recently, Agentic AI on .NET, I try to explore, learn and share all aspects of the Microsoft data stack in this blog.

We all know that LLM-backed agents excel at on-demand execution. They perform well when prompted, execute tasks and return results almost instantly. But not all business problems are request-driven.

Some require continuous monitoring of data streams to produce in-depth, actionable intelligence without explicit user input. This is where background services powered by the Microsoft Agent Framework come into play, helping a system transform raw unstructured data into structured data with added intelligence.

Use Case

Let's assume a system where real-time stock prices are fed in through API calls, and the system is designed to generate alerts when price volatility and fluctuation exceed a certain threshold.

In such instances you don't just need to know that prices are volatile; you also need to understand why they moved and the context behind such a sharp movement. Your existing system, though robust, has an unintentional shortcoming: it lacks any actionable intelligence about what drove the event.

This is where an LLM can help: an additional LLM-backed background layer that runs continuously but is actioned ONLY on such eventful, sharp price movements. You can feed the alert alongside additional context into the LLM layer. Instead of just reporting a price spike or volatile movement, which your existing system already does, the LLM can correlate it with recent news and market sentiment and provide informed insights.

Let's take a hypothetical example where Tesla stock has dropped 5% in the last 30 minutes and your system created an alert. But feeding the LLM only this narrow price-alert information will not work.

For example, sending the following prompt to the LLM:

figure out why Tesla stock has gone down by 5% in last 30 minutes 

will not produce any sensible insights.

Remember that LLMs don't create context; they make sense of the context they are given.

You would have to provide other contextual information in the form of recent market news and sector/index movements.
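For instance, the enriched payload that your existing system pushes alongside the alert might look something like the sketch below. All field names here are purely illustrative, not a prescribed schema:

```json
{
  "symbol": "TSLA",
  "priceChangePercent": -5.0,
  "windowMinutes": 30,
  "recentHeadlines": [ "<headline 1>", "<headline 2>" ],
  "sectorMove": "<sector/index movement summary>",
  "marketSentiment": "<sentiment summary>"
}
```

The point is that the alert arrives already bundled with the context the LLM needs to reason about the move.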

The point of this article is to provide the design and the code for an LLM-backed system that runs in the background and processes data only when relevant information is provided to it.

In this example we will use Azure Service Bus as the source, and we will pretend that your existing system pushes all the contextual information the LLM needs into the Service Bus queue.

Setup

Create a new console application and add the following packages:

dotnet add package Azure
dotnet add package Azure.AI.OpenAI
dotnet add package Azure.Messaging.ServiceBus
dotnet add package Microsoft.Agents.AI
dotnet add package Microsoft.Extensions.AI
dotnet add package Microsoft.Extensions.Configuration
dotnet add package Microsoft.Extensions.DependencyInjection
dotnet add package Microsoft.Extensions.Hosting
dotnet add package Microsoft.Extensions.Logging

On Azure, create a Service Bus instance and the underlying Service Bus queue.

I created one called stockanalyzer and a queue called promptqueue.

In this example, the background service will execute in polling mode rather than being event-driven. Given the advantages of event-driven processing, it should be preferred over polling in most production scenarios.

Add appsettings.json to the project:

"AppSettings": { 
    "Chat_DeploymentName": "gpt-4.1",
    "EndPoint": "Azure AI endpoint",
    "ApiKey": "Azure AI API key",
    "AzureQueue": "Azure Queue connection string"
}

Code

Now that we have all the underlying artifacts in place, add the following code to read the settings from appsettings.json:

var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json", optional: false)
    .Build();

Create an instance of a host builder using the .NET Generic Host:

HostApplicationBuilder builder = Host.CreateApplicationBuilder();

Read the credentials and register a keyed chat client:

var credential = new AzureKeyCredential(configuration["AppSettings:ApiKey"]);
builder.Services.AddKeyedChatClient(
    "ChatClient",
    sp => new AzureOpenAIClient(new Uri(configuration["AppSettings:EndPoint"]), credential)
        .GetChatClient(configuration["AppSettings:Chat_DeploymentName"])
        .AsIChatClient());

In the code above I am reading the OpenAI API key from the appsettings.json file. Honestly, I don't like this method of authentication and am using it only for brevity. I prefer creating token credentials; I have written a separate article on that topic, specific to Microsoft Foundry, here. I would rather use that approach than store API keys in appsettings and read them in code the way I have done here.

In the next step, register the AIAgent in the host's DI container:

builder.Services.AddSingleton<AIAgent>(sp =>
{
    var options = new ChatClientAgentOptions
    {
        ChatOptions = new ChatOptions
        {
            Instructions = "You are a helpful stock market analysis assistant"
        }
    };

    return new ChatClientAgent(sp.GetRequiredKeyedService<IChatClient>("ChatClient"), options);
});

Also register a ServiceBusClient:

builder.Services.AddSingleton(sp =>
{
    return new ServiceBusClient(configuration["AppSettings:AzureQueue"], new ServiceBusClientOptions
    {
        TransportType = ServiceBusTransportType.AmqpTcp
    });
});

Now we need a Worker class that implements the IHostedService interface and runs continuously:

internal sealed class Worker(AIAgent agent, ServiceBusClient serviceBusClient, IHostApplicationLifetime appLifetime, ILogger<Worker> logger) : IHostedService
{
    private AgentSession? session;
    private Task? backgroundTask;

    public async Task StartAsync(CancellationToken cancellationToken)
    {
        session = await agent.CreateSessionAsync(cancellationToken);
        backgroundTask = RunAsync(appLifetime.ApplicationStopping);
    }

    private async Task RunAsync(CancellationToken cancellationToken)
    {
        var receiver = serviceBusClient.CreateReceiver("promptqueue");

        try
        {
            // Small startup delay before entering the polling loop.
            await Task.Delay(1000, cancellationToken);

            while (!cancellationToken.IsCancellationRequested)
            {
                ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync(cancellationToken: cancellationToken);

                if (message == null)
                {
                    continue;
                }

                Console.WriteLine(new string('-', 120));
                Console.WriteLine(await agent.RunAsync(message.Body.ToString(), session) + "\n");
                Console.WriteLine(new string('-', 120));

                await receiver.CompleteMessageAsync(message, cancellationToken);
            }
        }
        catch (OperationCanceledException)
        {
            // Expected when ApplicationStopping is signalled during shutdown.
        }
    }

    public async Task StopAsync(CancellationToken cancellationToken)
    {
        if (backgroundTask != null)
        {
            await backgroundTask;
        }
    }
}
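One last wiring step the host still needs: register the Worker as a hosted service, then build and run the host. A minimal sketch using the standard .NET Generic Host APIs, assuming the builder from earlier:

```csharp
// Register the Worker so the Generic Host starts and stops it
// automatically as part of the application lifetime.
builder.Services.AddHostedService<Worker>();

// Build and run the host; this blocks until shutdown is requested.
using IHost host = builder.Build();
await host.RunAsync();
```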

The code above injects dependencies such as AIAgent, ServiceBusClient and IHostApplicationLifetime through constructor injection, implements the StartAsync and StopAsync methods required by the IHostedService interface, and adds a RunAsync method that hosts the processing loop.

In the StartAsync method we create the agent session and kick off the background task, passing it the ApplicationStopping cancellation token so it stops when the host shuts down.

In the RunAsync method we create a receiver for the Service Bus queue and continuously feed input from the queue into the agent.

The StopAsync method ensures a graceful shutdown by awaiting the background task so that any in-flight processing completes.

That's all !!! Go ahead, execute the application and test it by feeding input data through the Service Bus.
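To feed a test message into the queue, you can use a ServiceBusSender from the same client. The queue name matches the earlier setup, and the prompt text below is just a placeholder standing in for the enriched context your existing system would supply:

```csharp
// Hypothetical test harness: push one enriched alert into the queue.
await using var client = new ServiceBusClient(configuration["AppSettings:AzureQueue"]);
ServiceBusSender sender = client.CreateSender("promptqueue");

string prompt = "Tesla stock dropped 5% in the last 30 minutes. " +
                "Recent headlines: <headlines>. Sector movement: <data>. " +
                "Explain the likely drivers of this move.";

await sender.SendMessageAsync(new ServiceBusMessage(prompt));
```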

Conclusion

In this blog we saw how simple and seamless it is to integrate a hosted background service that processes data from Azure Service Bus and feeds it into the Microsoft Agent Framework to generate intelligent insights.

We also saw how this architecture enables continuous processing for real-time decision-making through the power of LLMs.

Thanks for reading !!!