Get started with the Ollama integrations

⭐ Community Toolkit

Ollama is a powerful, open-source tool for running large language models locally. The Aspire Ollama integration provides a way to host Ollama, and the models it serves, using the docker.io/ollama/ollama container image and to access those models via the OllamaSharp client.

In this introduction, you'll see how to install and use the Aspire Ollama integrations in a simple configuration. If you already have this knowledge, see Ollama hosting integration for full reference details.

Note

To follow this guide, you must have created an Aspire solution to work with. To learn how to do that, see Build your first Aspire app.

Set up hosting integration

To begin, install the Aspire Ollama Hosting integration in your Aspire AppHost project. This integration allows you to create and manage Ollama model instances from your Aspire hosting projects:

Install the NuGet package
dotnet add package CommunityToolkit.Aspire.Hosting.Ollama

Next, in the AppHost project, register the Ollama integration using the AddOllama extension method to add the Ollama container to the application builder. You can then add models to the container using the AddModel extension method; these models are downloaded and run when the container starts:

var builder = DistributedApplication.CreateBuilder(args);
  
var ollama = builder.AddOllama("ollama");
  
var phi35 = ollama.AddModel("phi3.5");
  
var exampleProject = builder.AddProject<Projects.ExampleProject>()
                            .WithReference(phi35);
  
builder.Build().Run();

Tip

This is the simplest implementation of Ollama resources in the AppHost. There are many more options you can choose from to address your requirements. For full details, see Ollama hosting integration.
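For example, the hosting integration exposes further extension methods you can chain onto the Ollama resource. The sketch below assumes the WithDataVolume and WithOpenWebUI methods from the CommunityToolkit.Aspire.Hosting.Ollama package; check the hosting integration reference for the exact options available in your package version:

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// Persist downloaded models in a named volume so they survive container
// restarts, and attach the Open WebUI frontend for interactive testing.
var ollama = builder.AddOllama("ollama")
                    .WithDataVolume()
                    .WithOpenWebUI();

var phi35 = ollama.AddModel("phi3.5");

builder.Build().Run();
```

Persisting the data volume is particularly useful for large models, which would otherwise be re-downloaded every time the container is recreated.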

Set up client integration

Now that the hosting integration is ready, the next step is to install and configure the client integration in any projects that need to use it.

Install the Aspire OllamaSharp client integration in the client-consuming project:

Install the NuGet package
dotnet add package CommunityToolkit.Aspire.OllamaSharp

In the Program.cs file of your client-consuming project, call the AddOllamaApiClient extension method to register an IOllamaApiClient for use via the dependency injection container. The connection name must match the name of the model resource defined in the AppHost:

builder.AddOllamaApiClient("phi3.5");
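If your AppHost defines more than one model, you can register a keyed client per model instead. This sketch assumes the AddKeyedOllamaApiClient method from the same CommunityToolkit.Aspire.OllamaSharp package and two hypothetical model resource names:

```csharp
// Register one keyed IOllamaApiClient per model resource defined in the AppHost.
builder.AddKeyedOllamaApiClient("phi3.5");
builder.AddKeyedOllamaApiClient("llama3");
```

A consuming service can then select a specific client with the [FromKeyedServices] attribute on its constructor parameter.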

After registering the IOllamaApiClient, you can retrieve the client instance using dependency injection. For example, to inject it into a service:

public class ExampleService(IOllamaApiClient ollama)
{
    // Use ollama...
}
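Once injected, the client can be used to call the model. The following sketch assumes OllamaSharp's streaming GenerateAsync extension method, which yields the model's response as a stream of partial tokens; the GenerateTextAsync method name is illustrative, not part of any API:

```csharp
using System.Text;
using OllamaSharp;

public class ExampleService(IOllamaApiClient ollama)
{
    // Stream a completion from the model and concatenate the partial
    // responses into a single string.
    public async Task<string> GenerateTextAsync(string prompt)
    {
        var response = new StringBuilder();

        await foreach (var token in ollama.GenerateAsync(prompt))
        {
            response.Append(token?.Response);
        }

        return response.ToString();
    }
}
```

Streaming lets you surface tokens to the user as they arrive rather than waiting for the full completion.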

For full reference details, see Ollama hosting integration and Ollama client integration.

See also