OllamaSharp

The easiest way to use the Ollama API in .NET

OllamaSharp is a .NET binding for the Ollama API, providing an intuitive API client to interact with Ollama. It offers support for all Ollama API endpoints, real-time streaming, progress reporting, and an API console for remote management. Users can easily set up the client, list models, pull models with progress feedback, stream completions, and build interactive chats. The project includes a demo console for exploring and managing the Ollama host.

README:

OllamaSharp 🦙

OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.

Features

  • Ease of use: Interact with Ollama in just a few lines of code.
  • API endpoint coverage: Support for all the Ollama API endpoints, including chats, embeddings, listing models, pulling and creating new models, and more.
  • Real-time streaming: Stream responses directly to your application.
  • Progress reporting: Get real-time progress feedback on tasks like model pulling.
  • Support for vision models and tools (function calling).

Usage

OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.

The following sections show a few simple code examples.

Try our full-featured Ollama API client app OllamaSharpConsole to interact with your Ollama instance.

Initializing

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";

Listing all models that are available locally

var models = await ollama.ListLocalModels();
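
Each entry in the returned list describes one local model. The following sketch prints basic metadata; it assumes the returned model type exposes Name and Size properties, as in recent OllamaSharp versions:

// print the name and size of each locally available model
// (Name and Size are assumed properties of the returned model type)
foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size} bytes)");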

Pulling a model and reporting progress

await foreach (var status in ollama.PullModel("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");

Generating a completion directly into the console

await foreach (var stream in ollama.Generate("How are you today?"))
    Console.Write(stream.Response);
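
Because the answer arrives as a stream of chunks, collecting the complete text is just a matter of concatenating them. A minimal sketch using a StringBuilder:

// accumulate the streamed chunks into the full response text
var builder = new System.Text.StringBuilder();
await foreach (var stream in ollama.Generate("How are you today?"))
    builder.Append(stream.Response);
var completeResponse = builder.ToString();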

Building interactive chats

var chat = new Chat(ollama);
while (true)
{
    var message = Console.ReadLine();
    if (string.IsNullOrEmpty(message))
        break;
    await foreach (var answerToken in chat.Send(message))
        Console.Write(answerToken);
}
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
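
Since the chat object tracks the whole conversation, the history can be inspected afterwards. A minimal sketch, assuming each tracked message exposes Role and Content properties:

// dump the tracked conversation history
foreach (var msg in chat.Messages)
    Console.WriteLine($"{msg.Role}: {msg.Content}");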

Credits

The icon and name were reused from the amazing Ollama project.

I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost, mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤
