Ollama.cs
The Ollama class is designed to interface with an Ollama API endpoint to retrieve model details, including properties like the context window size. It also provides a helper method to determine if a given collection of models is served by an Ollama provider.
- Namespace: LLMHelperFunctions
- Purpose: Facilitates interactions with the Ollama API to fetch model details.
Ollama(Uri endpoint)
Initializes a new instance of the Ollama class.
- Parameters:
  - endpoint: The base URI of the Ollama API endpoint.
Method: Show(string modelName, bool verbose = false)
Retrieves details about a specified model from the Ollama API.
- Parameters:
  - modelName: The name of the model to query.
  - verbose: When set to true, requests more detailed information (default is false).
- Returns: A task that resolves to a dictionary mapping keys to objects representing model details.
- Exceptions:
  - HttpRequestException if the HTTP request fails.
Static Method: IsOllama(OpenAIModelCollection availableModels)
Determines whether the provided collection of OpenAI models indicates that the underlying provider is Ollama.
- Parameters:
  - availableModels: A collection of OpenAI models.
- Returns: true if the provider is detected as Ollama (the first model's OwnedBy property equals "library"); otherwise, false.
- Exceptions:
  - OllamaException if the collection is empty.
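The detection rule above can be sketched in isolation. Note that ModelInfo below is a hypothetical stand-in type for illustration only; the real method takes the OpenAI SDK's OpenAIModelCollection, whose items expose an OwnedBy property:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for the SDK's model type, used only for this sketch.
public record ModelInfo(string Id, string OwnedBy);

public static class OllamaDetection
{
    // Mirrors the documented rule: the provider is treated as Ollama when
    // the first model's OwnedBy property equals "library".
    public static bool IsOllama(IReadOnlyList<ModelInfo> availableModels)
    {
        if (availableModels.Count == 0)
        {
            // The real method throws OllamaException here.
            throw new InvalidOperationException("The model collection is empty.");
        }

        return availableModels[0].OwnedBy == "library";
    }
}
```

This heuristic works because Ollama's OpenAI-compatible endpoint reports "library" as the owner of locally served models, whereas other providers report their own organization names.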
OllamaException
- Namespace: LLMHelperFunctions
- Purpose: Represents errors that occur during interactions with the Ollama API.
OllamaException(string message)
 Initializes a new instance with a specified error message.
OllamaException(string message, Exception innerException)
 Initializes a new instance with a specified error message and a reference to the inner exception.
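The two constructors follow the standard .NET custom-exception pattern. A minimal sketch of what the documented signatures imply (the actual implementation lives in LLMHelperFunctions and may carry additional members):

```csharp
using System;

// Minimal sketch of the documented OllamaException constructors.
public class OllamaException : Exception
{
    // Initializes a new instance with a specified error message.
    public OllamaException(string message)
        : base(message) { }

    // Initializes a new instance with a specified error message and a
    // reference to the inner exception that caused this error.
    public OllamaException(string message, Exception innerException)
        : base(message, innerException) { }
}
```

Preserving the inner exception lets callers wrap low-level failures (such as an HttpRequestException) without losing the original stack trace.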
Example usage:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using LLMHelperFunctions;

public class OllamaExample
{
    public async Task Run()
    {
        // Base URI for your Ollama API endpoint
        Uri endpoint = new Uri("https://your-ollama-api-endpoint.com/");
        string modelName = "your-model-name";
        // Create an Ollama instance
        var ollama = new Ollama(endpoint);
        try
        {
            // Retrieve model details (with verbose output)
            var modelDetails = await ollama.Show(modelName, verbose: true);
            // Process the model details as needed
            Console.WriteLine("Model details received:");
            foreach (var kvp in modelDetails)
            {
                Console.WriteLine($"{kvp.Key}: {kvp.Value}");
            }
        }
        catch (HttpRequestException ex)
        {
            Console.Error.WriteLine($"HTTP error while retrieving model details: {ex.Message}");
        }
        catch (OllamaException ex)
        {
            Console.Error.WriteLine($"Error with Ollama API: {ex.Message}");
        }
    }
}