Ollama4j
A Java library (wrapper/binding) for the Ollama server.
io.github.ollama4j.OllamaAPI Class Reference

Public Member Functions

 OllamaAPI ()
 
 OllamaAPI (String host)
 
void setBasicAuth (String username, String password)
 
void setBearerAuth (String bearerToken)
 
boolean ping ()
 
ModelsProcessResponse ps () throws IOException, InterruptedException, OllamaBaseException
 
List< Model > listModels () throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
List< LibraryModel > listModelsFromLibrary () throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
LibraryModelDetail getLibraryModelDetails (LibraryModel libraryModel) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
LibraryModelTag findModelTagFromLibrary (String modelName, String tag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
 
void pullModel (String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
 
String getVersion () throws URISyntaxException, IOException, InterruptedException, OllamaBaseException
 
void pullModel (LibraryModelTag libraryModelTag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
 
ModelDetail getModelDetails (String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
 
void createModelWithFilePath (String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
void createModelWithModelFileContents (String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
void createModel (CustomModelRequest customModelRequest) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
void deleteModel (String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
List< Double > generateEmbeddings (String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
 
List< Double > generateEmbeddings (OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
 
OllamaEmbedResponseModel embed (String model, List< String > inputs) throws IOException, InterruptedException, OllamaBaseException
 
OllamaEmbedResponseModel embed (OllamaEmbedRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
 
OllamaResult generate (String model, String prompt, boolean raw, Options options, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generate (String model, String prompt, boolean raw, Options options, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generate (String model, String prompt, boolean raw, boolean think, Options options) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generate (String model, String prompt, Map< String, Object > format) throws OllamaBaseException, IOException, InterruptedException
 
OllamaToolsResult generateWithTools (String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
OllamaAsyncResultStreamer generateAsync (String model, String prompt, boolean raw, boolean think)
 
OllamaResult generateWithImageFiles (String model, String prompt, List< File > imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generateWithImageFiles (String model, String prompt, List< File > imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generateWithImageURLs (String model, String prompt, List< String > imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
OllamaResult generateWithImageURLs (String model, String prompt, List< String > imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
OllamaResult generateWithImages (String model, String prompt, List< byte[]> images, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generateWithImages (String model, String prompt, List< byte[]> images, Options options) throws OllamaBaseException, IOException, InterruptedException
 
OllamaChatResult chat (String model, List< OllamaChatMessage > messages) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
OllamaChatResult chat (OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
OllamaChatResult chat (OllamaChatRequest request, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
OllamaChatResult chatStreaming (OllamaChatRequest request, OllamaTokenHandler tokenHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
void registerTool (Tools.ToolSpecification toolSpecification)
 
void registerTools (List< Tools.ToolSpecification > toolSpecifications)
 
void deregisterTools ()
 
void registerAnnotatedTools ()
 
void registerAnnotatedTools (Object object)
 
OllamaChatMessageRole addCustomRole (String roleName)
 
List< OllamaChatMessageRole > listRoles ()
 
OllamaChatMessageRole getRole (String roleName) throws RoleNotFoundException
 

Detailed Description

The base Ollama API class.

Definition at line 56 of file OllamaAPI.java.

Constructor & Destructor Documentation

◆ OllamaAPI() [1/2]

io.github.ollama4j.OllamaAPI.OllamaAPI ( )

Instantiates the Ollama API with default Ollama host: http://localhost:11434

Definition at line 111 of file OllamaAPI.java.

◆ OllamaAPI() [2/2]

io.github.ollama4j.OllamaAPI.OllamaAPI ( String host)

Instantiates the Ollama API with specified Ollama host address.

Parameters
host: the host address of the Ollama server

Definition at line 120 of file OllamaAPI.java.
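
A minimal setup sketch (the remote host URL and credentials are placeholders; `ping()` and the auth setters are documented on this page):

```java
import io.github.ollama4j.OllamaAPI;

public class OllamaSetupExample {
    public static void main(String[] args) {
        // Connects to the default host, http://localhost:11434
        OllamaAPI ollamaAPI = new OllamaAPI();

        // Or target a remote Ollama server (placeholder URL) behind basic auth
        OllamaAPI remoteAPI = new OllamaAPI("http://ollama.example.com:11434");
        remoteAPI.setBasicAuth("username", "password");

        // Verify that the server is reachable
        System.out.println("Server reachable: " + ollamaAPI.ping());
    }
}
```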

Member Function Documentation

◆ addCustomRole()

OllamaChatMessageRole io.github.ollama4j.OllamaAPI.addCustomRole ( String roleName)

Adds a custom role.

Parameters
roleName: the name of the custom role to be added
Returns
the newly created OllamaChatMessageRole

Definition at line 1553 of file OllamaAPI.java.

◆ chat() [1/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
request: the request object to be sent to the server
Returns
OllamaChatResult
Exceptions
OllamaBaseException: if the response indicates an error status (any response code other than 200)
IOException: if an I/O error occurs during the HTTP request or the response stream cannot be read
InterruptedException: if the operation is interrupted or the server is unreachable
ToolInvocationException: if the tool invocation fails

Definition at line 1309 of file OllamaAPI.java.
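
As a sketch, a request might be built and sent like this (the builder and accessor names follow the library's published examples but are assumptions here; the model name is a placeholder):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;

public class ChatExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // Build a chat request for a placeholder model
        OllamaChatRequest request = OllamaChatRequestBuilder.getInstance("llama3")
                .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                .build();
        OllamaChatResult result = ollamaAPI.chat(request);
        // Accessor chain assumed from the library's examples
        System.out.println(result.getResponseModel().getMessage().getContent());
    }
}
```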

◆ chat() [2/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( OllamaChatRequest request,
OllamaStreamHandler thinkingStreamHandler,
OllamaStreamHandler responseStreamHandler ) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
request: the request object to be sent to the server
thinkingStreamHandler: callback handler to handle the last thinking message from the stream
responseStreamHandler: callback handler to handle the last message from the stream
Returns
OllamaChatResult
Exceptions
OllamaBaseException: if the response indicates an error status (any response code other than 200)
IOException: if an I/O error occurs during the HTTP request or the response stream cannot be read
InterruptedException: if the operation is interrupted or the server is unreachable
ToolInvocationException: if the tool invocation fails

Definition at line 1337 of file OllamaAPI.java.

◆ chat() [3/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( String model,
List< OllamaChatMessage > messages ) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Ask a question to a model based on a given message stack (i.e. a chat history). Makes a synchronous call to the '/api/chat' endpoint.

Parameters
model: the Ollama model to ask the question to
messages: chat history / message stack to send to the model
Returns
OllamaChatResult containing the API response and the message history, including the newly acquired assistant response
Exceptions
OllamaBaseException: if the response indicates an error status (any response code other than 200)
IOException: if an I/O error occurs during the HTTP request or the response stream cannot be read
InterruptedException: if the operation is interrupted or the server is unreachable
ToolInvocationException: if the tool invocation fails

Definition at line 1284 of file OllamaAPI.java.

◆ chatStreaming()

OllamaChatResult io.github.ollama4j.OllamaAPI.chatStreaming ( OllamaChatRequest request,
OllamaTokenHandler tokenHandler ) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
request: the request object to be sent to the server
tokenHandler: callback handler to handle the last token from the stream (caution: the previous tokens from the stream will not be concatenated)
Returns
OllamaChatResult
Exceptions
OllamaBaseException: if the response indicates an error status (any response code other than 200)
IOException: if an I/O error occurs during the HTTP request or the response stream cannot be read
InterruptedException: if the operation is interrupted or the server is unreachable
ToolInvocationException: if the tool invocation fails

Definition at line 1362 of file OllamaAPI.java.
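
A sketch of streaming chat, assuming OllamaTokenHandler is a functional interface whose single method receives each streamed token object (builder names as in the library's examples; model name is a placeholder):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;
import io.github.ollama4j.models.chat.OllamaChatRequest;
import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
import io.github.ollama4j.models.chat.OllamaChatResult;

public class ChatStreamingExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        OllamaChatRequest request = OllamaChatRequestBuilder.getInstance("llama3")
                .withMessage(OllamaChatMessageRole.USER, "Tell me a short story.")
                .build();
        // Print each token as it arrives; tokens are NOT concatenated for you
        OllamaChatResult result = ollamaAPI.chatStreaming(request,
                token -> System.out.print(token));
    }
}
```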

◆ createModel()

void io.github.ollama4j.OllamaAPI.createModel ( CustomModelRequest customModelRequest) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model. Read more about custom model creation in the Ollama documentation.

Parameters
customModelRequest: the custom model spec
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 682 of file OllamaAPI.java.
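
A sketch of creating a derived model; the builder and its field names (model, from, system) are assumptions based on the request type's name, not confirmed API, so check CustomModelRequest for the real fields:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.request.CustomModelRequest;

public class CreateModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // Hypothetical builder fields for a model derived from a base model
        CustomModelRequest request = CustomModelRequest.builder()
                .model("mario")                                  // name of the new model
                .from("llama3")                                  // base model to derive from
                .system("You are Mario from Super Mario Bros.")  // system prompt
                .build();
        ollamaAPI.createModel(request);
    }
}
```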

◆ createModelWithFilePath()

void io.github.ollama4j.OllamaAPI.createModelWithFilePath ( String modelName,
String modelFilePath ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. Read more about custom model file creation in the Ollama documentation.

Parameters
modelName: the name of the custom model to be created
modelFilePath: the path to the model file that exists on the Ollama server
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 607 of file OllamaAPI.java.

◆ createModelWithModelFileContents()

void io.github.ollama4j.OllamaAPI.createModelWithModelFileContents ( String modelName,
String modelFileContents ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. Read more about custom model file creation in the Ollama documentation.

Parameters
modelName: the name of the custom model to be created
modelFileContents: the contents of the model file
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 648 of file OllamaAPI.java.

◆ deleteModel()

void io.github.ollama4j.OllamaAPI.deleteModel ( String modelName,
boolean ignoreIfNotPresent ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Delete a model from Ollama server.

Parameters
modelName: the name of the model to be deleted
ignoreIfNotPresent: ignore errors if the specified model is not present on the Ollama server
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 716 of file OllamaAPI.java.
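
A short usage sketch (the model name is a placeholder):

```java
import io.github.ollama4j.OllamaAPI;

public class DeleteModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // Second argument true: don't fail if "mario" was never created
        ollamaAPI.deleteModel("mario", true);
    }
}
```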

◆ deregisterTools()

void io.github.ollama4j.OllamaAPI.deregisterTools ( )

Deregisters all tools from the tool registry. This method removes all registered tools, effectively clearing the registry.

Definition at line 1445 of file OllamaAPI.java.

◆ embed() [1/2]

OllamaEmbedResponseModel io.github.ollama4j.OllamaAPI.embed ( OllamaEmbedRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException

Generates embeddings using an OllamaEmbedRequestModel.

Parameters
modelRequest: request for the '/api/embed' endpoint
Returns
embeddings
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 810 of file OllamaAPI.java.

◆ embed() [2/2]

OllamaEmbedResponseModel io.github.ollama4j.OllamaAPI.embed ( String model,
List< String > inputs ) throws IOException, InterruptedException, OllamaBaseException

Generates embeddings for the given text(s) using the specified model.

Parameters
model: name of the model to generate embeddings from
inputs: the text(s) to generate embeddings for
Returns
embeddings
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 796 of file OllamaAPI.java.
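
A usage sketch (the model name is a placeholder; the import path and the `getEmbeddings()` accessor are assumptions):

```java
import java.util.Arrays;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.embeddings.OllamaEmbedResponseModel;

public class EmbedExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // Embed two inputs in one call
        OllamaEmbedResponseModel response = ollamaAPI.embed("all-minilm",
                Arrays.asList("Why is the sky blue?", "Why is grass green?"));
        // One embedding vector per input is expected
        System.out.println(response.getEmbeddings().size());
    }
}
```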

◆ findModelTagFromLibrary()

LibraryModelTag io.github.ollama4j.OllamaAPI.findModelTagFromLibrary ( String modelName,
String tag ) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException

Finds a specific model using model name and tag from Ollama library.

This method retrieves the model from the Ollama library by its name, then fetches its tags. It searches through the tags of the model to find one that matches the specified tag name. If the model or the tag is not found, it throws a NoSuchElementException.

Parameters
modelName: the name of the model to search for in the library
tag: the tag name to search for within the specified model
Returns
the LibraryModelTag associated with the specified model and tag
Exceptions
OllamaBaseException: if there is a problem with the Ollama library operations
IOException: if an I/O error occurs during the operation
URISyntaxException: if there is an error with the URI syntax
InterruptedException: if the operation is interrupted
NoSuchElementException: if the model or the tag is not found
Deprecated
This method relies on the HTML structure of the Ollama website, which can change at any time and break this API. It is deprecated and may be removed in the future.

Definition at line 404 of file OllamaAPI.java.

◆ generate() [1/4]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
boolean raw,
boolean think,
Options options ) throws OllamaBaseException, IOException, InterruptedException

Generates a response using the specified AI model and prompt (in blocking mode).

Uses generate(String, String, boolean, Options, OllamaStreamHandler)

Parameters
model: the name or identifier of the AI model to use for generating the response
prompt: the input text or prompt to provide to the AI model
raw: set to true to bypass the templating system and provide a full prompt yourself; note that raw mode will not return a context
think: if true, the model will "think" step-by-step before generating the final response
options: additional options or configurations to use when generating the response
Returns
OllamaResult
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 926 of file OllamaAPI.java.

◆ generate() [2/4]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
boolean raw,
Options options,
OllamaStreamHandler responseStreamHandler ) throws OllamaBaseException, IOException, InterruptedException

Generates a response for a question to a model running on the Ollama server. This is a sync/blocking call. This API does not support "thinking" models.

Parameters
model: the Ollama model to ask the question to
prompt: the prompt/question text
raw: if true, no formatting will be applied to the prompt; you may use raw when specifying a fully templated prompt in your request to the API
options: the Options object; see the Ollama documentation for available options
responseStreamHandler: optional callback consumer invoked every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes the response text and the time taken for the response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 857 of file OllamaAPI.java.
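
A streaming usage sketch (the model name is a placeholder; the OptionsBuilder usage and import paths are assumptions based on the library's examples, and the handler is assumed to receive each partial response as a String):

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

public class GenerateStreamingExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // The stream handler is called with each partial response as it arrives
        OllamaResult result = ollamaAPI.generate("llama3", "Why is the sky blue?",
                false, new OptionsBuilder().build(),
                partialResponse -> System.out.print(partialResponse));
    }
}
```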

◆ generate() [3/4]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
boolean raw,
Options options,
OllamaStreamHandler thinkingStreamHandler,
OllamaStreamHandler responseStreamHandler ) throws OllamaBaseException, IOException, InterruptedException

Generates thinking and response tokens for a question to a thinking model running on the Ollama server. This is a sync/blocking call.

Parameters
model: the Ollama model to ask the question to
prompt: the prompt/question text
raw: if true, no formatting will be applied to the prompt; you may use raw when specifying a fully templated prompt in your request to the API
options: the Options object; see the Ollama documentation for available options
thinkingStreamHandler: optional callback consumer invoked every time a streamed thinking token is received
responseStreamHandler: optional callback consumer invoked every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes the response text and the time taken for the response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 893 of file OllamaAPI.java.

◆ generate() [4/4]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
Map< String, Object > format ) throws OllamaBaseException, IOException, InterruptedException

Generates structured output from the specified AI model and prompt.

Note: When formatting is specified, the 'think' parameter is not allowed.

Parameters
model: the name or identifier of the AI model to use for generating the response
prompt: the input text or prompt to provide to the AI model
format: a map containing the format specification for the structured output
Returns
an instance of OllamaResult containing the structured response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 952 of file OllamaAPI.java.
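
A structured-output sketch using a JSON-schema-style format map (the model name is a placeholder; the import paths and the `getResponse()` accessor are assumptions):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;

public class StructuredOutputExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // JSON-schema-style format map constraining the model's output
        Map<String, Object> format = new HashMap<>();
        format.put("type", "object");
        format.put("properties", Map.of("capital", Map.of("type", "string")));
        format.put("required", List.of("capital"));

        OllamaResult result = ollamaAPI.generate("llama3",
                "What is the capital of France?", format);
        System.out.println(result.getResponse());
    }
}
```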

◆ generateAsync()

OllamaAsyncResultStreamer io.github.ollama4j.OllamaAPI.generateAsync ( String model,
String prompt,
boolean raw,
boolean think )

Asynchronously generates a response for a prompt using a model running on the Ollama server.

This method returns an OllamaAsyncResultStreamer handle that can be used to poll for status and retrieve streamed "thinking" and response tokens from the model. The call is non-blocking.

Example usage:


    OllamaAsyncResultStreamer resultStreamer =
            ollamaAPI.generateAsync("gpt-oss:20b", "Who are you", false, true);
    int pollIntervalMilliseconds = 1000;
    while (true) {
        String thinkingTokens = resultStreamer.getThinkingResponseStream().poll();
        String responseTokens = resultStreamer.getResponseStream().poll();
        System.out.print(thinkingTokens != null ? thinkingTokens.toUpperCase() : "");
        System.out.print(responseTokens != null ? responseTokens.toLowerCase() : "");
        Thread.sleep(pollIntervalMilliseconds); // declare or handle InterruptedException
        if (!resultStreamer.isAlive()) {
            break;
        }
    }
    System.out.println("Complete thinking response: " + resultStreamer.getCompleteThinkingResponse());
    System.out.println("Complete response: " + resultStreamer.getCompleteResponse());
Parameters
model: the Ollama model to use for generating the response
prompt: the prompt or question text to send to the model
raw: if true, returns the raw response from the model
think: if true, streams "thinking" tokens as well as response tokens
Returns
an OllamaAsyncResultStreamer handle for polling and retrieving streamed results

Definition at line 1111 of file OllamaAPI.java.

◆ generateEmbeddings() [1/2]

List< Double > io.github.ollama4j.OllamaAPI.generateEmbeddings ( OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException

Generates embeddings using an OllamaEmbeddingsRequestModel.

Parameters
modelRequest: request for the '/api/embeddings' endpoint
Returns
embeddings
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
Deprecated
Use embed(OllamaEmbedRequestModel) instead.

Definition at line 765 of file OllamaAPI.java.

◆ generateEmbeddings() [2/2]

List< Double > io.github.ollama4j.OllamaAPI.generateEmbeddings ( String model,
String prompt ) throws IOException, InterruptedException, OllamaBaseException

Generates embeddings for the given text using the specified model.

Parameters
model: name of the model to generate embeddings from
prompt: text to generate embeddings for
Returns
embeddings
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
Deprecated
Use embed(String, List) instead.

Definition at line 749 of file OllamaAPI.java.

◆ generateWithImageFiles() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles ( String model,
String prompt,
List< File > imageFiles,
Options options ) throws OllamaBaseException, IOException, InterruptedException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageFiles(String, String, List, Options, OllamaStreamHandler)

Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 1163 of file OllamaAPI.java.

◆ generateWithImageFiles() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles ( String model,
String prompt,
List< File > imageFiles,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException

With one or more image files, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
model: the Ollama model to ask the question to
prompt: the prompt/question text
imageFiles: the list of image files to use for the question
options: the Options object; see the Ollama documentation for available options
streamHandler: optional callback consumer invoked every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes the response text and the time taken for the response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 1142 of file OllamaAPI.java.
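
A usage sketch; "llava" stands in for any vision-capable model, the file path is a placeholder, and the import paths, OptionsBuilder usage, and `getResponse()` accessor are assumptions:

```java
import java.io.File;
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.OllamaResult;
import io.github.ollama4j.utils.OptionsBuilder;

public class ImageQuestionExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI();
        // Ask a vision model about a local image file
        OllamaResult result = ollamaAPI.generateWithImageFiles("llava",
                "What is in this image?",
                List.of(new File("/path/to/image.jpg")),
                new OptionsBuilder().build());
        System.out.println(result.getResponse());
    }
}
```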

◆ generateWithImages() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImages ( String model,
String prompt,
List< byte[]> images,
Options options ) throws OllamaBaseException, IOException, InterruptedException

Convenience method to call the Ollama API using image byte arrays without streaming responses.

Uses generateWithImages(String, String, List, Options, OllamaStreamHandler)

Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 1259 of file OllamaAPI.java.

◆ generateWithImages() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImages ( String model,
String prompt,
List< byte[]> images,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException

Synchronously generates a response using a list of image byte arrays.

This method encodes the provided byte arrays into Base64 and sends them to the Ollama server.

Parameters
model: the Ollama model to use for generating the response
prompt: the prompt or question text to send to the model
images: the list of image data as byte arrays
options: the Options object; see the Ollama documentation for available options
streamHandler: optional callback invoked with each streamed response; if null, streaming is disabled
Returns
OllamaResult containing the response text and the time taken for the response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 1237 of file OllamaAPI.java.

◆ generateWithImageURLs() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs ( String model,
String prompt,
List< String > imageURLs,
Options options ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageURLs(String, String, List, Options, OllamaStreamHandler)

Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 1212 of file OllamaAPI.java.

◆ generateWithImageURLs() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs ( String model,
String prompt,
List< String > imageURLs,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

With one or more image URLs, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
model: the Ollama model to ask the question to
prompt: the prompt/question text
imageURLs: the list of image URLs to use for the question
options: the Options object; see the Ollama documentation for available options
streamHandler: optional callback consumer invoked every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes the response text and the time taken for the response
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 1189 of file OllamaAPI.java.

◆ generateWithTools()

OllamaToolsResult io.github.ollama4j.OllamaAPI.generateWithTools ( String model,
String prompt,
Options options ) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Generates a response using the specified AI model and prompt (in blocking mode), and then invokes a set of registered tools on the generated response.

Parameters
model: the name or identifier of the AI model to use for generating the response
prompt: the input text or prompt to provide to the AI model
options: additional options or configurations to use when generating the response
Returns
an OllamaToolsResult object containing the response from the AI model and the results of invoking the tools on that output
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted

Definition at line 1029 of file OllamaAPI.java.

◆ getLibraryModelDetails()

LibraryModelDetail io.github.ollama4j.OllamaAPI.getLibraryModelDetails ( LibraryModel libraryModel) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Fetches the tags associated with a specific model from the Ollama library. This method fetches the available model tags directly from the Ollama library model page, including the tag name, size, and the time when the model was last updated, into a list of LibraryModelTag objects.

Parameters
libraryModel: the LibraryModel object which contains the name of the library model for which the tags need to be fetched
Returns
a list of LibraryModelTag objects containing the extracted tags and their associated metadata
Exceptions
OllamaBaseException: if the HTTP response status code indicates an error (i.e., not 200 OK), or if there is any other issue during the request or response processing
IOException: if an input/output exception occurs during the HTTP request or response handling
InterruptedException: if the thread is interrupted while waiting for the HTTP response
URISyntaxException: if the URI format is incorrect or invalid

Definition at line 326 of file OllamaAPI.java.

◆ getModelDetails()

ModelDetail io.github.ollama4j.OllamaAPI.getModelDetails ( String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException

Gets model details from the Ollama server.

Parameters
modelName: the name of the model
Returns
the model details
Exceptions
OllamaBaseException: if the response indicates an error status
IOException: if an I/O error occurs during the HTTP request
InterruptedException: if the operation is interrupted
URISyntaxException: if the URI for the request is malformed

Definition at line 574 of file OllamaAPI.java.

◆ getRole()

OllamaChatMessageRole io.github.ollama4j.OllamaAPI.getRole ( String roleName) throws RoleNotFoundException

Retrieves a specific role by name.

Parameters
roleName the name of the role to retrieve
Returns
the OllamaChatMessageRole associated with the given name
Exceptions
RoleNotFoundException if the role with the specified name does not exist

Definition at line 1574 of file OllamaAPI.java.
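A short sketch of looking up a role by name; the built-in role names ("system", "user", "assistant") are assumptions based on standard Ollama chat roles, and the import paths may vary by ollama4j version:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.exceptions.RoleNotFoundException;
import io.github.ollama4j.models.chat.OllamaChatMessageRole;

public class RoleLookupExample {
    public static void main(String[] args) {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        try {
            // Resolve a role by name; unknown names raise RoleNotFoundException.
            OllamaChatMessageRole role = api.getRole("user");
            System.out.println(role);
        } catch (RoleNotFoundException e) {
            System.err.println("No such role: " + e.getMessage());
        }
    }
}
```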

◆ getVersion()

String io.github.ollama4j.OllamaAPI.getVersion ( ) throws URISyntaxException, IOException, InterruptedException, OllamaBaseException

Retrieves the version of the Ollama server.

Definition at line 527 of file OllamaAPI.java.

◆ listModels()

List< Model > io.github.ollama4j.OllamaAPI.listModels ( ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Lists available models from the Ollama server.

Returns
a list of models available on the server
Exceptions
OllamaBaseException if the response indicates an error status
IOException if an I/O error occurs during the HTTP request
InterruptedException if the operation is interrupted
URISyntaxException if the URI for the request is malformed

Definition at line 223 of file OllamaAPI.java.
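A minimal sketch of listing the models installed on the server; the host URL is a placeholder and the Model import path is an assumption that may differ across ollama4j versions:

```java
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.Model;

public class ListModelsExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // One entry per model available on the Ollama server.
        List<Model> models = api.listModels();
        for (Model m : models) {
            System.out.println(m);
        }
    }
}
```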

◆ listModelsFromLibrary()

List< LibraryModel > io.github.ollama4j.OllamaAPI.listModelsFromLibrary ( ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Retrieves a list of models from the Ollama library. This method fetches the available models directly from the Ollama library page, including details such as the name, pull count, popular tags, tag count, and the time when each model was last updated.

Returns
A list of LibraryModel objects representing the models available in the Ollama library.
Exceptions
OllamaBaseException if the HTTP request fails or the response is not successful (non-200 status code).
IOException if an I/O error occurs during the HTTP request or response processing.
InterruptedException if the thread executing the request is interrupted.
URISyntaxException if there is an error creating the URI for the HTTP request.

Definition at line 257 of file OllamaAPI.java.
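A sketch of browsing the public Ollama library and then drilling into one entry's tags via getLibraryModelDetails(). The host URL is a placeholder and the import paths are assumptions that may differ across ollama4j versions:

```java
import java.util.List;

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModel;
import io.github.ollama4j.models.response.LibraryModelDetail;

public class LibraryBrowseExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // Scrapes the Ollama library page, not the local server's model list.
        List<LibraryModel> library = api.listModelsFromLibrary();
        if (!library.isEmpty()) {
            // Fetch the tags (name, size, last updated) for the first entry.
            LibraryModelDetail detail = api.getLibraryModelDetails(library.get(0));
            System.out.println(detail);
        }
    }
}
```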

◆ listRoles()

List< OllamaChatMessageRole > io.github.ollama4j.OllamaAPI.listRoles ( )

Lists all available roles.

Returns
a list of available OllamaChatMessageRole objects

Definition at line 1562 of file OllamaAPI.java.

◆ ping()

boolean io.github.ollama4j.OllamaAPI.ping ( )

Checks whether the Ollama server is reachable.

Returns
true if the server is reachable, false otherwise.

Definition at line 157 of file OllamaAPI.java.
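Because ping() returns a boolean instead of throwing, it is convenient as a startup health check. A minimal sketch, with a placeholder host URL:

```java
import io.github.ollama4j.OllamaAPI;

public class PingExample {
    public static void main(String[] args) {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // true if the server answered, false otherwise; no checked exceptions.
        boolean reachable = api.ping();
        System.out.println(reachable ? "Server is up" : "Server is unreachable");
    }
}
```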

◆ ps()

ModelsProcessResponse io.github.ollama4j.OllamaAPI.ps ( ) throws IOException, InterruptedException, OllamaBaseException

Provides a list of running models and details about each model currently loaded into memory.

Returns
ModelsProcessResponse containing details about the running models
Exceptions
IOException if an I/O error occurs during the HTTP request
InterruptedException if the operation is interrupted
OllamaBaseException if the response indicates an error status

Definition at line 191 of file OllamaAPI.java.

◆ pullModel() [1/2]

void io.github.ollama4j.OllamaAPI.pullModel ( LibraryModelTag libraryModelTag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException

Pulls a model using the specified Ollama library model tag. The model is identified by a name and a tag, which are combined into a single identifier in the format "name:tag" to pull the corresponding model.

Parameters
libraryModelTag the LibraryModelTag object containing the name and tag of the model to be pulled.
Exceptions
OllamaBaseException if the response indicates an error status
IOException if an I/O error occurs during the HTTP request
InterruptedException if the operation is interrupted
URISyntaxException if the URI for the request is malformed

Definition at line 558 of file OllamaAPI.java.
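The "name:tag" combination described above can be obtained from findModelTagFromLibrary() and handed straight to this overload. The host URL, model name, and tag below are placeholders:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModelTag;

public class PullByTagExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // Resolve a concrete tag from the Ollama library, then pull "name:tag".
        LibraryModelTag tag = api.findModelTagFromLibrary("llama3", "latest");
        api.pullModel(tag);
    }
}
```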

◆ pullModel() [2/2]

void io.github.ollama4j.OllamaAPI.pullModel ( String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException

Pulls a model on the Ollama server from the list of available models.

If numberOfRetriesForModelPull is greater than 0, this method will retry pulling the model up to the specified number of times if an OllamaBaseException occurs, using exponential backoff between retries (delay doubles after each failed attempt, starting at 1 second).

The backoff is only applied between retries, not after the final attempt.

Parameters
modelName the name of the model
Exceptions
OllamaBaseException if the response indicates an error status or all retries fail
IOException if an I/O error occurs during the HTTP request
InterruptedException if the operation is interrupted or the thread is interrupted during backoff
URISyntaxException if the URI for the request is malformed

Definition at line 437 of file OllamaAPI.java.
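A minimal sketch of pulling a model by name; the host URL and model name are placeholders. Note the exponential backoff between retries described above (1 s, 2 s, 4 s, ...), which only applies when numberOfRetriesForModelPull is greater than 0:

```java
import io.github.ollama4j.OllamaAPI;

public class PullModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI("http://localhost:11434");
        // Blocks until the pull completes; large models can take a while.
        api.pullModel("llama3");
        System.out.println("Model pulled");
    }
}
```

With the doubling backoff, the delay before retry n is 1 second shifted left by (n - 1) bits, i.e. `1L << (n - 1)` seconds.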

◆ registerAnnotatedTools() [1/2]

void io.github.ollama4j.OllamaAPI.registerAnnotatedTools ( )

Registers tools based on the annotations found on the methods of the caller's class and its providers. This method scans the caller's class for the OllamaToolService annotation and recursively registers annotated tools from all the providers specified in the annotation.

Exceptions
IllegalStateException if the caller's class is not annotated with OllamaToolService.
RuntimeException if any reflection-based instantiation or invocation fails.

Definition at line 1464 of file OllamaAPI.java.

◆ registerAnnotatedTools() [2/2]

void io.github.ollama4j.OllamaAPI.registerAnnotatedTools ( Object object)

Registers tools based on the annotations found on the methods of the provided object. This method scans the methods of the given object and registers tools using the ToolSpec annotation and associated ToolProperty annotations. It constructs tool specifications and stores them in a tool registry.

Parameters
object the object whose methods are to be inspected for annotated tools.
Exceptions
RuntimeException if any reflection-based instantiation or invocation fails.

Definition at line 1501 of file OllamaAPI.java.

◆ registerTool()

void io.github.ollama4j.OllamaAPI.registerTool ( Tools.ToolSpecification toolSpecification)

Registers a single tool in the tool registry using the provided tool specification.

Parameters
toolSpecification the specification of the tool to register. It contains the tool's function name and other relevant information.

Definition at line 1418 of file OllamaAPI.java.

◆ registerTools()

void io.github.ollama4j.OllamaAPI.registerTools ( List< Tools.ToolSpecification > toolSpecifications)

Registers multiple tools in the tool registry using a list of tool specifications. Iterates over the list and adds each tool specification to the registry.

Parameters
toolSpecifications a list of tool specifications to register. Each specification contains information about a tool, such as its function name.

Definition at line 1435 of file OllamaAPI.java.

◆ setBasicAuth()

void io.github.ollama4j.OllamaAPI.setBasicAuth ( String username,
String password )

Sets basic authentication for accessing an Ollama server that sits behind a reverse proxy/gateway.

Parameters
username the username
password the password

Definition at line 138 of file OllamaAPI.java.

◆ setBearerAuth()

void io.github.ollama4j.OllamaAPI.setBearerAuth ( String bearerToken)

Sets Bearer authentication for accessing an Ollama server that sits behind a reverse proxy/gateway.

Parameters
bearerToken the Bearer authentication token to provide

Definition at line 148 of file OllamaAPI.java.
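The two authentication setters are alternatives: pick the one your reverse proxy expects, and call it before making any requests. A sketch with placeholder host and credentials:

```java
import io.github.ollama4j.OllamaAPI;

public class AuthExample {
    public static void main(String[] args) {
        // Placeholder host and credentials; use ONE of the two schemes below.
        OllamaAPI api = new OllamaAPI("https://ollama.example.com");
        api.setBasicAuth("admin", "secret"); // HTTP Basic auth
        // api.setBearerAuth("my-token");    // ...or a Bearer token instead
        System.out.println("Server reachable: " + api.ping());
    }
}
```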


The documentation for this class was generated from the following file: