Ollama4j
A Java library (wrapper/binding) for the Ollama server.
io.github.ollama4j.OllamaAPI Class Reference

Public Member Functions

 OllamaAPI ()
 
 OllamaAPI (String host)
 
void setBasicAuth (String username, String password)
 
boolean ping ()
 
ModelsProcessResponse ps () throws IOException, InterruptedException, OllamaBaseException
 
List< Model > listModels () throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
void pullModel (String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
 
ModelDetail getModelDetails (String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
 
void createModelWithFilePath (String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
void createModelWithModelFileContents (String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
void deleteModel (String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
 
List< Double > generateEmbeddings (String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
 
List< Double > generateEmbeddings (OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
 
OllamaResult generate (String model, String prompt, boolean raw, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generate (String model, String prompt, boolean raw, Options options) throws OllamaBaseException, IOException, InterruptedException
 
OllamaToolsResult generateWithTools (String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
 
OllamaAsyncResultStreamer generateAsync (String model, String prompt, boolean raw)
 
OllamaResult generateWithImageFiles (String model, String prompt, List< File > imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generateWithImageFiles (String model, String prompt, List< File > imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
 
OllamaResult generateWithImageURLs (String model, String prompt, List< String > imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
OllamaResult generateWithImageURLs (String model, String prompt, List< String > imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
 
OllamaChatResult chat (String model, List< OllamaChatMessage > messages) throws OllamaBaseException, IOException, InterruptedException
 
OllamaChatResult chat (OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException
 
OllamaChatResult chat (OllamaChatRequest request, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
 
void registerTool (Tools.ToolSpecification toolSpecification)
 

Detailed Description

The base Ollama API class.

Definition at line 40 of file OllamaAPI.java.

Constructor & Destructor Documentation

◆ OllamaAPI() [1/2]

io.github.ollama4j.OllamaAPI.OllamaAPI ( )

Instantiates the Ollama API with the default Ollama host: http://localhost:11434

Definition at line 63 of file OllamaAPI.java.

◆ OllamaAPI() [2/2]

io.github.ollama4j.OllamaAPI.OllamaAPI ( String host)

Instantiates the Ollama API with the specified Ollama host address.

Parameters
host - the host address of the Ollama server

Definition at line 72 of file OllamaAPI.java.
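
A minimal instantiation sketch for both constructors; the remote host URL below is a placeholder:

    import io.github.ollama4j.OllamaAPI;

    public class OllamaApiConstruction {
        public static void main(String[] args) {
            // Default constructor: talks to http://localhost:11434
            OllamaAPI localApi = new OllamaAPI();

            // Explicit host: point at any reachable Ollama server (placeholder URL)
            OllamaAPI remoteApi = new OllamaAPI("http://192.168.1.10:11434");
        }
    }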

Member Function Documentation

◆ chat() [1/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
request - the request object to be sent to the server
Returns
OllamaChatResult
Exceptions
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is unreachable or a network error occurs

Definition at line 572 of file OllamaAPI.java.
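
A sketch of the request-based chat call. The builder method names (getInstance, withMessage), the result accessor, and the import paths are assumptions based on recent ollama4j releases and may differ in your version:

    import io.github.ollama4j.OllamaAPI;
    // Import paths below are assumptions and may vary between versions.
    import io.github.ollama4j.models.chat.OllamaChatMessageRole;
    import io.github.ollama4j.models.chat.OllamaChatRequest;
    import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
    import io.github.ollama4j.models.chat.OllamaChatResult;

    public class ChatRequestExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaChatRequest request = OllamaChatRequestBuilder
                    .getInstance("llama3") // model name is a placeholder
                    .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                    .build();
            OllamaChatResult result = api.chat(request);
            System.out.println(result.getResponse()); // accessor name assumed
        }
    }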

◆ chat() [2/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( OllamaChatRequest request,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
request - the request object to be sent to the server
streamHandler - callback handler that handles the latest message from the stream (caution: all previous messages from the stream will be concatenated)
Returns
OllamaChatResult
Exceptions
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is unreachable or a network error occurs

Definition at line 588 of file OllamaAPI.java.
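
A sketch of the streaming variant, assuming OllamaStreamHandler is a functional interface over a String callback (each invocation receives the stream concatenated up to that point, per the streamHandler description above); builder names and import paths are assumptions as in the previous example:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.chat.OllamaChatMessageRole;
    import io.github.ollama4j.models.chat.OllamaChatRequestBuilder;
    import io.github.ollama4j.models.chat.OllamaChatResult;

    public class StreamingChatExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaChatResult result = api.chat(
                    OllamaChatRequestBuilder.getInstance("llama3")
                            .withMessage(OllamaChatMessageRole.USER, "Tell me a joke.")
                            .build(),
                    partial -> System.out.print("\r" + partial)); // prints the concatenated stream so far
            System.out.println();
        }
    }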

◆ chat() [3/3]

OllamaChatResult io.github.ollama4j.OllamaAPI.chat ( String model,
List< OllamaChatMessage > messages ) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model based on a given message stack (i.e. a chat history). Makes a synchronous call to the 'api/chat' endpoint.

Parameters
model - the Ollama model to ask the question to
messages - chat history / message stack to send to the model
Returns
OllamaChatResult containing the API response and the message history, including the newly acquired assistant response.
Exceptions
OllamaBaseException - if any response code other than 200 is returned
IOException - if the response stream cannot be read
InterruptedException - if the server is unreachable or a network error occurs

Definition at line 556 of file OllamaAPI.java.
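
A sketch of the history-based overload. The OllamaChatMessage constructor and import paths are assumptions about the message type, which is not documented on this page:

    import io.github.ollama4j.OllamaAPI;
    // Import paths and the OllamaChatMessage constructor are assumptions.
    import io.github.ollama4j.models.chat.OllamaChatMessage;
    import io.github.ollama4j.models.chat.OllamaChatMessageRole;
    import io.github.ollama4j.models.chat.OllamaChatResult;
    import java.util.ArrayList;
    import java.util.List;

    public class ChatHistoryExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            // Mutable list, in case the library appends the assistant reply.
            List<OllamaChatMessage> history = new ArrayList<>();
            history.add(new OllamaChatMessage(OllamaChatMessageRole.USER, "Name one JVM language."));
            OllamaChatResult result = api.chat("llama3", history); // model name is a placeholder
            System.out.println(result.getResponse()); // accessor name assumed
        }
    }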

◆ createModelWithFilePath()

void io.github.ollama4j.OllamaAPI.createModelWithFilePath ( String modelName,
String modelFilePath ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. See the Ollama documentation for more about custom model file creation.

Parameters
modelName - the name of the custom model to be created
modelFilePath - the path to the model file that exists on the Ollama server

Definition at line 252 of file OllamaAPI.java.

◆ createModelWithModelFileContents()

void io.github.ollama4j.OllamaAPI.createModelWithModelFileContents ( String modelName,
String modelFileContents ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. See the Ollama documentation for more about custom model file creation.

Parameters
modelName - the name of the custom model to be created
modelFileContents - the contents of the model file

Definition at line 286 of file OllamaAPI.java.
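
A sketch of both creation variants; the model name, server-side path, and Modelfile contents are placeholders:

    import io.github.ollama4j.OllamaAPI;

    public class CreateModelExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            // Variant 1: the Modelfile already exists on the Ollama server host.
            api.createModelWithFilePath("mario", "/path/on/server/Modelfile");
            // Variant 2: send the Modelfile contents directly.
            api.createModelWithModelFileContents(
                    "mario",
                    "FROM llama3\nSYSTEM You are Mario from Super Mario Bros.");
        }
    }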

◆ deleteModel()

void io.github.ollama4j.OllamaAPI.deleteModel ( String modelName,
boolean ignoreIfNotPresent ) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Delete a model from Ollama server.

Parameters
modelName - the name of the model to be deleted
ignoreIfNotPresent - ignore errors if the specified model is not present on the Ollama server

Definition at line 317 of file OllamaAPI.java.
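
A minimal deletion sketch; the model name is a placeholder:

    import io.github.ollama4j.OllamaAPI;

    public class DeleteModelExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            // true: do not fail if the model is already absent.
            api.deleteModel("mario", true);
        }
    }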

◆ generate() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
boolean raw,
Options options ) throws OllamaBaseException, IOException, InterruptedException

Generates a response using the specified AI model and prompt (in blocking mode).

Uses generate(String, String, boolean, Options, OllamaStreamHandler)

Parameters
model - the name or identifier of the AI model to use for generating the response
prompt - the input text or prompt to provide to the AI model
raw - if true, bypasses the templating system so that a full prompt can be provided; note that raw mode will not return a context
options - additional options or configurations to use when generating the response
Returns
OllamaResult

Definition at line 410 of file OllamaAPI.java.
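
A sketch of the blocking generate call. OptionsBuilder, the import paths, and the result accessor are assumptions (recent ollama4j releases provide a builder for the Options object); the model name is a placeholder:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // assumed helper

    public class GenerateExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaResult result = api.generate(
                    "llama3",
                    "Why is the sky blue?",
                    false,                          // raw = false: keep the model's prompt template
                    new OptionsBuilder().build());  // default options
            System.out.println(result.getResponse()); // accessor name assumed
        }
    }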

◆ generate() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generate ( String model,
String prompt,
boolean raw,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException

Generate a response for a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
model - the Ollama model to ask the question to
prompt - the prompt/question text
options - the Options object with additional model parameters
streamHandler - optional callback consumer that is applied every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes response text and time taken for response

Definition at line 391 of file OllamaAPI.java.

◆ generateAsync()

OllamaAsyncResultStreamer io.github.ollama4j.OllamaAPI.generateAsync ( String model,
String prompt,
boolean raw )

Generate a response for a question to a model running on the Ollama server and get a callback handle that can be used to check the status and fetch the response from the model later. This is an async/non-blocking call.

Parameters
model - the Ollama model to ask the question to
prompt - the prompt/question text
raw - if true, bypasses the templating system so that a full prompt can be provided
Returns
the ollama async result callback handle

Definition at line 461 of file OllamaAPI.java.
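
A sketch of the async call. The streamer accessors used below (isAlive, getStream) are assumptions about the handle's API, which is not documented on this page:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaAsyncResultStreamer; // path assumed

    public class GenerateAsyncExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaAsyncResultStreamer streamer =
                    api.generateAsync("llama3", "Tell me a short story.", false);
            while (streamer.isAlive()) {                    // assumed accessor
                String chunk = streamer.getStream().poll(); // assumed: queue of streamed chunks
                if (chunk != null) System.out.print(chunk);
                Thread.sleep(100);                          // poll at a modest interval
            }
        }
    }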

◆ generateEmbeddings() [1/2]

List< Double > io.github.ollama4j.OllamaAPI.generateEmbeddings ( OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException

Generate embeddings using an OllamaEmbeddingsRequestModel.

Parameters
modelRequest - the request for the '/api/embeddings' endpoint
Returns
embeddings

Definition at line 357 of file OllamaAPI.java.

◆ generateEmbeddings() [2/2]

List< Double > io.github.ollama4j.OllamaAPI.generateEmbeddings ( String model,
String prompt ) throws IOException, InterruptedException, OllamaBaseException

Generate embeddings for a given text from a model.

Parameters
model - the name of the model to generate embeddings from
prompt - the text to generate embeddings for
Returns
embeddings

Definition at line 346 of file OllamaAPI.java.
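
A minimal embeddings sketch; the model name is a placeholder for any embedding-capable model you have pulled:

    import io.github.ollama4j.OllamaAPI;
    import java.util.List;

    public class EmbeddingsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            List<Double> embedding = api.generateEmbeddings("nomic-embed-text", "Hello, world!");
            System.out.println("Embedding dimensions: " + embedding.size());
        }
    }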

◆ generateWithImageFiles() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles ( String model,
String prompt,
List< File > imageFiles,
Options options ) throws OllamaBaseException, IOException, InterruptedException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageFiles(String, String, List, Options, OllamaStreamHandler)

Definition at line 502 of file OllamaAPI.java.

◆ generateWithImageFiles() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles ( String model,
String prompt,
List< File > imageFiles,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException

With one or more image files, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
model - the Ollama model to ask the question to
prompt - the prompt/question text
imageFiles - the list of image files to use for the question
options - the Options object with additional model parameters
streamHandler - optional callback consumer that is applied every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes response text and time taken for response

Definition at line 485 of file OllamaAPI.java.
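
A sketch of the image-file variant, with the same assumed helpers as the generate example; model name and file path are placeholders:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // assumed helper
    import java.io.File;
    import java.util.List;

    public class ImageFilesExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaResult result = api.generateWithImageFiles(
                    "llava",                                 // multimodal model; placeholder name
                    "What is in this picture?",
                    List.of(new File("/path/to/image.jpg")), // placeholder path
                    new OptionsBuilder().build());
            System.out.println(result.getResponse()); // accessor name assumed
        }
    }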

◆ generateWithImageURLs() [1/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs ( String model,
String prompt,
List< String > imageURLs,
Options options ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageURLs(String, String, List, Options, OllamaStreamHandler)

Definition at line 538 of file OllamaAPI.java.

◆ generateWithImageURLs() [2/2]

OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs ( String model,
String prompt,
List< String > imageURLs,
Options options,
OllamaStreamHandler streamHandler ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

With one or more image URLs, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
model - the Ollama model to ask the question to
prompt - the prompt/question text
imageURLs - the list of image URLs to use for the question
options - the Options object with additional model parameters
streamHandler - optional callback consumer that is applied every time a streamed response is received; if not set, the stream parameter of the request is set to false
Returns
OllamaResult that includes response text and time taken for response

Definition at line 521 of file OllamaAPI.java.
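
The URL variant follows the same shape as the file-based example above; a sketch with a placeholder URL and the same assumed helpers:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // assumed helper
    import java.util.List;

    public class ImageUrlsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            OllamaResult result = api.generateWithImageURLs(
                    "llava",
                    "What is in this picture?",
                    List.of("https://example.com/image.jpg"), // placeholder URL
                    new OptionsBuilder().build());
            System.out.println(result.getResponse()); // accessor name assumed
        }
    }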

◆ generateWithTools()

OllamaToolsResult io.github.ollama4j.OllamaAPI.generateWithTools ( String model,
String prompt,
Options options ) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Generates a response using the specified AI model and prompt (in blocking mode), and then invokes a set of tools on the generated response.

Parameters
model - the name or identifier of the AI model to use for generating the response
prompt - the input text or prompt to provide to the AI model
options - additional options or configurations to use when generating the response
Returns
An OllamaToolsResult object containing the response from the AI model and the results of invoking the tools on that output.
Exceptions
OllamaBaseException - if there is an error related to the Ollama API or service
IOException - if there is an error related to input/output operations
InterruptedException - if the method is interrupted while waiting for the AI model to generate the response or for the tools to be invoked

Definition at line 429 of file OllamaAPI.java.
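
A sketch of the tools flow. Constructing a Tools.ToolSpecification is not covered on this page, so the registration step is shown only as a comment; OptionsBuilder, the import paths, and the getToolResults accessor are assumptions:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaToolsResult; // path assumed
    import io.github.ollama4j.utils.OptionsBuilder;               // assumed helper

    public class ToolsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            // api.registerTool(weatherToolSpec); // register a Tools.ToolSpecification first
            OllamaToolsResult result = api.generateWithTools(
                    "llama3",
                    "What is the weather in Berlin?", // placeholder prompt meant to trigger a tool
                    new OptionsBuilder().build());
            System.out.println(result.getToolResults()); // accessor name assumed
        }
    }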

◆ getModelDetails()

ModelDetail io.github.ollama4j.OllamaAPI.getModelDetails ( String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException

Gets model details from the Ollama server.

Parameters
modelName - the name of the model
Returns
the model details

Definition at line 224 of file OllamaAPI.java.

◆ listModels()

List< Model > io.github.ollama4j.OllamaAPI.listModels ( ) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

List available models from Ollama server.

Returns
the list of models available on the Ollama server

Definition at line 157 of file OllamaAPI.java.
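
A minimal listing sketch; the Model import path and accessor name are assumptions:

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.Model; // path assumed
    import java.util.List;

    public class ListModelsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            List<Model> models = api.listModels();
            for (Model model : models) {
                System.out.println(model.getName()); // accessor name assumed
            }
        }
    }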

◆ ping()

boolean io.github.ollama4j.OllamaAPI.ping ( )

API to check the reachability of the Ollama server.

Returns
true if the server is reachable, false otherwise.

Definition at line 95 of file OllamaAPI.java.
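
A minimal reachability check, using only the documented signature:

    import io.github.ollama4j.OllamaAPI;

    public class PingExample {
        public static void main(String[] args) {
            OllamaAPI api = new OllamaAPI();
            if (!api.ping()) {
                System.err.println("Ollama server is not reachable at the configured host.");
            }
        }
    }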

◆ ps()

ModelsProcessResponse io.github.ollama4j.OllamaAPI.ps ( ) throws IOException, InterruptedException, OllamaBaseException

Provides a list of running models and details about each model currently loaded into memory.

Returns
ModelsProcessResponse

Definition at line 126 of file OllamaAPI.java.

◆ pullModel()

void io.github.ollama4j.OllamaAPI.pullModel ( String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException

Pull a model onto the Ollama server from the list of available models.

Parameters
modelName - the name of the model

Definition at line 186 of file OllamaAPI.java.
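
A minimal pull sketch; the model name is a placeholder and the call blocks until the pull completes:

    import io.github.ollama4j.OllamaAPI;

    public class PullModelExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI api = new OllamaAPI();
            api.pullModel("llama3"); // blocks until the model is pulled
        }
    }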

◆ registerTool()

void io.github.ollama4j.OllamaAPI.registerTool ( Tools.ToolSpecification toolSpecification)

Registers a tool specification so that it can be invoked by generateWithTools(String, String, Options).

Definition at line 600 of file OllamaAPI.java.

◆ setBasicAuth()

void io.github.ollama4j.OllamaAPI.setBasicAuth ( String username,
String password )

Set basic authentication for accessing an Ollama server that's behind a reverse proxy/gateway.

Parameters
username - the username
password - the password

Definition at line 86 of file OllamaAPI.java.
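
A sketch for a proxied deployment; the host URL and credentials are placeholders:

    import io.github.ollama4j.OllamaAPI;

    public class BasicAuthExample {
        public static void main(String[] args) {
            OllamaAPI api = new OllamaAPI("https://ollama.example.com");
            api.setBasicAuth("admin", "s3cr3t");
            System.out.println("Reachable: " + api.ping());
        }
    }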


The documentation for this class was generated from the following file:
OllamaAPI.java