Ollama4j
A Java library (wrapper/binding) for Ollama server.
Public Member Functions

OllamaAPI()
OllamaAPI(String host)
void setBasicAuth(String username, String password)
void setBearerAuth(String bearerToken)
boolean ping()
ModelsProcessResponse ps() throws IOException, InterruptedException, OllamaBaseException
List<Model> listModels() throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
List<LibraryModel> listModelsFromLibrary() throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
LibraryModelDetail getLibraryModelDetails(LibraryModel libraryModel) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
LibraryModelTag findModelTagFromLibrary(String modelName, String tag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
void pullModel(String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
String getVersion() throws URISyntaxException, IOException, InterruptedException, OllamaBaseException
void pullModel(LibraryModelTag libraryModelTag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
ModelDetail getModelDetails(String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
void createModelWithFilePath(String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
void createModelWithModelFileContents(String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
void createModel(CustomModelRequest customModelRequest) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
void deleteModel(String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
List<Double> generateEmbeddings(String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
List<Double> generateEmbeddings(OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
OllamaEmbedResponseModel embed(String model, List<String> inputs) throws IOException, InterruptedException, OllamaBaseException
OllamaEmbedResponseModel embed(OllamaEmbedRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
OllamaResult generate(String model, String prompt, boolean raw, Options options, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generate(String model, String prompt, boolean raw, Options options, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generate(String model, String prompt, boolean raw, boolean think, Options options) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generate(String model, String prompt, Map<String, Object> format) throws OllamaBaseException, IOException, InterruptedException
OllamaToolsResult generateWithTools(String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
OllamaAsyncResultStreamer generateAsync(String model, String prompt, boolean raw, boolean think)
OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
OllamaResult generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
OllamaResult generateWithImages(String model, String prompt, List<byte[]> images, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
OllamaResult generateWithImages(String model, String prompt, List<byte[]> images, Options options) throws OllamaBaseException, IOException, InterruptedException
OllamaChatResult chat(String model, List<OllamaChatMessage> messages) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
OllamaChatResult chat(OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
OllamaChatResult chat(OllamaChatRequest request, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
OllamaChatResult chatStreaming(OllamaChatRequest request, OllamaTokenHandler tokenHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
void registerTool(Tools.ToolSpecification toolSpecification)
void registerTools(List<Tools.ToolSpecification> toolSpecifications)
void deregisterTools()
void registerAnnotatedTools()
void registerAnnotatedTools(Object object)
OllamaChatMessageRole addCustomRole(String roleName)
List<OllamaChatMessageRole> listRoles()
OllamaChatMessageRole getRole(String roleName) throws RoleNotFoundException
The base Ollama API class.
Definition at line 56 of file OllamaAPI.java.
io.github.ollama4j.OllamaAPI.OllamaAPI()
Instantiates the Ollama API with default Ollama host: http://localhost:11434
Definition at line 111 of file OllamaAPI.java.
io.github.ollama4j.OllamaAPI.OllamaAPI(String host)
Instantiates the Ollama API with specified Ollama host address.
host | the host address of Ollama server |
Definition at line 120 of file OllamaAPI.java.
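A minimal sketch of instantiating the client, assuming the ollama4j artifact is on the classpath and an Ollama server is reachable; the remote host URL and bearer token below are placeholders:

```java
import io.github.ollama4j.OllamaAPI;

public class ConnectExample {
    public static void main(String[] args) {
        // Default constructor targets http://localhost:11434
        OllamaAPI api = new OllamaAPI();

        // Or point at another host; setBasicAuth/setBearerAuth (see the
        // member overview above) attach credentials to subsequent requests
        OllamaAPI remote = new OllamaAPI("http://my-ollama-host:11434");
        remote.setBearerAuth("my-token");

        // ping() returns true if the server is reachable
        System.out.println("Server reachable: " + api.ping());
    }
}
```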
OllamaChatMessageRole io.github.ollama4j.OllamaAPI.addCustomRole(String roleName)
Adds a custom role.
roleName | the name of the custom role to be added |
Definition at line 1553 of file OllamaAPI.java.
OllamaChatResult io.github.ollama4j.OllamaAPI.chat(OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.
Hint: the OllamaChatRequestModel#getStream() property is not implemented.
request | request object to be sent to the server |
OllamaChatResult
OllamaBaseException | if the response indicates an error status (any status code other than 200) |
IOException | if an I/O error occurs during the HTTP request or the response stream cannot be read |
InterruptedException | if the operation is interrupted or the server is unreachable |
ToolInvocationException | if the tool invocation fails |
Definition at line 1309 of file OllamaAPI.java.
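A minimal chat sketch. Only the existence of OllamaChatRequestBuilder is confirmed by this reference; the builder methods (getInstance, withMessage), the accessor chain on OllamaChatResult, and the import paths are assumptions, and "llama3" is a placeholder model name:

```java
import io.github.ollama4j.OllamaAPI;
// Other ollama4j imports omitted; package paths vary across versions.

public class ChatExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI api = new OllamaAPI();
        // Build the request with the builder mentioned above (assumed methods)
        OllamaChatRequest request = OllamaChatRequestBuilder.getInstance("llama3")
                .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                .build();
        OllamaChatResult result = api.chat(request);
        // Accessor names on the result are also assumptions
        System.out.println(result.getResponseModel().getMessage().getContent());
    }
}
```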
OllamaChatResult io.github.ollama4j.OllamaAPI.chat(OllamaChatRequest request, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.
Hint: the OllamaChatRequestModel#getStream() property is not implemented.
request | request object to be sent to the server |
responseStreamHandler | callback handler to handle the last message from stream |
thinkingStreamHandler | callback handler to handle the last thinking message from stream |
OllamaChatResult
OllamaBaseException | if the response indicates an error status (any status code other than 200) |
IOException | if an I/O error occurs during the HTTP request or the response stream cannot be read |
InterruptedException | if the operation is interrupted or the server is unreachable |
ToolInvocationException | if the tool invocation fails |
Definition at line 1337 of file OllamaAPI.java.
OllamaChatResult io.github.ollama4j.OllamaAPI.chat(String model, List<OllamaChatMessage> messages) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
Ask a question to a model based on a given message stack (i.e. a chat history). Makes a synchronous call to the 'api/chat' endpoint.
model | the ollama model to ask the question to |
messages | chat history / message stack to send to the model |
OllamaChatResult
containing the API response and the message history, including the newly acquired assistant response.
OllamaBaseException | if the response indicates an error status (any status code other than 200) |
IOException | if an I/O error occurs during the HTTP request or the response stream cannot be read |
InterruptedException | if the operation is interrupted or the server is unreachable |
ToolInvocationException | if the tool invocation fails |
Definition at line 1284 of file OllamaAPI.java.
OllamaChatResult io.github.ollama4j.OllamaAPI.chatStreaming(OllamaChatRequest request, OllamaTokenHandler tokenHandler) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
Ask a question to a model using an OllamaChatRequest. This can be constructed using an OllamaChatRequestBuilder.
Hint: the OllamaChatRequestModel#getStream() property is not implemented.
request | request object to be sent to the server |
tokenHandler | callback handler to handle the last token from stream (caution: the previous tokens from stream will not be concatenated) |
OllamaChatResult
OllamaBaseException | if the response indicates an error status (any status code other than 200) |
IOException | if an I/O error occurs during the HTTP request or the response stream cannot be read |
InterruptedException | if the operation is interrupted or the server is unreachable |
ToolInvocationException | if the tool invocation fails |
Definition at line 1362 of file OllamaAPI.java.
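A token-streaming sketch. It assumes an OllamaAPI instance named api, a request built as for chat(OllamaChatRequest), and that OllamaTokenHandler can be supplied as a lambda receiving each partial token; that callback shape is an assumption, not confirmed by this reference. Note the caution above: previous tokens are not concatenated for you.

```java
// Assumed builder methods; "llama3" is a placeholder model name
OllamaChatRequest request = OllamaChatRequestBuilder.getInstance("llama3")
        .withMessage(OllamaChatMessageRole.USER, "Tell me a short story.")
        .build();

// Print each token as it arrives; accumulate manually if you need the
// full text incrementally, since tokens are delivered one at a time
OllamaChatResult result = api.chatStreaming(request,
        token -> System.out.print(token));
```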
void io.github.ollama4j.OllamaAPI.createModel(CustomModelRequest customModelRequest) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Create a custom model. Read more about custom model creation here.
customModelRequest | custom model spec |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 682 of file OllamaAPI.java.
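A sketch of creating a custom model, assuming an OllamaAPI instance named api. The builder-style construction and the field names (model, from, system) on CustomModelRequest are assumptions modeled on Ollama's model-creation API, not confirmed by this reference:

```java
// Build a spec for a derived model (assumed builder and field names)
CustomModelRequest request = CustomModelRequest.builder()
        .model("mario")                                   // name of the new model
        .from("llama3")                                   // base model (placeholder)
        .system("You are Mario from Super Mario Bros.")   // system prompt
        .build();
api.createModel(request);
```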
void io.github.ollama4j.OllamaAPI.createModelWithFilePath(String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Create a custom model from a model file. Read more about custom model file creation here.
modelName | the name of the custom model to be created. |
modelFilePath | the path to model file that exists on the Ollama server. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 607 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.createModelWithModelFileContents(String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Create a custom model from the given model file contents. Read more about custom model file creation here.
modelName | the name of the custom model to be created. |
modelFileContents | the contents of the model file to create the model from. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 648 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.deleteModel(String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
Delete a model from Ollama server.
modelName | the name of the model to be deleted. |
ignoreIfNotPresent | ignore errors if the specified model is not present on Ollama server. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 716 of file OllamaAPI.java.
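A model-lifecycle sketch combining pullModel, getModelDetails, and deleteModel, assuming an OllamaAPI instance named api; "llama3" and "mario" are placeholder model names:

```java
// Download a model from the registry (no-op if already present)
api.pullModel("llama3");

// Inspect it
ModelDetail detail = api.getModelDetails("llama3");

// Remove a local model; true means errors are ignored if the
// model is not present on the Ollama server
api.deleteModel("mario", true);
```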
void io.github.ollama4j.OllamaAPI.deregisterTools | ( | ) |
Deregisters all tools from the tool registry. This method removes all registered tools, effectively clearing the registry.
Definition at line 1445 of file OllamaAPI.java.
OllamaEmbedResponseModel io.github.ollama4j.OllamaAPI.embed(OllamaEmbedRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings using an OllamaEmbedRequestModel.
modelRequest | request for '/api/embed' endpoint |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 810 of file OllamaAPI.java.
OllamaEmbedResponseModel io.github.ollama4j.OllamaAPI.embed(String model, List<String> inputs) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings for a given text from a model
model | name of model to generate embeddings from |
inputs | one or more texts to generate embeddings for |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 796 of file OllamaAPI.java.
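An embeddings sketch, assuming an OllamaAPI instance named api. The model name is a placeholder for any embedding-capable model, and getEmbeddings() is an assumed accessor on the response model returning one vector per input:

```java
OllamaEmbedResponseModel response = api.embed(
        "nomic-embed-text", // placeholder embedding model
        List.of("Why is the sky blue?", "Why is the grass green?"));

// Assumed accessor: one embedding vector per input string
List<List<Double>> vectors = response.getEmbeddings();
System.out.println("Vectors returned: " + vectors.size());
```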
LibraryModelTag io.github.ollama4j.OllamaAPI.findModelTagFromLibrary(String modelName, String tag) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
Finds a specific model using model name and tag from Ollama library.
Deprecated: This method relies on the HTML structure of the Ollama website, which is subject to change at any time. As a result, it is difficult to keep this API method consistently updated and reliable. Therefore, this method is deprecated and may be removed in future releases.
This method retrieves the model from the Ollama library by its name, then fetches its tags. It searches through the tags of the model to find one that matches the specified tag name. If the model or the tag is not found, it throws a NoSuchElementException.
modelName | The name of the model to search for in the library. |
tag | The tag name to search for within the specified model. |
LibraryModelTag
associated with the specified model and tag.
OllamaBaseException | If there is a problem with the Ollama library operations. |
IOException | If an I/O error occurs during the operation. |
URISyntaxException | If there is an error with the URI syntax. |
InterruptedException | If the operation is interrupted. |
NoSuchElementException | If the model or the tag is not found. |
Definition at line 404 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generate(String model, String prompt, boolean raw, boolean think, Options options) throws OllamaBaseException, IOException, InterruptedException
Generates response using the specified AI model and prompt (in blocking mode).
Uses generate(String, String, boolean, Options, OllamaStreamHandler)
model | The name or identifier of the AI model to use for generating the response. |
prompt | The input text or prompt to provide to the AI model. |
raw | In some cases, you may wish to bypass the templating system and provide a full prompt. In this case, you can use the raw parameter to disable templating. Also note that raw mode will not return a context. |
options | Additional options or configurations to use when generating the response. |
think | if true the model will "think" step-by-step before generating the final response |
OllamaResult
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 926 of file OllamaAPI.java.
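A blocking generation sketch, assuming an OllamaAPI instance named api. OptionsBuilder and the getResponse() accessor on OllamaResult are assumptions; "llama3" is a placeholder model name:

```java
OllamaResult result = api.generate(
        "llama3",
        "Why is the sky blue?",
        false,  // raw: false keeps the model's prompt template applied
        false,  // think: false for plain generation without thinking tokens
        new OptionsBuilder().setTemperature(0.7f).build()); // assumed builder

System.out.println(result.getResponse()); // assumed accessor
```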
OllamaResult io.github.ollama4j.OllamaAPI.generate(String model, String prompt, boolean raw, Options options, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
Generate response for a question to a model running on Ollama server. This is a sync/blocking call. This API does not support "thinking" models.
model | the ollama model to ask the question to |
prompt | the prompt/question text |
raw | if true no formatting will be applied to the prompt. You may choose to use the raw parameter if you are specifying a full templated prompt in your request to the API |
options | the Options object - More details on the options |
responseStreamHandler | optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 857 of file OllamaAPI.java.
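A streaming variant of the call above, assuming an OllamaAPI instance named api and that OllamaStreamHandler can be supplied as a lambda receiving each streamed chunk as a String (an assumption not confirmed by this reference):

```java
// Each streamed chunk is printed as it arrives; if the handler were null,
// the request's stream parameter would be set to false (see above)
OllamaResult result = api.generate(
        "llama3",                        // placeholder model name
        "Tell me a story.",
        false,                           // raw: keep prompt templating
        new OptionsBuilder().build(),    // assumed builder, default options
        chunk -> System.out.print(chunk));
```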
OllamaResult io.github.ollama4j.OllamaAPI.generate(String model, String prompt, boolean raw, Options options, OllamaStreamHandler thinkingStreamHandler, OllamaStreamHandler responseStreamHandler) throws OllamaBaseException, IOException, InterruptedException
Generate thinking and response tokens for a question to a thinking model running on Ollama server. This is a sync/blocking call.
model | the ollama model to ask the question to |
prompt | the prompt/question text |
raw | if true no formatting will be applied to the prompt. You may choose to use the raw parameter if you are specifying a full templated prompt in your request to the API |
options | the Options object - More details on the options |
thinkingStreamHandler | optional callback consumer that will be applied every time a streamed thinking token is received |
responseStreamHandler | optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 893 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generate(String model, String prompt, Map<String, Object> format) throws OllamaBaseException, IOException, InterruptedException
Generates structured output from the specified AI model and prompt.
Note: When formatting is specified, the 'think' parameter is not allowed.
model | The name or identifier of the AI model to use for generating the response. |
prompt | The input text or prompt to provide to the AI model. |
format | A map containing the format specification for the structured output. |
OllamaResult
containing the structured response.
OllamaBaseException | if the response indicates an error status. |
IOException | if an I/O error occurs during the HTTP request. |
InterruptedException | if the operation is interrupted. |
Definition at line 952 of file OllamaAPI.java.
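A structured-output sketch, assuming an OllamaAPI instance named api. The format map below follows the JSON-schema-style shape accepted by Ollama's structured outputs; the exact schema layout is an assumption, and "llama3" is a placeholder model name:

```java
// JSON-schema-style format spec (assumed shape)
Map<String, Object> format = new HashMap<>();
format.put("type", "object");
format.put("properties", Map.of(
        "color", Map.of("type", "string"),
        "reason", Map.of("type", "string")));
format.put("required", List.of("color", "reason"));

// Note from above: 'think' is not allowed when a format is specified
OllamaResult result = api.generate("llama3",
        "What color is the sky and why? Answer in JSON.", format);
```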
OllamaAsyncResultStreamer io.github.ollama4j.OllamaAPI.generateAsync(String model, String prompt, boolean raw, boolean think)
Asynchronously generates a response for a prompt using a model running on the Ollama server.
This method returns an OllamaAsyncResultStreamer
handle that can be used to poll for status and retrieve streamed "thinking" and response tokens from the model. The call is non-blocking.
Example usage:

    OllamaAsyncResultStreamer resultStreamer = ollamaAPI.generateAsync("gpt-oss:20b", "Who are you", false, true);
    int pollIntervalMilliseconds = 1000;
    while (true) {
        String thinkingTokens = resultStreamer.getThinkingResponseStream().poll();
        String responseTokens = resultStreamer.getResponseStream().poll();
        System.out.print(thinkingTokens != null ? thinkingTokens.toUpperCase() : "");
        System.out.print(responseTokens != null ? responseTokens.toLowerCase() : "");
        Thread.sleep(pollIntervalMilliseconds);
        if (!resultStreamer.isAlive())
            break;
    }
    System.out.println("Complete thinking response: " + resultStreamer.getCompleteThinkingResponse());
    System.out.println("Complete response: " + resultStreamer.getCompleteResponse());
model | the Ollama model to use for generating the response |
prompt | the prompt or question text to send to the model |
raw | if true , returns the raw response from the model |
think | if true , streams "thinking" tokens as well as response tokens |
OllamaAsyncResultStreamer
handle for polling and retrieving streamed results
Definition at line 1111 of file OllamaAPI.java.
List<Double> io.github.ollama4j.OllamaAPI.generateEmbeddings(OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings using an OllamaEmbeddingsRequestModel.
modelRequest | request for '/api/embeddings' endpoint |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Deprecated: use embed(OllamaEmbedRequestModel) instead.
Definition at line 765 of file OllamaAPI.java.
List<Double> io.github.ollama4j.OllamaAPI.generateEmbeddings(String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
Generate embeddings for a given text from a model
model | name of model to generate embeddings from |
prompt | text to generate embeddings for |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Deprecated: use embed(String, List) instead.
Definition at line 749 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
Convenience method to call Ollama API without streaming responses.
Uses generateWithImageFiles(String, String, List, Options, OllamaStreamHandler)
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 1163 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles(String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
With one or more image files, ask a question to a model running on Ollama server. This is a sync/blocking call.
model | the ollama model to ask the question to |
prompt | the prompt/question text |
imageFiles | the list of image files to use for the question |
options | the Options object - More details on the options |
streamHandler | optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 1142 of file OllamaAPI.java.
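A vision sketch using the non-streaming overload, assuming an OllamaAPI instance named api. The model name and image path are placeholders, and OptionsBuilder is an assumed helper:

```java
OllamaResult result = api.generateWithImageFiles(
        "llava",                                  // placeholder vision model
        "What is in this picture?",
        List.of(new File("/path/to/image.jpg")),  // placeholder path
        new OptionsBuilder().build());            // assumed builder
```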
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImages(String model, String prompt, List<byte[]> images, Options options) throws OllamaBaseException, IOException, InterruptedException
Convenience method to call the Ollama API using image byte arrays without streaming responses.
Uses generateWithImages(String, String, List, Options, OllamaStreamHandler)
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 1259 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImages(String model, String prompt, List<byte[]> images, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
Synchronously generates a response using a list of image byte arrays.
This method encodes the provided byte arrays into Base64 and sends them to the Ollama server.
model | the Ollama model to use for generating the response |
prompt | the prompt or question text to send to the model |
images | the list of image data as byte arrays |
options | the Options object - More details on the options |
streamHandler | optional callback that will be invoked with each streamed response; if null, streaming is disabled |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 1237 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
Convenience method to call Ollama API without streaming responses.
Uses generateWithImageURLs(String, String, List, Options, OllamaStreamHandler)
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 1212 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs(String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
With one or more image URLs, ask a question to a model running on Ollama server. This is a sync/blocking call.
model | the ollama model to ask the question to |
prompt | the prompt/question text |
imageURLs | the list of image URLs to use for the question |
options | the Options object - More details on the options |
streamHandler | optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 1189 of file OllamaAPI.java.
OllamaToolsResult io.github.ollama4j.OllamaAPI.generateWithTools(String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
Generates response using the specified AI model and prompt (in blocking mode), and then invokes a set of tools on the generated response.
model | The name or identifier of the AI model to use for generating the response. |
prompt | The input text or prompt to provide to the AI model. |
options | Additional options or configurations to use when generating the response. |
OllamaToolsResult
An OllamaToolsResult object containing the response from the AI model and the results of invoking the tools on that output.
OllamaBaseException | if a response code other than 200 is returned |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
Definition at line 1029 of file OllamaAPI.java.
LibraryModelDetail io.github.ollama4j.OllamaAPI.getLibraryModelDetails(LibraryModel libraryModel) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
Fetches the tags associated with a specific model from the Ollama library. This method fetches the available model tags directly from the Ollama library model page, including model tag name, size, and the time the model was last updated, into a list of LibraryModelTag objects.
libraryModel | the LibraryModel object which contains the name of the library model for which the tags need to be fetched. |
LibraryModelTag
A list of LibraryModelTag objects containing the extracted tags and their associated metadata.
OllamaBaseException | if the HTTP response status code indicates an error (i.e., not 200 OK), or if there is any other issue during the request or response processing. |
IOException | if an input/output exception occurs during the HTTP request or response handling. |
InterruptedException | if the thread is interrupted while waiting for the HTTP response. |
URISyntaxException | if the URI format is incorrect or invalid. |
Definition at line 326 of file OllamaAPI.java.
ModelDetail io.github.ollama4j.OllamaAPI.getModelDetails(String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
Gets model details from the Ollama server.
modelName | the model |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 574 of file OllamaAPI.java.
OllamaChatMessageRole io.github.ollama4j.OllamaAPI.getRole(String roleName) throws RoleNotFoundException
Retrieves a specific role by name.
roleName | the name of the role to retrieve |
RoleNotFoundException | if the role with the specified name does not exist |
Definition at line 1574 of file OllamaAPI.java.
String io.github.ollama4j.OllamaAPI.getVersion() throws URISyntaxException, IOException, InterruptedException, OllamaBaseException
Definition at line 527 of file OllamaAPI.java.
List<Model> io.github.ollama4j.OllamaAPI.listModels() throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
Lists available models from the Ollama server.
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 223 of file OllamaAPI.java.
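A sketch of enumerating locally available models, assuming an OllamaAPI instance named api; getName() is an assumed accessor on Model:

```java
for (Model model : api.listModels()) {
    // Print the name of each model available on the server
    System.out.println(model.getName()); // assumed accessor
}
```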
List<LibraryModel> io.github.ollama4j.OllamaAPI.listModelsFromLibrary() throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
Retrieves a list of models from the Ollama library. This method fetches the available models directly from the Ollama library page, including model details such as the name, pull count, popular tags, tag count, and the time when the model was last updated.
Returns a list of LibraryModel objects representing the models available in the Ollama library.
OllamaBaseException | if the HTTP request fails or the response is not successful (non-200 status code) |
IOException | if an I/O error occurs during the HTTP request or response processing |
InterruptedException | if the thread executing the request is interrupted |
URISyntaxException | if there is an error creating the URI for the HTTP request |
Definition at line 257 of file OllamaAPI.java.
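A usage sketch; the host and the import path for LibraryModel are assumptions:

```java
import java.util.List;
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModel;

public class ListLibraryModelsExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // Fetches model listings from the Ollama library page (requires internet access).
        List<LibraryModel> libraryModels = ollamaAPI.listModelsFromLibrary();
        libraryModels.forEach(System.out::println);
    }
}
```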
List< OllamaChatMessageRole > io.github.ollama4j.OllamaAPI.listRoles | ( | ) |
Lists all available roles.
Definition at line 1562 of file OllamaAPI.java.
boolean io.github.ollama4j.OllamaAPI.ping | ( | ) |
Checks the reachability of the Ollama server.
Definition at line 157 of file OllamaAPI.java.
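A minimal reachability check (the host is a placeholder):

```java
import io.github.ollama4j.OllamaAPI;

public class PingExample {
    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // true if the server responded, false otherwise.
        boolean reachable = ollamaAPI.ping();
        System.out.println("Server reachable: " + reachable);
    }
}
```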
ModelsProcessResponse io.github.ollama4j.OllamaAPI.ps | ( | ) | throws IOException, InterruptedException, OllamaBaseException |
Provides a list of running models and details about each model currently loaded into memory.
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
OllamaBaseException | if the response indicates an error status |
Definition at line 191 of file OllamaAPI.java.
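A usage sketch; the host and the import path for ModelsProcessResponse are assumptions:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.ps.ModelsProcessResponse;

public class PsExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // Reports the models currently loaded into memory on the server.
        ModelsProcessResponse running = ollamaAPI.ps();
        System.out.println(running);
    }
}
```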
void io.github.ollama4j.OllamaAPI.pullModel | ( | LibraryModelTag | libraryModelTag | ) | throws OllamaBaseException, IOException, URISyntaxException, InterruptedException |
Pulls a model using the specified Ollama library model tag. The model is identified by a name and a tag, which are combined into a single identifier in the format "name:tag" to pull the corresponding model.
libraryModelTag | the LibraryModelTag object containing the name and tag of the model to be pulled. |
OllamaBaseException | if the response indicates an error status |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted |
URISyntaxException | if the URI for the request is malformed |
Definition at line 558 of file OllamaAPI.java.
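A sketch that resolves a library tag with findModelTagFromLibrary (documented above) and then pulls it. The host, the model name, the tag, and the import path for LibraryModelTag are assumptions:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.LibraryModelTag;

public class PullByLibraryTagExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // Look up a specific tag of a library model, then pull it.
        // The pull combines the name and tag into a "name:tag" identifier.
        LibraryModelTag tag = ollamaAPI.findModelTagFromLibrary("llama3", "latest");
        ollamaAPI.pullModel(tag);
    }
}
```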
void io.github.ollama4j.OllamaAPI.pullModel | ( | String | modelName | ) | throws OllamaBaseException, IOException, URISyntaxException, InterruptedException |
Pulls a model on the Ollama server from the list of available models.
If numberOfRetriesForModelPull is greater than 0, this method retries pulling the model up to the specified number of times when an OllamaBaseException occurs, using exponential backoff between retries (the delay doubles after each failed attempt, starting at 1 second). The backoff is applied only between retries, not after the final attempt.
modelName | the name of the model |
OllamaBaseException | if the response indicates an error status or all retries fail |
IOException | if an I/O error occurs during the HTTP request |
InterruptedException | if the operation is interrupted or the thread is interrupted during backoff |
URISyntaxException | if the URI for the request is malformed |
Definition at line 437 of file OllamaAPI.java.
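A minimal usage sketch; the host and model identifier are placeholders:

```java
import io.github.ollama4j.OllamaAPI;

public class PullModelExample {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        // "llama3:latest" is an example identifier in the "name:tag" format.
        // This blocks until the pull completes (or all retries fail).
        ollamaAPI.pullModel("llama3:latest");
    }
}
```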
void io.github.ollama4j.OllamaAPI.registerAnnotatedTools | ( | ) |
Registers tools based on the annotations found on the methods of the caller's class and its providers. This method scans the caller's class for the OllamaToolService annotation and recursively registers annotated tools from all the providers specified in the annotation.
IllegalStateException | if the caller's class is not annotated with OllamaToolService . |
RuntimeException | if any reflection-based instantiation or invocation fails. |
Definition at line 1464 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.registerAnnotatedTools | ( | Object | object | ) |
Registers tools based on the annotations found on the methods of the provided object. This method scans the methods of the given object and registers tools using the ToolSpec annotation and associated ToolProperty annotations. It constructs tool specifications and stores them in a tool registry.
object | the object whose methods are to be inspected for annotated tools. |
RuntimeException | if any reflection-based instantiation or invocation fails. |
Definition at line 1501 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.registerTool | ( | Tools.ToolSpecification | toolSpecification | ) |
Registers a single tool in the tool registry using the provided tool specification.
toolSpecification | the specification of the tool to register. It contains the tool's function name and other relevant information. |
Definition at line 1418 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.registerTools | ( | List< Tools.ToolSpecification > | toolSpecifications | ) |
Registers multiple tools in the tool registry using a list of tool specifications. Iterates over the list and adds each tool specification to the registry.
toolSpecifications | a list of tool specifications to register. Each specification contains information about a tool, such as its function name. |
Definition at line 1435 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.setBasicAuth | ( | String | username, String | password | ) |
Sets basic authentication for accessing an Ollama server that's behind a reverse proxy/gateway.
username | the username |
password | the password |
Definition at line 138 of file OllamaAPI.java.
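A usage sketch; the host and credentials are placeholders:

```java
import io.github.ollama4j.OllamaAPI;

public class BasicAuthExample {
    public static void main(String[] args) {
        // Host and credentials are placeholders for your proxied Ollama server.
        OllamaAPI ollamaAPI = new OllamaAPI("https://ollama.example.com");
        ollamaAPI.setBasicAuth("alice", "s3cret");
        // Subsequent calls carry the Basic auth header.
        System.out.println("Server reachable: " + ollamaAPI.ping());
    }
}
```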
void io.github.ollama4j.OllamaAPI.setBearerAuth | ( | String | bearerToken | ) |
Sets Bearer authentication for accessing an Ollama server that's behind a reverse proxy/gateway.
bearerToken | the Bearer authentication token to provide |
Definition at line 148 of file OllamaAPI.java.
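A usage sketch; the host and token are placeholders:

```java
import io.github.ollama4j.OllamaAPI;

public class BearerAuthExample {
    public static void main(String[] args) {
        OllamaAPI ollamaAPI = new OllamaAPI("https://ollama.example.com");
        // Token is a placeholder; subsequent calls carry the Bearer auth header.
        ollamaAPI.setBearerAuth("YOUR_BEARER_TOKEN");
        System.out.println("Server reachable: " + ollamaAPI.ping());
    }
}
```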