Ollama4j
A Java library (wrapper/binding) for Ollama server.
Public Member Functions

    OllamaAPI ()
    OllamaAPI (String host)
    void setBasicAuth (String username, String password)
    boolean ping ()
    ModelsProcessResponse ps () throws IOException, InterruptedException, OllamaBaseException
    List<Model> listModels () throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
    void pullModel (String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException
    ModelDetail getModelDetails (String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException
    void createModelWithFilePath (String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
    void createModelWithModelFileContents (String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
    void deleteModel (String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException
    List<Double> generateEmbeddings (String model, String prompt) throws IOException, InterruptedException, OllamaBaseException
    List<Double> generateEmbeddings (OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException
    OllamaResult generate (String model, String prompt, boolean raw, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
    OllamaResult generate (String model, String prompt, boolean raw, Options options) throws OllamaBaseException, IOException, InterruptedException
    OllamaToolsResult generateWithTools (String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException
    OllamaAsyncResultStreamer generateAsync (String model, String prompt, boolean raw)
    OllamaResult generateWithImageFiles (String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
    OllamaResult generateWithImageFiles (String model, String prompt, List<File> imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException
    OllamaResult generateWithImageURLs (String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
    OllamaResult generateWithImageURLs (String model, String prompt, List<String> imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException
    OllamaChatResult chat (String model, List<OllamaChatMessage> messages) throws OllamaBaseException, IOException, InterruptedException
    OllamaChatResult chat (OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException
    OllamaChatResult chat (OllamaChatRequest request, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException
    void registerTool (Tools.ToolSpecification toolSpecification)
The base Ollama API class.
Definition at line 40 of file OllamaAPI.java.
io.github.ollama4j.OllamaAPI.OllamaAPI ()

Instantiates the Ollama API with the default Ollama host: http://localhost:11434
Definition at line 63 of file OllamaAPI.java.
io.github.ollama4j.OllamaAPI.OllamaAPI (String host)

Instantiates the Ollama API with a specified Ollama host address.

Parameters
    host: the host address of the Ollama server
Definition at line 72 of file OllamaAPI.java.
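A minimal usage sketch of the two constructors and a reachability check with ping(); the non-default host URL below is just an example.

    import io.github.ollama4j.OllamaAPI;

    public class OllamaConnectExample {
        public static void main(String[] args) {
            // Default constructor targets http://localhost:11434
            OllamaAPI localClient = new OllamaAPI();

            // Point the client at an explicit host (example address)
            OllamaAPI remoteClient = new OllamaAPI("http://192.168.1.10:11434");

            // ping() returns true if the server is reachable
            System.out.println("Local server reachable: " + localClient.ping());
            System.out.println("Remote server reachable: " + remoteClient.ping());
        }
    }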
OllamaChatResult io.github.ollama4j.OllamaAPI.chat (OllamaChatRequest request) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model using an OllamaChatRequest. The request can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
    request: request object to be sent to the server

Returns
    OllamaChatResult

Exceptions
    OllamaBaseException: if a response code other than 200 is returned
    IOException: if the response stream cannot be read
    InterruptedException: if the server is not reachable or a network issue occurs
Definition at line 572 of file OllamaAPI.java.
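A sketch of a single-turn chat call. It assumes the builder exposes getInstance(String) and withMessage(role, content) and that the chat classes live under io.github.ollama4j.models.chat; only OllamaChatRequestBuilder itself is named above, and the model name is an example.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.chat.OllamaChatMessageRole;   // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatRequest;        // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatRequestBuilder; // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatResult;         // package path assumed

    public class ChatExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // Builder methods (getInstance/withMessage) are assumptions.
            OllamaChatRequest request = OllamaChatRequestBuilder
                    .getInstance("llama3")
                    .withMessage(OllamaChatMessageRole.USER, "Why is the sky blue?")
                    .build();

            OllamaChatResult chatResult = ollamaAPI.chat(request);
            System.out.println(chatResult);
        }
    }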
OllamaChatResult io.github.ollama4j.OllamaAPI.chat (OllamaChatRequest request, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model using an OllamaChatRequest. The request can be constructed using an OllamaChatRequestBuilder.

Hint: the OllamaChatRequestModel#getStream() property is not implemented.

Parameters
    request: request object to be sent to the server
    streamHandler: callback handler that handles the last message from the stream (caution: all previous messages from the stream will be concatenated)

Returns
    OllamaChatResult

Exceptions
    OllamaBaseException: if a response code other than 200 is returned
    IOException: if the response stream cannot be read
    InterruptedException: if the server is not reachable or a network issue occurs
Definition at line 588 of file OllamaAPI.java.
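A sketch of the streaming variant. It assumes OllamaStreamHandler can be supplied as a lambda over the partial response string; builder usage and package paths are assumed as in the previous example.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.chat.OllamaChatMessageRole;   // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatRequest;        // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatRequestBuilder; // package path assumed
    import io.github.ollama4j.models.chat.OllamaChatResult;         // package path assumed

    public class ChatStreamingExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            OllamaChatRequest request = OllamaChatRequestBuilder
                    .getInstance("llama3")
                    .withMessage(OllamaChatMessageRole.USER, "Tell me a short story.")
                    .build();

            // The handler receives the concatenated stream content each time a chunk arrives.
            OllamaChatResult chatResult =
                    ollamaAPI.chat(request, partial -> System.out.println(partial));

            System.out.println(chatResult);
        }
    }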
OllamaChatResult io.github.ollama4j.OllamaAPI.chat (String model, List<OllamaChatMessage> messages) throws OllamaBaseException, IOException, InterruptedException

Ask a question to a model based on a given message stack (i.e. a chat history). Creates a synchronous call to the 'api/chat' endpoint.

Parameters
    model: the Ollama model to ask the question to
    messages: chat history / message stack to send to the model

Returns
    OllamaChatResult containing the API response and the message history, including the newly acquired assistant response

Exceptions
    OllamaBaseException: if a response code other than 200 is returned
    IOException: if the response stream cannot be read
    InterruptedException: if the server is not reachable or a network issue occurs
Definition at line 556 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.createModelWithFilePath (String modelName, String modelFilePath) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. Read more about custom model file creation here.

Parameters
    modelName: the name of the custom model to be created
    modelFilePath: the path to a model file that exists on the Ollama server
Definition at line 252 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.createModelWithModelFileContents (String modelName, String modelFileContents) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Create a custom model from a model file. Read more about custom model file creation here.

Parameters
    modelName: the name of the custom model to be created
    modelFileContents: the contents of the model file
Definition at line 286 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.deleteModel (String modelName, boolean ignoreIfNotPresent) throws IOException, InterruptedException, OllamaBaseException, URISyntaxException

Delete a model from the Ollama server.

Parameters
    modelName: the name of the model to be deleted
    ignoreIfNotPresent: ignore errors if the specified model is not present on the Ollama server
Definition at line 317 of file OllamaAPI.java.
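A minimal sketch of deleting a model; the model name is an example.

    import io.github.ollama4j.OllamaAPI;

    public class DeleteModelExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // With ignoreIfNotPresent = true the call does not fail if the model is already gone.
            ollamaAPI.deleteModel("my-custom-model", true);
        }
    }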
OllamaResult io.github.ollama4j.OllamaAPI.generate (String model, String prompt, boolean raw, Options options) throws OllamaBaseException, IOException, InterruptedException

Generates a response using the specified AI model and prompt (in blocking mode).

Uses generate(String, String, boolean, Options, OllamaStreamHandler)

Parameters
    model: the name or identifier of the AI model to use for generating the response
    prompt: the input text or prompt to provide to the AI model
    raw: in some cases, you may wish to bypass the templating system and provide a full prompt; in this case, you can use the raw parameter to disable templating. Note that raw mode will not return a context.
    options: additional options or configurations to use when generating the response

Returns
    OllamaResult
Definition at line 410 of file OllamaAPI.java.
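A sketch of a blocking generate call. OptionsBuilder and OllamaResult.getResponse() are assumed helper/accessor names (only the Options and OllamaResult types are named above), and the model name is an example.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // package path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // package path assumed

    public class GenerateExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // raw = false keeps the model's prompt template in place
            OllamaResult result = ollamaAPI.generate(
                    "llama3",
                    "Why is the sky blue?",
                    false,
                    new OptionsBuilder().build());

            System.out.println(result.getResponse()); // assumed accessor
        }
    }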
OllamaResult io.github.ollama4j.OllamaAPI.generate (String model, String prompt, boolean raw, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException

Generate a response for a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
    model: the Ollama model to ask the question to
    prompt: the prompt/question text
    options: the Options object (more details on the options)
    streamHandler: optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Definition at line 391 of file OllamaAPI.java.
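A streaming sketch, again assuming OptionsBuilder, a getResponse() accessor, and a lambda-compatible OllamaStreamHandler.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // package path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // package path assumed

    public class GenerateStreamingExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // The handler fires every time a streamed response chunk is received.
            OllamaResult result = ollamaAPI.generate(
                    "llama3",
                    "Explain recursion in one paragraph.",
                    false,
                    new OptionsBuilder().build(),
                    partialResponse -> System.out.println(partialResponse));

            System.out.println("Final response: " + result.getResponse()); // assumed accessor
        }
    }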
OllamaAsyncResultStreamer io.github.ollama4j.OllamaAPI.generateAsync (String model, String prompt, boolean raw)

Generate a response for a question to a model running on the Ollama server and get a callback handle that can be used later to check the status and fetch the response from the model. This is an async/non-blocking call.

Parameters
    model: the Ollama model to ask the question to
    prompt: the prompt/question text
Definition at line 461 of file OllamaAPI.java.
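A minimal sketch; the returned OllamaAsyncResultStreamer is the handle used to poll for status and the response later (its accessors are not covered in this section, so none are shown), and the model name is an example.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaAsyncResultStreamer; // package path assumed

    public class GenerateAsyncExample {
        public static void main(String[] args) {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // Returns immediately; the streamer can be checked later for the result.
            OllamaAsyncResultStreamer streamer =
                    ollamaAPI.generateAsync("llama3", "Write a haiku about networks.", false);

            System.out.println("Request dispatched, handle: " + streamer);
        }
    }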
List<Double> io.github.ollama4j.OllamaAPI.generateEmbeddings (OllamaEmbeddingsRequestModel modelRequest) throws IOException, InterruptedException, OllamaBaseException

Generate embeddings using an OllamaEmbeddingsRequestModel.

Parameters
    modelRequest: request for the '/api/embeddings' endpoint
Definition at line 357 of file OllamaAPI.java.
List<Double> io.github.ollama4j.OllamaAPI.generateEmbeddings (String model, String prompt) throws IOException, InterruptedException, OllamaBaseException

Generate embeddings for a given text from a model.

Parameters
    model: name of the model to generate embeddings from
    prompt: text to generate embeddings for
Definition at line 346 of file OllamaAPI.java.
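A minimal sketch; the embedding model name is an example.

    import io.github.ollama4j.OllamaAPI;
    import java.util.List;

    public class EmbeddingsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            List<Double> embedding = ollamaAPI.generateEmbeddings(
                    "nomic-embed-text",
                    "The sky is blue because of Rayleigh scattering.");

            System.out.println("Embedding dimensions: " + embedding.size());
        }
    }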
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles (String model, String prompt, List<File> imageFiles, Options options) throws OllamaBaseException, IOException, InterruptedException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageFiles(String, String, List, Options, OllamaStreamHandler)
Definition at line 502 of file OllamaAPI.java.
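A sketch of the non-streaming image call; the multimodal model name and file path are examples, and OptionsBuilder/getResponse() are assumed as before.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaResult; // package path assumed
    import io.github.ollama4j.utils.OptionsBuilder;          // package path assumed
    import java.io.File;
    import java.util.List;

    public class ImageFilesExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            OllamaResult result = ollamaAPI.generateWithImageFiles(
                    "llava",                           // example multimodal model
                    "What is in this picture?",
                    List.of(new File("/tmp/cat.jpg")), // example image on the client machine
                    new OptionsBuilder().build());

            System.out.println(result.getResponse()); // assumed accessor
        }
    }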
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageFiles (String model, String prompt, List<File> imageFiles, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException

With one or more image files, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
    model: the Ollama model to ask the question to
    prompt: the prompt/question text
    imageFiles: the list of image files to use for the question
    options: the Options object (more details on the options)
    streamHandler: optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Definition at line 485 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs (String model, String prompt, List<String> imageURLs, Options options) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

Convenience method to call the Ollama API without streaming responses.

Uses generateWithImageURLs(String, String, List, Options, OllamaStreamHandler)
Definition at line 538 of file OllamaAPI.java.
OllamaResult io.github.ollama4j.OllamaAPI.generateWithImageURLs (String model, String prompt, List<String> imageURLs, Options options, OllamaStreamHandler streamHandler) throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

With one or more image URLs, ask a question to a model running on the Ollama server. This is a sync/blocking call.

Parameters
    model: the Ollama model to ask the question to
    prompt: the prompt/question text
    imageURLs: the list of image URLs to use for the question
    options: the Options object (more details on the options)
    streamHandler: optional callback consumer that will be applied every time a streamed response is received. If not set, the stream parameter of the request is set to false.
Definition at line 521 of file OllamaAPI.java.
OllamaToolsResult io.github.ollama4j.OllamaAPI.generateWithTools (String model, String prompt, Options options) throws OllamaBaseException, IOException, InterruptedException, ToolInvocationException

Generates a response using the specified AI model and prompt (in blocking mode), and then invokes a set of tools on the generated response.

Parameters
    model: the name or identifier of the AI model to use for generating the response
    prompt: the input text or prompt to provide to the AI model
    options: additional options or configurations to use when generating the response

Returns
    An OllamaToolsResult object containing the response from the AI model and the results of invoking the tools on that output

Exceptions
    OllamaBaseException: if there is an error related to the Ollama API or service
    IOException: if there is an error related to input/output operations
    InterruptedException: if the method is interrupted while waiting for the AI model to generate the response or for the tools to be invoked
Definition at line 429 of file OllamaAPI.java.
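A sketch of the tool-calling flow: a Tools.ToolSpecification is registered first, then generateWithTools() is called. Constructing the specification is out of scope here, so it is left as a clearly hypothetical helper; package paths, OptionsBuilder, and the model name are assumptions or examples.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.OllamaToolsResult; // package path assumed
    import io.github.ollama4j.tools.Tools;                       // package path assumed
    import io.github.ollama4j.utils.OptionsBuilder;              // package path assumed

    public class ToolsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            // Register the tool so the model can request it during generation.
            ollamaAPI.registerTool(buildWeatherToolSpecification());

            OllamaToolsResult result = ollamaAPI.generateWithTools(
                    "llama3",
                    "What is the weather in Bengaluru right now?",
                    new OptionsBuilder().build());

            System.out.println(result);
        }

        // Hypothetical helper: replace with a real Tools.ToolSpecification for your tool.
        private static Tools.ToolSpecification buildWeatherToolSpecification() {
            throw new UnsupportedOperationException("construct your tool specification here");
        }
    }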
ModelDetail io.github.ollama4j.OllamaAPI.getModelDetails (String modelName) throws IOException, OllamaBaseException, InterruptedException, URISyntaxException

Gets model details from the Ollama server.

Parameters
    modelName: the name of the model
Definition at line 224 of file OllamaAPI.java.
List<Model> io.github.ollama4j.OllamaAPI.listModels () throws OllamaBaseException, IOException, InterruptedException, URISyntaxException

List available models from the Ollama server.
Definition at line 157 of file OllamaAPI.java.
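A minimal sketch (the package path of Model is assumed).

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.Model; // package path assumed
    import java.util.List;

    public class ListModelsExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            List<Model> models = ollamaAPI.listModels();
            models.forEach(System.out::println); // printed via toString()
        }
    }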
boolean io.github.ollama4j.OllamaAPI.ping ()

API to check the reachability of the Ollama server.
Definition at line 95 of file OllamaAPI.java.
ModelsProcessResponse io.github.ollama4j.OllamaAPI.ps () throws IOException, InterruptedException, OllamaBaseException

Provides a list of running models and details about each model currently loaded into memory.
Definition at line 126 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.pullModel (String modelName) throws OllamaBaseException, IOException, URISyntaxException, InterruptedException

Pull a model onto the Ollama server from the list of available models.

Parameters
    modelName: the name of the model
Definition at line 186 of file OllamaAPI.java.
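A sketch that pulls an example model tag and then inspects it via getModelDetails(); the package path of ModelDetail is assumed, and large models may take a while to download.

    import io.github.ollama4j.OllamaAPI;
    import io.github.ollama4j.models.response.ModelDetail; // package path assumed

    public class PullModelExample {
        public static void main(String[] args) throws Exception {
            OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");

            ollamaAPI.pullModel("llama3"); // example model tag

            ModelDetail detail = ollamaAPI.getModelDetails("llama3");
            System.out.println(detail);    // printed via toString()
        }
    }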
void io.github.ollama4j.OllamaAPI.registerTool (Tools.ToolSpecification toolSpecification)
Definition at line 600 of file OllamaAPI.java.
void io.github.ollama4j.OllamaAPI.setBasicAuth (String username, String password)

Set basic authentication for accessing an Ollama server that's behind a reverse-proxy/gateway.

Parameters
    username: the username
    password: the password
Definition at line 86 of file OllamaAPI.java.
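A minimal sketch; the host and credentials are examples for a server sitting behind a basic-auth reverse proxy.

    import io.github.ollama4j.OllamaAPI;

    public class BasicAuthExample {
        public static void main(String[] args) {
            OllamaAPI ollamaAPI = new OllamaAPI("https://ollama.example.com");
            ollamaAPI.setBasicAuth("ollama-user", "s3cr3t");

            System.out.println("Server reachable: " + ollamaAPI.ping());
        }
    }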