public class BoxAIAgentTextGenBasicGen extends BoxJSONObject
| Constructor and Description |
|---|
| BoxAIAgentTextGenBasicGen(JsonObject jsonObject) <br> Constructs an AI agent from a JSON object representation. |
| BoxAIAgentTextGenBasicGen(String contentTemplate, BoxAIAgentEmbeddings embeddings, BoxAIAgentLLMEndpointParamsOpenAI llmEndpointParams, String model, int numTokensForCompletion, String promptTemplate, String systemMessage) <br> Constructs an AI agent with the specified settings. |
| Modifier and Type | Method and Description |
|---|---|
| String | getContentTemplate() <br> Gets how the content should be included in a request to the LLM. |
| BoxAIAgentEmbeddings | getEmbeddings() <br> Gets the embeddings used by the AI agent. |
| JsonObject | getJSONObject() |
| BoxAIAgentLLMEndpointParams | getLlmEndpointParams() <br> Gets the parameters for the LLM endpoint specific to OpenAI / Google models. |
| String | getModel() <br> Gets the model used by the AI agent for generating text. |
| int | getNumTokensForCompletion() <br> Gets the number of tokens for completion. |
| String | getPromptTemplate() <br> Gets the prompt template that contains contextual information of the request and the user prompt. |
| String | getSystemMessage() <br> Gets the system message that helps the LLM understand its role and what it is supposed to do. |
| void | setContentTemplate(String contentTemplate) <br> Sets how the content should be included in a request to the LLM. |
| void | setEmbeddings(BoxAIAgentEmbeddings embeddings) <br> Sets the embeddings used by the AI agent. |
| void | setLlmEndpointParams(BoxAIAgentLLMEndpointParamsOpenAI llmEndpointParams) <br> Sets the parameters for the LLM endpoint specific to OpenAI / Google models. |
| void | setModel(String model) <br> Sets the model used by the AI agent for generating text. |
| void | setNumTokensForCompletion(int numTokensForCompletion) <br> Sets the number of tokens for completion. |
| void | setPromptTemplate(String promptTemplate) <br> Sets the prompt template that contains contextual information of the request and the user prompt. |
| void | setSystemMessage(String systemMessage) <br> Sets the system message that helps the LLM understand its role and what it is supposed to do. |
Methods inherited from class BoxJSONObject: clearPendingChanges, getJson, getPendingChanges, getPendingChangesAsJsonObject, getPendingJSONObject

public BoxAIAgentTextGenBasicGen(String contentTemplate, BoxAIAgentEmbeddings embeddings, BoxAIAgentLLMEndpointParamsOpenAI llmEndpointParams, String model, int numTokensForCompletion, String promptTemplate, String systemMessage)

Constructs an AI agent with the specified settings.

Parameters:

- contentTemplate - How the content should be included in a request to the LLM. Input for {content} is optional, depending on the use.
- embeddings - The embeddings used by the AI agent.
- llmEndpointParams - The parameters for the LLM endpoint specific to OpenAI / Google models.
- model - The model used by the AI agent for generating text.
- numTokensForCompletion - The number of tokens for completion.
- promptTemplate - The prompt template that contains contextual information of the request and the user prompt. When passing prompt_template parameters, you must include inputs for {user_question} and {content}. Input for {current_date} is optional, depending on the use.
- systemMessage - The system message that helps the LLM understand its role and what it is supposed to do.
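For orientation, here is a minimal sketch of calling this constructor, assuming the SDK's usual com.box.sdk package. The model name and template strings are illustrative only, and the embeddings and llmEndpointParams arguments are shown as null placeholders because their construction is not covered on this page.

```java
import com.box.sdk.BoxAIAgentEmbeddings;
import com.box.sdk.BoxAIAgentLLMEndpointParamsOpenAI;
import com.box.sdk.BoxAIAgentTextGenBasicGen;

public class BasicGenConstructorSketch {
    public static void main(String[] args) {
        // Placeholders: building these objects is outside the scope of this page.
        BoxAIAgentEmbeddings embeddings = null;
        BoxAIAgentLLMEndpointParamsOpenAI llmEndpointParams = null;

        BoxAIAgentTextGenBasicGen basicGen = new BoxAIAgentTextGenBasicGen(
                "{content}",                        // contentTemplate: {content} input is optional
                embeddings,
                llmEndpointParams,
                "openai__gpt_4o_mini",              // model: illustrative model name
                8192,                               // numTokensForCompletion
                // promptTemplate must include {user_question} and {content}; {current_date} is optional
                "Today is {current_date}. Using {content}, answer: {user_question}",
                "You are a helpful assistant that summarizes Box documents."); // systemMessage

        // Read back a couple of the configured values via the getters documented below.
        System.out.println(basicGen.getModel());
        System.out.println(basicGen.getNumTokensForCompletion());
    }
}
```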
public BoxAIAgentTextGenBasicGen(JsonObject jsonObject)

Parameters:

- jsonObject - JSON object representing the AI agent.
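A sketch of building the agent from a JSON representation instead. JsonObject here is the minimal-json class the SDK uses; the snake_case field names are assumptions based on the Box AI agent JSON schema and are not documented on this page.

```java
import com.box.sdk.BoxAIAgentTextGenBasicGen;
import com.eclipsesource.json.JsonObject;

public class BasicGenFromJsonSketch {
    public static void main(String[] args) {
        // Field names are assumed from the Box AI agent JSON schema.
        JsonObject json = new JsonObject()
                .add("model", "openai__gpt_4o_mini")
                .add("num_tokens_for_completion", 8192)
                .add("prompt_template", "Using {content}, answer: {user_question}")
                .add("system_message", "You are a helpful assistant.")
                .add("content_template", "{content}");

        BoxAIAgentTextGenBasicGen basicGen = new BoxAIAgentTextGenBasicGen(json);
        System.out.println(basicGen.getPromptTemplate());
    }
}
```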
public String getContentTemplate()

public void setContentTemplate(String contentTemplate)

Parameters:

- contentTemplate - How the content should be included in a request to the LLM. Input for {content} is optional, depending on the use.

public BoxAIAgentEmbeddings getEmbeddings()

public void setEmbeddings(BoxAIAgentEmbeddings embeddings)

Parameters:

- embeddings - The embeddings used by the AI agent.

public BoxAIAgentLLMEndpointParams getLlmEndpointParams()

public void setLlmEndpointParams(BoxAIAgentLLMEndpointParamsOpenAI llmEndpointParams)

Parameters:

- llmEndpointParams - The parameters for the LLM endpoint specific to OpenAI / Google models.

public String getModel()

public void setModel(String model)

Parameters:

- model - The model used by the AI agent for generating text.

public int getNumTokensForCompletion()

public void setNumTokensForCompletion(int numTokensForCompletion)

Parameters:

- numTokensForCompletion - The number of tokens for completion.

public String getPromptTemplate()

public void setPromptTemplate(String promptTemplate)

Parameters:

- promptTemplate - The prompt template that contains contextual information of the request and the user prompt.

public String getSystemMessage()

public void setSystemMessage(String systemMessage)

Parameters:

- systemMessage - The system message that helps the LLM understand its role and what it is supposed to do.

public JsonObject getJSONObject()
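Finally, a short sketch of adjusting an existing configuration through the setters. The starting JSON, field names, and model names are illustrative assumptions, as in the previous examples.

```java
import com.box.sdk.BoxAIAgentTextGenBasicGen;
import com.eclipsesource.json.JsonObject;

public class BasicGenTuningSketch {
    public static void main(String[] args) {
        // Start from a minimal JSON representation (field name assumed, as above).
        BoxAIAgentTextGenBasicGen basicGen = new BoxAIAgentTextGenBasicGen(
                new JsonObject().add("model", "openai__gpt_4o_mini"));

        // Override individual settings with the setters documented above.
        basicGen.setModel("openai__gpt_4o");            // illustrative model name
        basicGen.setNumTokensForCompletion(8192);
        basicGen.setSystemMessage("Answer concisely, citing the source document.");
        basicGen.setPromptTemplate("Using {content}, answer: {user_question}");
        basicGen.setContentTemplate("{content}");

        System.out.println(basicGen.getModel() + " / " + basicGen.getNumTokensForCompletion());
    }
}
```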