Access TaskingAI Agents through OpenAI-Compatible APIs

May 13, 2024

TaskingAI is a developer-friendly cloud platform for building and running LLM agents for AI-native applications. When you use TaskingAI's API to interact with models or agents, you get responses in a consistent format defined by TaskingAI, regardless of which model provider sits behind them.

TaskingAI also supports responses in OpenAI's standard format, making it easy for users to integrate their services with a broader array of existing frameworks. If your previous project was built using OpenAI's standards, you can quickly switch to TaskingAI with only a few lines of code changes.

Here's a step-by-step tutorial for accessing TaskingAI services using OpenAI-compatible APIs for both models and assistants.

Accessing a TaskingAI Model via OpenAI-Compatible API

TaskingAI integrates not only OpenAI's models but also a variety of models from other providers, such as Google's Gemini and Anthropic's Claude, along with open-source models like Mistral and Llama. This wide integration lets you choose the model that best fits your needs directly from TaskingAI's platform.

Normal API calls to TaskingAI's service use the URL "https://api.tasking.ai". However, when OpenAI-standard responses are expected, the service URL should be "https://oapi.tasking.ai". In code, simply point the OpenAI client's base URL at TaskingAI's OpenAI-compatible endpoint and set the model to your TaskingAI model ID.

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_TASKINGAI_API_KEY",
    base_url="https://oapi.tasking.ai/v1",
)

response = client.chat.completions.create(
    model="YOUR_TASKINGAI_MODEL_ID",
    messages=[
        {"role": "user", "content": "Hello, how are you?"},
    ],
)

print(response)

This code sends a simple greeting to the model and prints the response, demonstrating how effortlessly you can interact with different AI models integrated into TaskingAI.
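Because the response follows OpenAI's chat completion schema, you can handle it exactly as you would a response from OpenAI itself. The snippet below is a minimal sketch of two common patterns: reading the reply text from the first choice, and streaming the reply chunk by chunk (assuming streamed chat completions are enabled for your model).

# Read the reply text from the standard OpenAI response structure.
print(response.choices[0].message.content)

# Stream the reply instead, assuming your TaskingAI model supports
# streamed chat completions through the OpenAI-compatible endpoint.
stream = client.chat.completions.create(
    model="YOUR_TASKINGAI_MODEL_ID",
    messages=[
        {"role": "user", "content": "Hello, how are you?"},
    ],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)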

Accessing a TaskingAI Agent via OpenAI-Compatible API

TaskingAI's OpenAI-compatible API also supports interacting with your TaskingAI assistants. The request schema is the same as for models, but an assistant can use all of its available resources (retrievals, plugins, and actions) before generating a response, whenever it judges them necessary.

For example, you can create an assistant through the cloud UI console or the client SDK and grant it access to your retrieval collections as well as tools.

import taskingai

# Authenticate the SDK with your TaskingAI API key before creating resources.
taskingai.init(api_key="YOUR_TASKINGAI_API_KEY")

assistant = taskingai.create_assistant(
    model_id="YOUR_MODEL_ID",  # the chat completion model the assistant is built on
    memory={"type": "AssistantNaiveMemory"},
    tools=[
        {"type": "action", "id": "YOUR_ACTION_ID"},
        {"type": "plugin", "id": "YOUR_PLUGIN_ID"},
    ],
    retrievals=[
        {"type": "collection", "id": "YOUR_COLLECTION_ID"},
    ],
)

Using the assistant is simply a matter of sending queries and receiving informed responses: the assistant draws on its integrated tools and knowledge collections to produce better answers.

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_TASKINGAI_API_KEY",
    base_url="https://oapi.tasking.ai/v1",
)

response = client.chat.completions.create(
    model="YOUR_TASKINGAI_ASSISTANT_ID",
    messages=[
        {"role": "user", "content": "Please help me get the latest BTC price in EUR"},
    ],
)

print(response)

Note also that invoking an assistant through the chat completion endpoint is stateless: the chat history is not preserved between calls.
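If you want a multi-turn conversation over this stateless interface, one option is to keep the history on the client side and resend it with every request, just as with OpenAI's standard chat completion API. The sketch below illustrates the idea; the follow-up question is only an example.

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_TASKINGAI_API_KEY",
    base_url="https://oapi.tasking.ai/v1",
)

# Keep the running conversation in a local list and resend it each turn,
# since the server does not store chat history between calls.
history = [
    {"role": "user", "content": "Please help me get the latest BTC price in EUR"},
]

first = client.chat.completions.create(
    model="YOUR_TASKINGAI_ASSISTANT_ID",
    messages=history,
)
history.append({"role": "assistant", "content": first.choices[0].message.content})

# Follow-up question (illustrative): the prior turns travel with the request.
history.append({"role": "user", "content": "And in USD?"})
second = client.chat.completions.create(
    model="YOUR_TASKINGAI_ASSISTANT_ID",
    messages=history,
)
print(second.choices[0].message.content)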

For detailed information about the OpenAI-Compatible API, please refer to the documentation.

These examples showcase how TaskingAI provides a versatile platform for integrating diverse AI models and managing complex, data-driven interactions through its assistants. If you're looking to leverage cutting-edge AI capabilities for your projects, TaskingAI is an excellent place to start.
