Getting Started with the Ragwalla Assistants API
The Ragwalla Assistants API is designed to be wire-compatible with OpenAI's Assistants API, giving developers seamless integration through familiar SDKs and interfaces. This guide helps you quickly provision your own unique endpoint and start building intelligent applications.
Step 1: Provision Your Unique Endpoint
When you sign up with Ragwalla, you'll provision an endpoint that's unique to your account. Your endpoint URL will follow this format:
https://example.ai.ragwalla.com/v1
Replace example with your custom subdomain provided by Ragwalla.
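For illustration, a small hypothetical helper (the name buildBaseURL is ours, not part of any SDK) shows how the endpoint URL is assembled from a subdomain such as acme:

```javascript
// Build the Ragwalla base URL from your account's subdomain.
// "acme" is a placeholder; substitute the subdomain Ragwalla assigned to you.
const buildBaseURL = (subdomain) => `https://${subdomain}.ai.ragwalla.com/v1`;

console.log(buildBaseURL("acme")); // https://acme.ai.ragwalla.com/v1
```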
Step 2: Set Up Your OpenAI Client
Use the provided unique endpoint as your baseURL when instantiating the OpenAI class. This allows you to leverage existing OpenAI SDKs seamlessly. Here's a simple example in JavaScript:
import OpenAI from 'openai';

export const createOpenAIClient = (env) => {
  const baseURL = 'https://example.ai.ragwalla.com/v1'; // Your unique Ragwalla endpoint
  const apiKey = env.RAGWALLA_API_KEY; // Your Ragwalla-provided API key
  return new OpenAI({
    apiKey,
    baseURL
  });
};
Step 3: Use Standard Assistants API Methods
Once your client is configured, you can use standard OpenAI Assistants API methods directly:
Creating Threads and Adding Messages
const openai = createOpenAIClient(env);

async function main() {
  // Create a thread
  const thread = await openai.beta.threads.create();
  console.log(thread);

  // Add a message to the thread
  await openai.beta.threads.messages.create(thread.id, {
    role: "user",
    content: "Hello, Ragwalla Assistant!"
  });
}
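Once messages are on a thread you can read them back with openai.beta.threads.messages.list(thread.id). Each message carries a list of content parts; a minimal sketch (the helper name extractTexts is ours) that pulls out just the text parts from such a response might look like:

```javascript
// Extract plain text from an Assistants-API message list response.
// Messages carry typed content parts; we keep only the "text" parts here.
function extractTexts(messageList) {
  return messageList.data.flatMap((message) =>
    message.content
      .filter((part) => part.type === "text")
      .map((part) => part.text.value)
  );
}

// Usage against the API (requires the configured client from Step 2):
// const messages = await openai.beta.threads.messages.list(thread.id);
// console.log(extractTexts(messages));
```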
Running Threads
To process messages and generate responses:
async function main() {
  const run = await openai.beta.threads.runs.create(
    "thread_abc123",
    { assistant_id: "asst_abc123" }
  );
  console.log(run);
}
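Runs execute asynchronously, so after creating one you typically poll until it reaches a terminal state. A minimal polling sketch follows; the helper name waitForRun is ours, while the retrieve method and status values follow the OpenAI Assistants API conventions:

```javascript
// Statuses at which a run will not change further.
const TERMINAL_STATUSES = new Set(["completed", "failed", "cancelled", "expired"]);

// Poll a run until it reaches a terminal state.
// `client` is any object exposing beta.threads.runs.retrieve(threadId, runId).
async function waitForRun(client, threadId, runId, intervalMs = 1000) {
  for (;;) {
    const run = await client.beta.threads.runs.retrieve(threadId, runId);
    if (TERMINAL_STATUSES.has(run.status)) return run;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

// Usage: const run = await waitForRun(openai, "thread_abc123", "run_abc123");
```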
Step 4: Streaming, Function Calls, and Annotations
Streaming responses, function calling, and annotations work out of the box with existing OpenAI SDKs:
async function main() {
  const stream = await openai.beta.threads.runs.create(
    "thread_123",
    { assistant_id: "asst_123", stream: true }
  );
  for await (const event of stream) {
    console.log(event);
  }
}
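When an assistant invokes one of your functions, the run pauses in a requires_action state and you submit results back via submitToolOutputs. A minimal sketch of that handoff (the helper name handleRequiredAction and the tools registry are ours; the run fields and SDK method follow the OpenAI Assistants API):

```javascript
// Resolve a run that paused for tool (function) calls.
// `tools` maps function names to local implementations, e.g. { add: ({a, b}) => a + b }.
async function handleRequiredAction(client, run, tools) {
  const calls = run.required_action.submit_tool_outputs.tool_calls;
  const tool_outputs = calls.map((call) => ({
    tool_call_id: call.id,
    // Each function receives its parsed arguments; outputs must be strings.
    output: JSON.stringify(tools[call.function.name](JSON.parse(call.function.arguments)))
  }));
  return client.beta.threads.runs.submitToolOutputs(run.thread_id, run.id, { tool_outputs });
}
```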
Step 5: File Management and Vector Stores
You can manage files, add them to vector stores, and associate them with Assistants for enhanced context retrieval:
import fs from 'fs';

// Upload a file
async function main() {
  const file = await openai.files.create({
    file: fs.createReadStream("myfile.pdf"),
    purpose: "assistants",
  });
  console.log(file);
}
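To make an uploaded file searchable by an assistant, you typically create a vector store, add the file to it, and attach the store to the assistant's file_search tool resources. A sketch of that flow (the helper name attachFileToAssistant is ours; the method names follow the OpenAI Node SDK's beta Assistants surface):

```javascript
// Create a vector store, add an uploaded file to it, and wire the store
// to an assistant so its file_search tool can retrieve from it.
async function attachFileToAssistant(client, assistantId, fileId, storeName) {
  const store = await client.beta.vectorStores.create({ name: storeName });
  await client.beta.vectorStores.files.create(store.id, { file_id: fileId });
  return client.beta.assistants.update(assistantId, {
    tool_resources: { file_search: { vector_store_ids: [store.id] } },
  });
}

// Usage: await attachFileToAssistant(openai, "asst_abc123", file.id, "docs");
```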
Conclusion
With Ragwalla's Assistants API, you can quickly leverage powerful assistant capabilities, including thread management, real-time streaming, function calls, annotations, and robust file management, all within a familiar OpenAI-compatible interface.
Start integrating today to rapidly build intelligent and responsive applications.