Streaming

Lucid supports Server-Sent Events (SSE) streaming for real-time, token-by-token responses. The stream format is identical to the OpenAI streaming API, so any OpenAI-compatible client works out of the box.

TypeScript SDK

import { LucidAI } from "raijin-labs-lucid-ai";

const lucid = new LucidAI({ bearerAuth: process.env.LUCID_API_KEY });

const stream = await lucid.chat.completions({
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Write a short poem about AI" }
  ],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) process.stdout.write(content);
}

// Receipt is available after stream completes
console.log("\nReceipt:", stream.receiptId);
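If you need the complete text as well as live output, you can fold the deltas into a buffer as they arrive. A minimal sketch, assuming each chunk follows the OpenAI `choices[0].delta.content` shape; the `collectStream` helper name is illustrative, not part of the SDK:

```typescript
// Chunk shape assumed to match the OpenAI streaming format.
interface StreamChunk {
  choices: { delta?: { content?: string } }[];
}

// Accumulate streamed delta content into one string,
// optionally echoing each token as it arrives.
async function collectStream(
  stream: AsyncIterable<StreamChunk>,
  onToken?: (token: string) => void
): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content;
    if (content) {
      full += content;
      onToken?.(content); // e.g. process.stdout.write(content)
    }
  }
  return full;
}
```

Usage: `const text = await collectStream(stream, (t) => process.stdout.write(t));`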

OpenAI SDK (Drop-in)

Since TrustGate is OpenAI-compatible, the OpenAI SDK works directly:

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.LUCID_API_KEY,
  baseURL: "https://api.lucid.foundation/v1",
});

const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}

Python (httpx)

import json
import os

import httpx

LUCID_API_KEY = os.environ["LUCID_API_KEY"]

with httpx.stream(
    "POST",
    "https://api.lucid.foundation/v1/chat/completions",
    headers={"Authorization": f"Bearer {LUCID_API_KEY}"},
    json={
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": True,
    },
) as response:
    for line in response.iter_lines():
        if line.startswith("data: ") and line != "data: [DONE]":
            chunk = json.loads(line[6:])
            content = chunk["choices"][0]["delta"].get("content", "")
            print(content, end="", flush=True)

curl

curl -N "https://api.lucid.foundation/v1/chat/completions" \
  -H "Authorization: Bearer $LUCID_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
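Whether you read the stream with curl, httpx, or raw fetch, every event is a line of the form `data: {...json...}`, terminated by the `data: [DONE]` sentinel. A minimal sketch of extracting content deltas from raw lines; `deltaFromSSELine` is an illustrative helper, not an SDK function:

```typescript
// Extract the delta content (if any) from one raw SSE line.
// Returns null for non-data lines, the [DONE] sentinel, and empty deltas.
function deltaFromSSELine(line: string): string | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return null;
  const chunk = JSON.parse(payload);
  return chunk.choices?.[0]?.delta?.content ?? null;
}
```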

Next.js Integration

// app/api/chat/route.ts
import { LucidAI } from "raijin-labs-lucid-ai";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const lucid = new LucidAI({ bearerAuth: process.env.LUCID_API_KEY });

  const stream = await lucid.chat.completions({
    model: "gpt-4o",
    messages,
    stream: true,
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}

Vercel AI SDK

import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const lucid = createOpenAI({
  apiKey: process.env.LUCID_API_KEY,
  baseURL: "https://api.lucid.foundation/v1",
});

// inside a route handler, e.g. app/api/chat/route.ts
const result = streamText({
  model: lucid("gpt-4o"),
  messages: [{ role: "user", content: "Hello!" }],
});

return result.toDataStreamResponse();

Receipts with Streaming

When streaming, the receipt is generated after the full response completes. The receipt ID is delivered in its own SSE event, just before the [DONE] sentinel:

data: {"id":"chatcmpl-...","choices":[{"delta":{"content":""},"finish_reason":"stop"}]}
data: {"receipt_id":"rec_abc123"}
data: [DONE]
You can then verify the receipt:

const receipt = await lucid.receipts.get("rec_abc123");
console.log("Signature valid:", receipt.signatureValid);
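If you consume the raw SSE stream yourself, the receipt ID can be picked out of the final events in the same pass. A hedged sketch, assuming the receipt arrives as a standalone `data:` event with a `receipt_id` field as shown above; `receiptIdFromSSELine` is an illustrative helper, not an SDK function:

```typescript
// Return the receipt ID if this SSE line carries one, otherwise null.
function receiptIdFromSSELine(line: string): string | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return null;
  const event = JSON.parse(payload);
  return typeof event.receipt_id === "string" ? event.receipt_id : null;
}
```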