# quick start
## installation
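the install command isn't shown here; assuming the package is published to npm under the name used in the imports below:

```shell
npm install @threaded/ai
```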
## api keys
environment variables are detected automatically
| provider | env var |
|---|---|
| openai | OPENAI_API_KEY |
| anthropic | ANTHROPIC_API_KEY |
| google | GEMINI_API_KEY or GOOGLE_AI_API_KEY |
| xai | XAI_API_KEY |
or set them programmatically:

```ts
import { setKeys } from "@threaded/ai";

setKeys({
  openai: "sk-...",
  anthropic: "sk-ant-...",
  google: "...",
  xai: "xai-...",
});
```
or override keys per call, which is useful for multi-tenant apps or when different requests need different keys
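the per-call override API isn't shown on this page; as a rough sketch of the multi-tenant pattern only (the `modelForTenant` factory and its option shape are hypothetical, not part of @threaded/ai):

```ts
// hypothetical sketch of per-request key selection for a multi-tenant app;
// nothing here is @threaded/ai API, it only illustrates the pattern
type TenantKeys = { openai?: string; anthropic?: string };

function modelForTenant(keys: TenantKeys) {
  // each tenant's requests are made with that tenant's own key
  return (prompt: string) => ({
    prompt,
    keyUsed: keys.openai ?? keys.anthropic ?? "none",
  });
}

const tenantA = modelForTenant({ openai: "sk-tenant-a" });
const tenantB = modelForTenant({ openai: "sk-tenant-b" });
```

each tenant gets its own callable, so requests never share credentials.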
## basic usage
simple model call without conversation history
```ts
import { model } from "@threaded/ai";

const result = await model()("what is 2+2?");
console.log(result.lastResponse.content);
```
### different providers
```ts
import { model } from "@threaded/ai";

const openai = model({ model: "openai/gpt-4.1-mini" });
const anthropic = model({ model: "anthropic/claude-sonnet-4-20250514" });
const google = model({ model: "google/gemini-2.5-flash" });
const xai = model({ model: "xai/grok-3-mini" });

const result = await xai("what is 2+2?");
console.log(result.lastResponse.content);
```
this passes the message directly to the model and returns the conversation context
## with threads
persist conversation history across messages
```ts
import { getOrCreateThread, model } from "@threaded/ai";

const thread = getOrCreateThread("user-123");

await thread.message("hello, i'm building a todo app");
await thread.message("what should i call it?");
```
the thread automatically manages history between calls
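conceptually, a thread maps an id to an accumulating message list, so every call sends the full history. a minimal sketch of that idea (illustrative only, not @threaded/ai internals):

```ts
// illustrative sketch: id-keyed conversation storage, not library internals
type Msg = { role: "user" | "assistant"; content: string };

const threads = new Map<string, Msg[]>();

function getOrCreateHistory(id: string): Msg[] {
  let history = threads.get(id);
  if (!history) {
    history = [];
    threads.set(id, history);
  }
  return history;
}

function record(id: string, msg: Msg): Msg[] {
  const history = getOrCreateHistory(id);
  history.push(msg);
  return history; // the full history is what a thread sends to the model
}
```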
## with tools
give the model functions to execute
```ts
import { compose, model, scope } from "@threaded/ai";

const weather = {
  name: "get_weather",
  description: "Get weather for a city",
  schema: {
    city: { type: "string", description: "City name" },
  },
  execute: async ({ city }) => {
    return { city, temp: "72°F", condition: "sunny" };
  },
};

const workflow = compose(
  scope(
    {
      tools: [weather],
    },
    model(),
  ),
);

const result = await workflow("what's the weather in san francisco?");
```
the model calls the tool automatically and uses the result in its response
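roughly what happens when the model requests a tool: the runtime looks the tool up by name, runs its `execute`, and feeds the result back. a sketch of that general dispatch loop (not @threaded/ai's actual implementation):

```ts
// illustrative dispatch for a model-requested tool call;
// the Tool shape mirrors the weather tool above
type Tool = {
  name: string;
  execute: (args: Record<string, unknown>) => Promise<unknown>;
};

async function dispatchToolCall(
  tools: Tool[],
  call: { name: string; args: Record<string, unknown> },
): Promise<unknown> {
  const tool = tools.find((t) => t.name === call.name);
  if (!tool) throw new Error(`unknown tool: ${call.name}`);
  // the returned value is sent back to the model, which uses it in its reply
  return tool.execute(call.args);
}
```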
## streaming
stream content and tool execution updates
```ts
import { compose, model, scope } from "@threaded/ai";

// reuses the weather tool from the previous example
const workflow = compose(
  scope(
    {
      tools: [weather],
      stream: (event) => {
        if (event.type === "content") {
          process.stdout.write(event.content);
        }
        if (event.type === "tool_executing") {
          console.log(`calling ${event.call.function.name}...`);
        }
      },
    },
    model(),
  ),
);

await workflow("what's the weather?");
```
get real-time updates as the model generates responses
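the stream callback receives discriminated events; a sketch of how the two event shapes used above can be handled exhaustively (the event union is inferred from the fields shown, not a published type):

```ts
// event shapes inferred from the fields used above; illustrative only
type StreamEvent =
  | { type: "content"; content: string }
  | { type: "tool_executing"; call: { function: { name: string } } };

function describe(event: StreamEvent): string {
  switch (event.type) {
    case "content":
      return event.content; // append chunks as they arrive
    case "tool_executing":
      return `calling ${event.call.function.name}...`;
  }
}
```

narrowing on `event.type` this way lets the compiler check that every event kind is handled.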
next: learn about threads