AI
❓ How it works
This brick is heavily opinionated. It combines Langfuse for prompt management (storage, versioning, combining, etc.) and the Vercel AI SDK for LLM calls.
Prompt management - Langfuse
Langfuse uses OpenTelemetry to trace the calls to the LLMs (it plays well with the Vercel AI SDK). You can see how OTel is set up in the AI service file.
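As a rough illustration, the wiring typically looks like the sketch below: an OpenTelemetry `NodeSDK` whose trace exporter is the `LangfuseExporter` from `langfuse-vercel`. The key names and base URL are assumptions (in particular, `LANGFUSE_API_KEY` is assumed to hold the Langfuse public key); check the actual AI service file for the boilerplate's exact setup.

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { LangfuseExporter } from "langfuse-vercel";

// Ship every trace emitted by the Vercel AI SDK to Langfuse.
const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter({
    publicKey: process.env.LANGFUSE_API_KEY, // assumed to be the Langfuse public key
    secretKey: process.env.LANGFUSE_SECRET_KEY,
    baseUrl: "https://cloud.langfuse.com", // point this at your instance if self-hosting
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
```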
Langfuse can be self-hosted and is entirely open source. See their docs for more advanced usage.
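For the prompt-management side, fetching and compiling a stored prompt with the Langfuse client looks roughly like this. The prompt name `movie-critic` and its `topic` variable are made-up examples, not something shipped with the brick:

```ts
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_API_KEY, // assumed to be the public key
  secretKey: process.env.LANGFUSE_SECRET_KEY,
});

// Fetch the latest production version of a prompt stored in Langfuse,
// then fill in its template variables. "movie-critic" / "topic" are hypothetical.
const prompt = await langfuse.getPrompt("movie-critic");
const compiled = prompt.compile({ topic: "space operas" });
```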
LLM calls - Vercel AI SDK
The Vercel AI SDK lets us call LLMs easily and hooks into Langfuse for tracing. It also makes switching from one model to another straightforward.
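A minimal sketch of such a call, assuming an OpenAI key is configured; the model id, prompt, and `functionId` are placeholders. The `experimental_telemetry` option is what forwards the trace to Langfuse through the exporter shown earlier:

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Illustrative call: the model id and prompt are placeholders.
const { text } = await generateText({
  model: openai("gpt-4o-mini"), // swapping models means changing just this line
  prompt: "Summarize the plot of Dune in two sentences.",
  experimental_telemetry: {
    isEnabled: true, // emits an OTel span that the LangfuseExporter picks up
    functionId: "dune-summary", // hypothetical name for grouping traces
  },
});

console.log(text);
```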
Orchestration - TBD
The boilerplate does not use any orchestration tool. You can bring your own, like Mastra or XState.
📝 How to use
- Create the Langfuse project in your Langfuse dashboard.
- Set up the necessary env variables (`LANGFUSE_API_KEY`, `LANGFUSE_SECRET_KEY`, `OPENAI_API_KEY`, etc.) in the `api/.env` file; see the sketch after this list.
- You are good to go!
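For reference, the relevant `api/.env` entries look like this; all values are placeholders:

```
LANGFUSE_API_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
OPENAI_API_KEY=sk-...
```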
🧹 How to remove
- Delete the `ai` module from the `api/src/modules` folder.
- Remove the following dependencies from the `api/package.json` file:
  - `@ai-sdk/openai`
  - `@opentelemetry/api`
  - `@opentelemetry/auto-instrumentations-node`
  - `@opentelemetry/sdk-node`
  - `ai`
  - `langfuse`
  - `langfuse-vercel`
- Remove the following env variables from the `api/.env` file:
  - `LANGFUSE_API_KEY`
  - `LANGFUSE_SECRET_KEY`
  - `OPENAI_API_KEY`