AI Settings

API Key

Web Search (EXA)

Supabase Vector Store

Environment Variables

Add these to your .env.local file:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GROQ_API_KEY=your_groq_api_key
GOOGLE_API_KEY=your_google_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
FIREWORKS_API_KEY=your_fireworks_api_key
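These keys are picked up from process.env by the matching client libraries, so no extra wiring is normally needed. As a rough sketch of how the provider choice in the settings panel could map to a chat model (the chooseChatModel helper and the exact constructor options are assumptions, not template code; recent @langchain/* integration packages are assumed):

```typescript
// Hypothetical helper: maps a provider name from the settings panel to a
// LangChain.js chat model. Each class reads its API key from process.env
// (OPENAI_API_KEY, ANTHROPIC_API_KEY, GROQ_API_KEY, GOOGLE_API_KEY).
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";
import { ChatGroq } from "@langchain/groq";
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";

export function chooseChatModel(provider: string, model: string) {
  switch (provider) {
    case "openai":
      return new ChatOpenAI({ model });
    case "anthropic":
      return new ChatAnthropic({ model });
    case "groq":
      return new ChatGroq({ model });
    case "google":
      return new ChatGoogleGenerativeAI({ model });
    // DeepSeek and Fireworks follow the same pattern through their own
    // integration packages and the keys listed above.
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}
```

A route handler could then call chooseChatModel("openai", "gpt-4o-mini") and pass the result into whatever chain or agent it builds.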

Current Selection

Provider: OpenAI
Model: GPT-4o Mini
Max Output Tokens: 16,384
Context Window: 128,000
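Given the selection above, a chat route that streams from GPT-4o Mini through the Vercel AI SDK could look roughly like this (a minimal sketch assuming AI SDK v4-style APIs; the route path and request shape are assumptions):

```typescript
// app/api/chat/route.ts (illustrative path) – streams responses from the
// currently selected provider/model with the configured max token setting.
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"), // Provider: OpenAI, Model: GPT-4o Mini
    maxTokens: 16384,             // Max output tokens from the settings panel
    messages,
  });

  return result.toDataStreamResponse();
}
```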
β–² + πŸ¦œπŸ”—
  • 🀝 This template showcases a LangChain.js retrieval agent and the Vercel AI SDK in a Next.js project.
  • πŸ› οΈ The agent has access to a vector store retriever as a tool as well as a memory. It's particularly well suited to meta-questions about the current conversation.
  • πŸ’» You can find the prompt and model logic for this use case in app/api/chat/retrieval_agents/route.ts; a simplified sketch appears after this list.
  • πŸ€– By default, the agent is pretending to be a robot, but you can change the prompt to whatever you want!
  • 🎨 The main frontend logic is found in app/retrieval_agents/page.tsx.
  • πŸ”± Before running this example, you'll first need to set up a Supabase (or other) vector store. See the README for more details.
  • πŸ‘‡ Upload some text, then try asking e.g. "What are some ways of doing retrieval in LangChain?" below!
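For orientation, here is a simplified sketch of the kind of retrieval agent that sits behind app/api/chat/retrieval_agents/route.ts. It is not the template's exact code: the documents table and match_documents query follow the Supabase setup described in the README, SUPABASE_URL and SUPABASE_PRIVATE_KEY are assumed variable names, and the prompt and tool name are placeholders.

```typescript
import { createClient } from "@supabase/supabase-js";
import { SupabaseVectorStore } from "@langchain/community/vectorstores/supabase";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { createRetrieverTool } from "langchain/tools/retriever";
import { AgentExecutor, createToolCallingAgent } from "langchain/agents";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Connect to the Supabase vector store set up per the README.
const client = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_PRIVATE_KEY!,
);
const vectorStore = new SupabaseVectorStore(new OpenAIEmbeddings(), {
  client,
  tableName: "documents",
  queryName: "match_documents",
});

// Expose the retriever to the agent as a tool.
const retrieverTool = createRetrieverTool(vectorStore.asRetriever(), {
  name: "search_uploaded_docs",
  description: "Searches the documents the user has uploaded.",
});

// Placeholder prompt; the template's robot persona lives here.
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a robot. Use the search tool when it is helpful."],
  ["placeholder", "{chat_history}"],
  ["human", "{input}"],
  ["placeholder", "{agent_scratchpad}"],
]);

const agent = createToolCallingAgent({
  llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
  tools: [retrieverTool],
  prompt,
});
const executor = new AgentExecutor({ agent, tools: [retrieverTool] });

// Example invocation; in the template, something like this runs inside the
// POST handler, with the chat history taken from the incoming request.
const result = await executor.invoke({
  input: "What are some ways of doing retrieval in LangChain?",
  chat_history: [],
});
console.log(result.output);
```

In the actual route, the response would be streamed back to the frontend in app/retrieval_agents/page.tsx rather than logged.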