Quickstart
Get your AI agent learning from experience in under 5 minutes. This guide walks you through creating an account, installing the SDK, and adding persistent memory to your agent.
Prerequisites
- An AI agent using OpenAI or Anthropic SDKs (or any LLM via the manual client)
- Node.js 18+ (TypeScript) or Python 3.10+
- A free Hippocortex account
1. Get Your API Key
Go to dashboard.hippocortex.dev, sign up (free tier, no credit card required), and copy your API key. It will look like hx_test_... for the test environment.
2. Install the SDK
JavaScript/TypeScript:

```bash
npm install @hippocortex/sdk
```

Python:

```bash
pip install hippocortex
```
3. Configure
Set your API key as an environment variable:
```bash
export HIPPOCORTEX_API_KEY=hx_test_your_key_here
```
That's it — the SDK reads the environment variable automatically.
4. Add Memory
The fastest way to add memory depends on your setup. Choose one:
Gateway (Zero Code, Any Provider)
Point your existing client at the gateway by changing its base URL; no other code changes are needed. Works with OpenAI, Anthropic, Groq, Together, Ollama, Mistral, and any other OpenAI-compatible provider.
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.hippocortex.dev/v1",
    api_key="hx_live_...",  # Your Hippocortex API key
    default_headers={
        "X-LLM-API-Key": "sk-...",  # Your provider's API key
    },
)

# Use normally. Memory is automatic.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}],
)
```
See the Gateway guide for all provider examples.
Auto-Instrumentation (One Line)
If you prefer the SDK, one import adds memory to every OpenAI or Anthropic call automatically.
TypeScript:
```typescript
import '@hippocortex/sdk/auto'
import OpenAI from 'openai'

const openai = new OpenAI()

// This call now has persistent memory:
// 1. Relevant context is synthesized and injected
// 2. The conversation is captured after completion
const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Deploy payments to staging' }]
})
```
Python:
```python
import hippocortex.auto
from openai import OpenAI

client = OpenAI()

# Memory context injected, conversation captured automatically
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Deploy payments to staging"}],
)
```
Your agent now remembers past interactions and uses them to improve future responses.
5. Verify It Works
Check the dashboard to see captured events. Knowledge compilation runs automatically after every 10 captures, so you do not need to trigger it manually.
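To make the batching behavior concrete, here is a toy model of the every-10-captures trigger. It is purely illustrative (the `CaptureBuffer` class is hypothetical, not the SDK's implementation); only the threshold of 10 comes from the documentation above:

```python
COMPILE_BATCH_SIZE = 10  # the documented auto-compile threshold

class CaptureBuffer:
    """Toy model of auto-compilation: buffer captured events and
    trigger a compile run each time the buffer reaches the batch size."""

    def __init__(self):
        self.pending = []
        self.compile_runs = 0

    def capture(self, event: dict) -> bool:
        self.pending.append(event)
        if len(self.pending) >= COMPILE_BATCH_SIZE:
            self.pending.clear()  # events handed off to compilation
            self.compile_runs += 1
            return True           # a compile run was triggered
        return False

buffer = CaptureBuffer()
triggers = [buffer.capture({"n": i}) for i in range(25)]
print(triggers.count(True))  # → 2: compilation fired at captures 10 and 20
```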
If you want to trigger compilation explicitly (for testing):
TypeScript:
```typescript
import { Hippocortex } from '@hippocortex/sdk'

const hx = new Hippocortex()
const result = await hx.learn()
console.log(`Created ${result.artifacts.created} knowledge artifacts`)
```
Python:
```python
import asyncio
from hippocortex import Hippocortex

async def main():
    hx = Hippocortex()
    result = await hx.learn()
    print(f"Created {result.artifacts.created} knowledge artifacts")

asyncio.run(main())
```
What Just Happened?
- Capture recorded your agent's conversation as a structured event.
- Auto-compile processes events in the background, extracting knowledge artifacts (procedures, failure playbooks, etc.).
- On the next call, Synthesize retrieves relevant artifacts and injects them as context.
Your agent now has a memory that persists across sessions and improves over time.
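The capture, compile, and synthesize stages described above can be sketched as a toy in-memory pipeline. This is illustrative only; the real service extracts structured artifacts (procedures, playbooks) rather than the naive keyword index used here, and the `ToyMemory` class is hypothetical:

```python
class ToyMemory:
    """Toy illustration of the capture -> compile -> synthesize loop."""

    def __init__(self):
        self.events = []     # raw captured conversations
        self.artifacts = []  # compiled knowledge

    def capture(self, user_msg: str, reply: str) -> None:
        # Capture: record the conversation as a structured event.
        self.events.append({"user": user_msg, "assistant": reply})

    def compile(self) -> None:
        # Compile: turn raw events into retrievable artifacts.
        # (Here: index each event by its first keyword.)
        for e in self.events:
            self.artifacts.append(
                {"topic": e["user"].split()[0].lower(), "note": e["assistant"]}
            )
        self.events.clear()

    def synthesize(self, user_msg: str) -> list[str]:
        # Synthesize: retrieve artifacts relevant to a new request
        # so they can be injected as context.
        topic = user_msg.split()[0].lower()
        return [a["note"] for a in self.artifacts if a["topic"] == topic]

mem = ToyMemory()
mem.capture("Deploy payments to staging", "Ran migrations first, then deployed.")
mem.compile()
print(mem.synthesize("Deploy payments to production"))
# → ['Ran migrations first, then deployed.']
```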
Next Steps
- Installation for detailed setup options
- Integration Guide for production-grade integration
- SDK Overview for all integration methods
- API Overview for direct API access