Memories

Store and search information using natural language.

Quick Reference

import { createMoaClient } from 'memofai';

const client = createMoaClient({ apiToken: 'moa_your_token' });

// Create
await client.memories.create({
  botId: 'bot_123',
  content: 'User prefers dark mode',
  metadata: { category: 'preference' }
});

// Search
const results = await client.memories.search({
  botId: 'bot_123',
  query: 'user preferences',
  limit: 10
});

// List
const memories = await client.memories.list({ botId: 'bot_123' });

// Delete
await client.memories.delete('mem_456');
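All of the calls above are async and reject on failure, so transient network errors can be smoothed over with a small retry wrapper. The helper below is a generic sketch, not part of the memofai SDK; it only assumes that failed calls throw:

```typescript
// Hypothetical retry helper for transient failures. It retries any async
// operation up to `attempts` times and rethrows the last error if all fail.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}
```

Usage: `await withRetry(() => client.memories.create({ botId: 'bot_123', content: '...' }))`.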

Search Memories

Semantic search returns the memories most relevant to a natural-language query, matching on meaning rather than exact keywords.

const results = await client.memories.search({
  botId: 'bot_123',
  query: 'customers who prefer phone communication',
  limit: 5
});

With Metadata Filters

Combine a semantic query with metadata filters to narrow results to memories whose metadata matches every filter key.

const results = await client.memories.search({
  botId: 'bot_123',
  query: 'customer preferences',
  metadataFilter: { category: 'preference', importance: 'high' }
});
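Assuming exact-match semantics for `metadataFilter` (each key must equal the stored value; this is an assumption, not documented behavior), the equivalent client-side check would be:

```typescript
// Client-side equivalent of an exact-match metadata filter: a memory's
// metadata matches when every key in the filter is present with an equal value.
function matchesFilter(
  metadata: Record<string, unknown>,
  filter: Record<string, unknown>
): boolean {
  return Object.entries(filter).every(([key, value]) => metadata[key] === value);
}
```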

Common Patterns

Store Preferences

await client.memories.create({
  botId: 'bot_123',
  content: 'User Sarah prefers dark mode UI, timezone PST',
  metadata: { category: 'preference', user_id: 'user_456' }
});

Contextual AI Chat

async function chatWithContext(botId: string, userMessage: string) {
  const context = await client.memories.search({
    botId, query: userMessage, limit: 5
  });

  const contextText = context.map(m => m.content).join('\n');
  const aiResponse = await callYourLLM(contextText, userMessage);
  
  await client.memories.create({
    botId,
    content: `User: ${userMessage}\nAI: ${aiResponse}`,
    metadata: { type: 'conversation' }
  });
  
  return aiResponse;
}
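The prompt-assembly step above can be factored into a pure function so it is testable without the SDK. The prompt shape here is illustrative, not something memofai prescribes:

```typescript
// Sketch of the context-assembly step from chatWithContext: join retrieved
// memory contents and prepend them to the user's message.
interface MemoryLike {
  content: string;
}

function buildPrompt(memories: MemoryLike[], userMessage: string): string {
  const contextText = memories.map(m => m.content).join('\n');
  return `Relevant memories:\n${contextText}\n\nUser: ${userMessage}`;
}
```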

Best Practices

✅ Do:

  • Store rich, specific content ("Sarah prefers email and responds fastest before noon PST") rather than terse notes
  • Use a consistent metadata schema so filters stay reliable
  • Use semantic search with natural-language queries

❌ Don't:

  • Store vague content ("user said something about settings")
  • Fetch all memories at once when a targeted search would do
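Consistent metadata is easier to enforce with a small typed wrapper. The category names and field shapes below are illustrative assumptions, not part of the memofai SDK:

```typescript
// Illustrative metadata schema: fixing the allowed categories and fields at
// compile time keeps metadata consistent across all stored memories.
type Category = 'preference' | 'conversation' | 'fact';

interface MemoryMetadata {
  category: Category;
  user_id?: string;
  importance?: 'low' | 'high';
}

// Builds the input object for client.memories.create with a checked schema.
function makeMemoryInput(botId: string, content: string, metadata: MemoryMetadata) {
  return { botId, content, metadata };
}
```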

Next Steps