Metadata filtering, new AI models and data integrations

Improve the accuracy of your AI agents with metadata. Leverage the latest models from Meta and Anthropic, and our latest integrations with Dropbox and Confluence.

Toni Rosinol

@RosinolToni

December was packed with new developments. We improved how LLMs retrieve and use information for greater accuracy with metadata filtering, launched new models, and integrated new knowledge bases. We’re also celebrating a huge milestone—70,000 users and over 100,000 projects created!

Now, let’s dive into the latest product updates.

Metadata filtering for improved RAG accuracy



Metadata provides additional context about a document, such as the last modified date, the person who last edited it, and key topics. This helps LLMs better understand the content, leading to more accurate responses. Think of metadata as labels that highlight a document’s relevance, subject matter, and other important details. Studies show that using metadata can improve response accuracy by 10–15%.


How to define new metadata attributes

By default, certain metadata attributes are automatically extracted from documents. However, you can manually add more:

  1. Open the document in the knowledge base.
  2. Scroll to the bottom and click the + symbol.
  3. Enter a name, description, and value for the metadata.
  4. Repeat for other documents as needed.
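The steps above amount to attaching named attributes to a document. As a minimal sketch, here is what that record might look like; the `add_metadata` helper and the attribute names are illustrative, not Stack AI's actual API:

```python
# Hypothetical sketch of the metadata record the steps above create.
# The function name and document shape are assumptions for illustration.

def add_metadata(document: dict, name: str, description: str, value: str) -> dict:
    """Attach a named metadata attribute (name, description, value) to a document."""
    attrs = document.setdefault("metadata", {})
    attrs[name] = {"description": description, "value": value}
    return document

doc = {"id": "handbook.pdf", "text": "..."}
add_metadata(doc, "last_modified", "Date of the last edit", "2024-12-01")
add_metadata(doc, "topic", "Primary subject of the document", "HR policies")
```

Each attribute carries both a description and a value, so the LLM can interpret what the label means, not just its content.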

How to enable metadata filtering

To enable metadata filtering in search:

  1. Click the Settings button in the knowledge base.
  2. Enable metadata filtering and choose one of three options:
     • No filter: Metadata is not used to refine search results.
     • Strict filter: Only documents matching the requested metadata are returned.
     • Loose filter: Metadata is considered, but relevant documents without metadata may still appear.
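The three modes above can be sketched as a small filtering function. This is an assumption-laden illustration of the behavior described, not Stack AI's implementation; the document shape follows the metadata example earlier in this post:

```python
# Minimal sketch of the three metadata filter modes. Documents are dicts
# with an optional "metadata" mapping of attribute name -> value.

def filter_documents(docs, requested, mode="loose"):
    """Filter docs against requested metadata according to the selected mode."""
    def matches(doc):
        meta = doc.get("metadata", {})
        return all(meta.get(k) == v for k, v in requested.items())

    if mode == "no":       # metadata is ignored entirely
        return list(docs)
    if mode == "strict":   # only documents matching every requested attribute
        return [d for d in docs if matches(d)]
    if mode == "loose":    # matching docs, plus docs that carry no metadata
        return [d for d in docs if matches(d) or not d.get("metadata")]
    raise ValueError(f"unknown mode: {mode}")

docs = [
    {"id": "a", "metadata": {"topic": "HR"}},
    {"id": "b", "metadata": {"topic": "Finance"}},
    {"id": "c"},  # no metadata at all
]
strict = filter_documents(docs, {"topic": "HR"}, mode="strict")  # only "a"
loose = filter_documents(docs, {"topic": "HR"}, mode="loose")    # "a" and "c"
```

Note the trade-off: strict filtering guarantees relevance at the risk of dropping unlabeled documents, while loose filtering keeps recall high for content that has not yet been tagged.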

New Meta and Anthropic models



Both the Meta Llama 3.3 and Anthropic Claude 3.5 Haiku models are now available on Stack AI. Here's a brief overview of each:

Meta Llama 3.3

  • Performance: Delivers results comparable to larger models like Llama 3.1 405B, with reduced computational requirements.
  • Multilingual Support: Supports multiple languages, including English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai.
  • Context Window: Handles inputs up to 128,000 tokens, facilitating the processing of extensive documents and conversations.

Anthropic Claude 3.5 Haiku

  • Speed: Offers rapid response times, with an average time to first token of approximately 0.80 seconds, making it suitable for real-time applications.
  • Context Window: Supports a context window of 200,000 tokens, enabling the handling of substantial inputs with ease.
  • Use Cases: Ideal for tasks requiring quick and accurate responses, such as coding, data extraction, and content moderation.

Dropbox and Confluence as new knowledge bases for your AI agents



You can now link your knowledge stored in Dropbox or Confluence. Simply drag and drop the node from the knowledge base list in the sidebar, connect your account, select the files to index, and start interacting with your content.

Your knowledge remains private unless you choose to share it with your organization.

New content 🎨

Check out our latest pieces of content:

  1. Vertex AI vs. Stack AI
  2. Copilot Studio vs. Stack AI
  3. Which AI model should you choose for your business?
  4. How to build a staff training AI agent
  5. How to use metadata in Stack AI