Dynamic vector databases help you retrieve the most relevant segments of a document for an LLM, without handing it the full document.

Vector DBs let you store a document (from a data source), send a query (from user input), and return only the few pieces that are most relevant to the LLM.

Dynamic Vector Stores in Stack AI have the following structure:

  • Inputs:
    • Data loader: the document coming from a data source.
    • Query: a string of text coming from user input or an LLM completion.
  • Outputs:
    • Result: the most relevant segments from the data loader, returned as a single text string. It can feed an LLM or an output node (see the sketch after this list).
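
To make the structure concrete, here is a minimal sketch in plain Python of what such a node does conceptually: it embeds the segments produced by the data loader, embeds the query, and returns the most relevant segments as one text string. The helper names (embed, cosine, retrieve_relevant_segments) and the toy bag-of-words embedding are illustrative assumptions, not Stack AI's API.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real node would call an embedding model.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[word] * b[word] for word in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve_relevant_segments(segments: list[str], query: str, k: int = 3) -> str:
    # Inputs mirror the node: segments from the data loader, plus a query string.
    query_vec = embed(query)
    ranked = sorted(segments, key=lambda seg: cosine(embed(seg), query_vec), reverse=True)
    # Output mirrors the node: the top-k segments joined into a single text string.
    return "\n\n".join(ranked[:k])

segments = [
    "Invoices must be paid within 30 days of receipt.",
    "The company was founded in 2012 in Madrid.",
    "Late payments incur a 2% monthly interest charge.",
]
print(retrieve_relevant_segments(segments, "When must invoices be paid, and is there a late payment fee?", k=2))
```

Run as written, the example returns the two payment-related segments and drops the unrelated sentence, which is exactly the filtering that the Result output passes on to an LLM or output node.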

Stack AI offers a Basic DB, which is a vector database that runs in memory without sending the data source to an external tool.

The Basic DB does not persist embeddings after the flow stops running and therefore recomputes them on every run, so it is best suited to simple flows rather than advanced document analysis applications.
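
Because nothing is persisted, every run rebuilds the index from scratch. The sketch below, which reuses the embed and cosine helpers from the example above, illustrates that trade-off; the behaviour shown is an assumption about how an in-memory store works in general, not a description of Stack AI's internals.

```python
def run_flow(documents: list[str], query: str) -> str:
    # The index exists only inside this call: embeddings are recomputed on
    # every run and discarded when the flow finishes, and no data leaves
    # the process (nothing is sent to an external vector database).
    index = [(segment, embed(segment)) for segment in documents]
    query_vec = embed(query)
    best_segment, _ = max(index, key=lambda pair: cosine(pair[1], query_vec))
    return best_segment
```

The embedding cost is paid on every run rather than once at indexing time, which is why this approach stays practical only while the document set is small.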