January 2025 Update

Build AI agents faster with our new Agent Builder. Control Access to Features and LLMs. Analyze data with our own Code Interpreter tool.

Paul Omenaca

@houmland

January has been quite a ride! We reached 80,000 users, onboarded new team members, and expanded into new geographies.

StackAI is going international, with deployments across the globe. Check with us if you’d like to have StackAI closer to you.

This sprint's product focus has been on governance, ease of use, and agentic tools. Along the way we've added new models on the day they became available (OpenAI o3-mini, DeepSeek on U.S. servers).

Let’s check out the new features.

Agent Builder (Quick Start)

We just released Quick Start, a faster and easier way to build AI agents in StackAI! These agents are deployed as Chat Assistants: imagine a custom ChatGPT or Perplexity, powered by any LLM (OpenAI, Anthropic, DeepSeek, Llama), a knowledge base, and tools, built in seconds.

Easily customize the title, description, LLM, knowledge base, and agentic tools to trigger — then go live in minutes.


Agent Builder Initial View

This lets you launch AI agents immediately, and you can expand an agent's functionality later in the workflow builder with more nodes. Either way, you can deploy AI agents much faster.


Agent Builder Setup View

Check out our newly redesigned chat assistant interface! We've enhanced the look and feel to be more intuitive and seamless for your end-users.

Feature Access Control

We recently released the Feature Access Control tab. Admins can now decide which features are accessible to users in the workflow builder and which appear in the sidebar.


Feature Access Control

Moreover, admins can choose how citations are rendered to end users. You can:

• Permit users to download source documents
• Permit users to access source documents via a cloud provider (e.g., SharePoint, Google Drive)
• Block source documents from download

This empowers admins to control who has access to the underlying source documents. When links are provided to the cloud provider, users must have login access to that provider in order to retrieve the documents.

Code Interpreter

We’re excited to announce the release of StackAI’s own Code Interpreter!

As a tool in StackAI, Code Interpreter lets you analyze data, create charts, and translate natural-language queries into Python code, equivalent to the Code Interpreter feature in ChatGPT.

It’s under ‘Analysis Tool’ in the ‘Add Tools’ section.
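
To give a sense of what the tool does, a prompt like "plot monthly revenue from this spreadsheet" might translate into Python along the lines of the sketch below. This is a hypothetical illustration, not actual StackAI output; the file name and the column names are assumptions.

```python
# Hypothetical example of the kind of Python an analysis prompt could generate.
# The file name and the "month"/"revenue" columns are assumed for illustration.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("sales.csv")                                # load the uploaded dataset
monthly = df.groupby("month", sort=False)["revenue"].sum()   # total revenue per month

monthly.plot(kind="bar", title="Monthly revenue")            # build a simple bar chart
plt.ylabel("Revenue (USD)")
plt.tight_layout()
plt.savefig("monthly_revenue.png")                           # chart file returned in the chat
```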


StackAI Code Interpreter

Scroll through the list of available tools to discover exciting new additions like Stripe, Knowledge Base, and more.

Voice in Chat Assistant


Voice to Prompt Chat Assistant

Forget typing! You can now speak instructions directly to your LLM, and this works in two ways. End users can use the voice interface to dictate prompts to the LLM.


Voice to Prompt Instructions

App builders, in turn, can dictate model instructions instead of typing out long passages in the workflow builder. This saves time for both users and creators.

DeepSeek, OpenAI o3-mini, and more


New AI models

January was a big month for new LLMs.

DeepSeek became available in StackAI the same day it landed on U.S. servers. Now you can build your AI agents with DeepSeek's low-cost, high-performance LLM at the core, hosted on a U.S. provider (TogetherAI or Groq).
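
For context outside StackAI, these U.S. hosts expose OpenAI-compatible endpoints, so a direct call to a U.S.-hosted DeepSeek model looks roughly like the sketch below. The base URL, environment variable, and model identifier are assumptions; check your provider's current documentation.

```python
# Minimal sketch (not StackAI code): calling a US-hosted DeepSeek model through an
# OpenAI-compatible endpoint. Base URL, env var, and model ID are assumptions.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",   # TogetherAI's OpenAI-compatible endpoint (assumed)
    api_key=os.environ["TOGETHER_API_KEY"],   # hypothetical environment variable name
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",          # assumed model ID; verify in the provider catalog
    messages=[{"role": "user", "content": "Summarize DeepSeek-R1 in two sentences."}],
)
print(response.choices[0].message.content)
```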

We also recently released OpenAI o3-mini, Gemini 2.0 Flash, and Perplexity Sonar Pro, among several other LLMs.

New content 🎨

Check out our latest pieces of content:

  1. RAG vs. Fine-tuning
  2. Implementing AI Agents in finance
  3. Investment Memo Writer