Paul Omenaca
@houmland

October has been packed with updates. We’ve introduced new features, boosted system performance, and welcomed new team members.
Let’s dive into the product changes.
Stack AI is now available on-premise. Host Stack AI in your own cloud (AWS, GCP, Azure, Kubernetes, etc.).
Keep your data within your own infrastructure, ensuring maximum control and compliance.
If you believe on-premise is the right choice for your needs, learn more here.
Cerebras is a new LLM provider available in Stack AI, offering high-performance, ultra-fast inference to power more responsive AI applications.
It delivers up to 2,100 tokens per second on Llama 3.1 70B inference, significantly faster than competing providers.
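To put that throughput figure in perspective, here is a quick back-of-the-envelope calculation (illustrative only; the 2,100 tokens/second number is the benchmark cited above, and the function name is ours):

```python
# Rough arithmetic on the cited throughput figure (2,100 tok/s on
# Llama 3.1 70B). This is an illustration, not a benchmark.
def generation_time(num_tokens: int, tokens_per_second: float) -> float:
    """Seconds needed to generate num_tokens at a given throughput."""
    return num_tokens / tokens_per_second

# A 1,000-token response at 2,100 tok/s takes under half a second:
print(round(generation_time(1000, 2100), 2))  # → 0.48
```

At that rate, even long responses stream back fast enough to feel interactive.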
If you are curious about this ground-breaking provider, read more here.
Analyze what’s going on in your organization with the global analytics tab in your dashboard.
Each user sees analytics only for the projects they have access to; admin users are the only ones who see all projects.
Filter projects by name or by metric, and access the logs of a specific project directly from this tab.
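The visibility rule above can be sketched as a simple filter. This is a conceptual illustration, not Stack AI's actual code; the field names (`is_admin`, `members`, etc.) are assumptions:

```python
# Conceptual sketch of the dashboard's visibility rule (field names are
# hypothetical): admins see every project, everyone else sees only the
# projects they are a member of.
def visible_projects(user, projects):
    if user["is_admin"]:
        return projects
    return [p for p in projects if user["id"] in p["members"]]

projects = [
    {"name": "Support Bot", "members": {"ana"}},
    {"name": "Sales Agent", "members": {"ana", "bo"}},
]
admin = {"id": "root", "is_admin": True}
bo = {"id": "bo", "is_admin": False}

print(len(visible_projects(admin, projects)))                 # → 2
print([p["name"] for p in visible_projects(bo, projects)])    # → ['Sales Agent']
```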
Introducing a new way to work with information across your projects: Linear Flow!
Now, nodes connected together can access previous node data automatically.
For example, there’s no need to re-link your input node to the LLM if it's already connected to a knowledge base.
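The idea can be sketched as a toy chain of nodes, where each node automatically sees the output of every upstream node, not just its direct predecessor. This is a conceptual sketch of the behavior, not Stack AI's actual implementation:

```python
# Conceptual sketch (not Stack AI's implementation): in a linear flow,
# a node's context includes data from every upstream node, so the LLM
# node can read the input node's data through the knowledge base.
class Node:
    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream  # direct predecessor, or None
        self.output = None

    def ancestors(self):
        node, seen = self.upstream, []
        while node is not None:
            seen.append(node)
            node = node.upstream
        return seen

    def context(self):
        # Data from every upstream node is available automatically.
        return {n.name: n.output for n in self.ancestors()}

user_input = Node("input")
user_input.output = "What is our refund policy?"
kb = Node("knowledge_base", upstream=user_input)
kb.output = "Refunds are accepted within 30 days."
llm = Node("llm", upstream=kb)

# The LLM node sees both upstream outputs without an explicit re-link:
print(sorted(llm.context()))  # → ['input', 'knowledge_base']
```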
Check out other features we’ve released this month:
Check out our latest video and blog post releases: