How to build a staff training AI assistant
Kai Henthorn-Iwane
@KaiHenthorn

A weak onboarding process can be the starting point of many structural problems. According to Jobvite, one in three new hires decides to leave within the first 90 days. This creates a chronic understaffing problem, limiting productivity and preventing companies from reaching their goals.
The answer is a high-quality onboarding experience that builds employee loyalty and sets new hires up for success. But instead of redesigning your processes from scratch, you can start by providing easier access to core information with an AI chatbot.
Beyond onboarding, this tool is also a great fit for any situation where you need to deliver answers without sending your team into deep, complex documentation. You can train it on admin topics, training curricula or project information to make it easier to surface the right information at the right time.
Whatever your use case, Stack AI offers all the tools to build a chatbot powered by the latest AI models, with knowledge base technology that grounds its answers in your company data, maximizing response accuracy. Once you distribute it to your team and they start asking questions, you can explore the most common topics and use that intelligence to overhaul your entire onboarding program.
In this step-by-step tutorial, you'll learn how to build an AI staff training assistant for new employees. Depending on your data setup, you can finish it in less than an hour.
H2 How to build a staff training assistant for new employees
- Create a new Stack AI project
- Connect inputs, run searches and generate accurate answers
- Configure the chat assistant’s interface
- Share it with your team
- Staff training assistant in Slack
- Keep track of usage and analytics
- Improve your employee onboarding tool
H2 Create a new Stack AI project
If you haven’t already, sign up for a Stack AI account. After logging in, click the New Project button at the top right of the screen.
There are many templates available covering dozens of use cases. We’ll be starting from scratch in this tutorial: click New Project to continue.
You’ll land on the project’s canvas with an input and an output node, ready to build.
H2 Connect inputs, run searches and generate accurate answers
H3 Determine question topic
Instead of searching through all of your company's data sources to answer every question, which would be time-consuming and could degrade LLM performance, we'll start by determining the question's intent. Stack AI supports this with the Routing Switch node. On the left-side menu, click Logic to expand it, then drag the Routing node and drop it onto the canvas.
Each option corresponds to a possible topic. For this tutorial, we’ll add 3 topics covering general company information, job responsibilities and admin. We’ll also add a catch-all route for all the questions that are out of scope, so the chatbot can direct the user to their point of contact or other internal resources.
List and describe all potential topics your trainees could be asking about, and write them down as instructions in the Description field inside the Routing Switch node. Here are some examples to help you get started. Copy and paste them into each new option:
- Option 0: Identify when the user inquires about broad company information, such as its history, mission, vision, culture, values, organizational structure, products, services, size, or locations. Look for phrases like 'Tell me about the company,' 'What does the company do?' or questions about the company's background, market position, or future plans.
- Option 1: Detect when the user asks about their personal role, position, or job within the company. This includes questions about specific responsibilities, duties, tasks, team structure, reporting lines, performance expectations, goals, required skills, or qualifications. Look for phrases like 'What will I be doing?', 'Who do I report to?', or 'What are my main responsibilities?'
- Option 2: Recognize when the user inquires about administrative or logistical matters. This includes questions about payroll, salary, benefits, time off policies, vacation days, IT support, equipment provision, office logistics, desk assignments, HR policies, or the employee handbook. Look for phrases like 'How do I request time off?', 'When do I get paid?', or 'How do I set up my computer?'
The prompts we use for the Routing Switch all follow the same pattern:
- A general description of what to look for: “Detect when the user asks about their personal role, position, or job within the company.”
- A collection of keywords and subtopics the user might write to start a conversation: “This includes questions about specific responsibilities, duties, tasks, team structure, reporting lines, performance expectations, goals, required skills, or qualifications.”
- And a few examples to improve interpretation: “Look for phrases like 'What will I be doing?', 'Who do I report to?', or 'What are my main responsibilities?'”
Lastly, add the catch-all instruction for questions that aren’t covered by your knowledge bases. Click the plus button to add an additional row.
Copy and paste this prompt onto the last row:
Option 3: Identify when the user asks about topics not covered by the other three categories (company info, job responsibilities, admin issues). This may include questions about industry trends, competitors, personal advice, career development, social events, non-work activities, technical details beyond basic admin setup, or policies and procedures not related to their immediate role or onboarding. When such a query is identified, you should acknowledge the question and state that the topic is outside your current knowledge base. Look for phrases that don't contain keywords from other categories and seem unrelated to immediate onboarding needs. This category serves as a safety net to ensure all user queries receive an appropriate response, even if you can't provide specific information on the topic.
When writing prompts for catch-all situations or to introduce guardrails, the pattern is similar. But instead of describing the topics the chat assistant should cover, first list everything that's in scope and then instruct the chatbot on which topics lie outside of it. You should also tell the assistant how to handle these questions, both as an additional layer of safety and to improve response quality. Keep these patterns in mind when designing and tuning your prompts, making sure there's no overlap in topics or keywords between options.
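If you're curious what this routing step amounts to conceptually, here's a minimal sketch in Python. It isn't how Stack AI implements the Routing Switch (the node uses an LLM to interpret your option descriptions), and the keywords are made up for illustration, but it captures the underlying idea: classify the question first, then only search the relevant source.

```python
# Conceptual sketch only: Stack AI's Routing Switch uses an LLM to match the user
# message against your option descriptions. This toy keyword router just
# illustrates the underlying idea: classify first, then search the right source.

TOPIC_KEYWORDS = {
    "company_info": ["company", "mission", "values", "culture", "history", "products"],
    "job_role": ["role", "responsibilities", "report to", "duties", "team", "goals"],
    "admin": ["payroll", "paid", "time off", "vacation", "benefits", "laptop"],
}

def route(message: str) -> str:
    """Return the best-matching topic, or 'out_of_scope' (the catch-all, Option 3)."""
    text = message.lower()
    scores = {
        topic: sum(keyword in text for keyword in keywords)
        for topic, keywords in TOPIC_KEYWORDS.items()
    }
    best_topic, best_score = max(scores.items(), key=lambda item: item[1])
    return best_topic if best_score > 0 else "out_of_scope"

if __name__ == "__main__":
    print(route("Who do I report to, and what are my main responsibilities?"))  # job_role
    print(route("How do I request time off?"))                                  # admin
    print(route("What's your favorite movie?"))                                 # out_of_scope
```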
Connect the main input to the Message connector of the Routing Switch. This will pass the user message to the Switch, enabling the routing functionality.
H3 Create your knowledge bases
With Stack AI, you can build knowledge bases to gather data about topics, projects or teams into a single source. Then, you can use it in projects to answer questions or to give additional context to an AI workflow. More than organizing information, this can also improve data security: you can choose exactly what each tool will have access to, preventing lower-rank employees from seeing executive-level data, for example.
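As a rough mental model of how a knowledge base grounds answers, think of it as scoring stored snippets against the question and passing the best matches to the LLM. The sketch below is purely illustrative, with invented example snippets and simple word overlap; Stack AI's actual retrieval pipeline, chunking and syncing are handled for you.

```python
# Rough mental model of knowledge base retrieval, not Stack AI's actual pipeline
# (the platform handles chunking, embeddings, and syncing for you): score each
# stored snippet against the question and hand the best matches to the LLM.
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z']+", text.lower()))

def top_snippets(question: str, snippets: list[str], k: int = 1) -> list[str]:
    """Return the k snippets that share the most words with the question."""
    question_tokens = tokens(question)
    return sorted(
        snippets,
        key=lambda snippet: len(question_tokens & tokens(snippet)),
        reverse=True,
    )[:k]

snippets = [
    "Our company was founded in 2015 and now has offices in three countries.",
    "Vacation requests are submitted through the HR portal and approved by your manager.",
    "New laptops are provisioned by IT within two business days of your start date.",
]

print(top_snippets("How are vacation requests approved?", snippets))
```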
While you can fill a knowledge base with just PDF documents, connecting your company's databases is faster and easier to maintain. If you haven't connected any platforms to Stack AI yet, read our documentation on how to connect data sources such as Microsoft Sharepoint, Google Drive or Salesforce, to name just a few of the available integrations.
In this tutorial, we’ll create 3 knowledge bases, one for each topic (general company information, job responsibilities and admin information), using Sharepoint, Notion and PDF document uploads.
To do so, we’ll have to go back to the Stack AI dashboard. Click the app’s icon at the top left of the screen to leave the project.
On the dashboard’s left side menu, under the Your Data section, click Knowledge Bases.
Click the Get Started button.
Update the name and the description of the knowledge base: click each text element and edit it to display “Company Information”.
Next, select your data sources. Click Import from a Connection button.
This popup shows the currently connected data sources. If you don’t see a connection that you need, read more on how to connect it here. Click Sharepoint to browse the source’s content.
There are 4 sites in our Sharepoint instance. We’ll select the All Company site, as it contains all the general information, as well as the onboarding guide. Once you select which sites you’d like to import, click the Import selected files button at the bottom-right part of the screen.
Once added, these resources will automatically sync, always remaining up to date. Optional: if you’d like to disable this functionality, click the 3-dot icon at the top right and toggle Auto-Sync off.
We’ll create a new knowledge base that will hold files with information about job responsibilities. Click the New Knowledge Base button at the top-right of the screen.
This time, we’ll use the PDF document upload method. Click the Drag or click to upload button and select files from your computer.
Your uploaded files are listed here, showing their current status. If you need to update any of this information later, you can delete any file and replace it with a more current one.
Before moving on, click the title to update the knowledge base’s name. This one will be called Job Descriptions.
The last knowledge base remaining will cover admin topics. Click the New Knowledge Base button at the top-right again.
Click the Import from a Connection button.
This time, select Notion from the left-side menu.
Then, select all the pages within your Notion account that contain admin information and click the Import selected files button to import.
Change the name of the knowledge base to Admin Information. Like in the Sharepoint knowledge base, you can click the 3-dot icon menu at the top right to turn off Auto Sync, if you wish.
This tutorial covers one example of this functionality, but the setup process may differ depending on your company’s data sources and platforms. When you’re ready to continue building, use the folder navigation at the top left to go back to the project you were working on.
H3 Connect knowledge bases to topics
Back in the Stack AI project, now that we have the 3 knowledge bases, we can add them to each topic of the Routing Switch node. On the left side, click to expand the Knowledge Bases section, drag a Knowledge Base node and drop it on the canvas, just in front of the Routing Switch node.
We’ll start connecting the general questions. In the Knowledge Base node, click the Select a Knowledge Base dropdown and choose the Company Information knowledge base.
Repeat the process for the job description topic. Drag a Knowledge Base node from the left side and drop it on the canvas.
In this Knowledge Base, open the dropdown and select Job Descriptions to continue.
Repeat one last time for Admin Information. Drag the Knowledge Base node, drop it on the canvas and select Admin Information to continue.
Connect each of the topics on the Routing Switch to the knowledge bases. First, drag the Option 0 connector to the input query connector of the Company Information Knowledge Base node.
Second, drag the Option 1 connector to the input query connector of the Job Descriptions Knowledge Base node.
Third, drag the Option 2 connector to the input query connector of the Admin Information Knowledge base node.
We’ll come back to the catch-all option (Option 3) later in this tutorial.
H3 Combine the results
We’ll use a Routing Combine node to simplify the LLM’s prompt. On the left side, click to expand the Logic section, drag a Routing Combine node and drop it on the canvas in front of the knowledge bases.
Since we’re combining the results of 4 possible options, change the value in the Combine node to 4. This will change the number of available inputs.
Connect all the Knowledge Base nodes to the Routing Combine node. Start with the Company Information Knowledge base: drag the To LLM connector to the Input 0 connector.
For the Job Description Knowledge Base, drag the To LLM connector to the Input 1 connector.
Next, drag the To LLM connector of the Admin Information Knowledge Base to the Input 2 connector.
The architecture is different for the catch-all option: drag the Output 3 connector from the Routing Switch node to the Input 3 connector of the Routing Combine node. This way, if the user asks a question that’s outside of the scope of the chatbot, the tool won’t run a search in your knowledge bases and will provide an answer without consuming as many resources.
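In plain terms, here's roughly what the Combine step amounts to in this workflow, sketched in Python for illustration only: because a single route fires per question, combining means forwarding whichever input actually carries content.

```python
# Only one branch of the Routing Switch fires per question, so "combining" just
# means forwarding whichever input actually produced content (empty for the
# catch-all route, so the LLM knows it has no context to work with).

def combine(inputs: list[str | None]) -> str:
    """Return the first non-empty result, or an empty string if no search ran."""
    return next((result for result in inputs if result), "")

# Example: only the Admin Information branch (Input 2) returned context.
print(combine([None, None, "Vacation requests go through the HR portal.", None]))
```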
H3 Generate a response with an LLM
The brain of the tool will be the Azure LLM, an isolated version of OpenAI’s latest models, offering higher data security and compliance. On the left side, click the LLMs section, drag the Azure node and drop it in front of the Routing Combine node.
The LLM will need two inputs: the information from the correct knowledge base and the user’s prompt. First, connect the Routing Combine node’s output connector to the Azure LLM input connector.
Then, zoom out and connect the main input node to the Azure LLM.
Azure’s available LLMs include the latest models from OpenAI. At the time of writing, the top two available are GPT-4o and GPT-4o Mini. Click the dropdown within the node and select GPT-4o Mini.
The Instructions show the LLM how to behave. Here’s a suggestion you can copy and paste into the corresponding input field:
You are an employee onboarding assistant. Answer questions in a friendly, thorough but concise way.
In addition to the user message, you'll be given additional context to help you respond accurately, depending on the topic the user is asking about.
When you don’t have enough information to provide an answer, you must not provide any general advice or information. Instead, you should:
1. Politely inform the user that the question is beyond your current capabilities or knowledge base.
2. Clearly state that you cannot provide information or advice on topics outside your designated areas.
3. Direct the user to appropriate internal resources such as their manager, HR department, or the company intranet for further assistance.
You should never attempt to answer these questions with general knowledge or advice. Your response should be brief, direct, and focused solely on redirecting the user to appropriate internal resources.
As for the user prompt, we’ll need far less text. Copy and paste the following into the Prompt input field of the Azure LLM:
Consider this information: <information>{combine-0}</information>
Reply to this user message: {in-0}
The values inside curly brackets are variables:
- **{combine-0}** holds the data returned from whichever knowledge base matched the question's topic, or stays empty if the catch-all option is triggered. This variable is enclosed in XML tags (<information> </information>) to clearly label the data, helping the LLM interpret the prompt accurately. XML tags are optional but recommended.
- **{in-0}** contains the question that the trainee wrote in the chat assistant.
In the future, as you add more nodes that contain or generate data, you can reference them in other nodes using these variables. This helps you build more powerful and flexible tools.
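To make the substitution concrete, here's a small illustrative sketch (plain Python string formatting, not Stack AI internals) of what the final prompt looks like once {combine-0} and {in-0} are filled in.

```python
# Illustration (plain Python, not Stack AI internals) of the prompt the Azure LLM
# node receives once the variables are filled in. The hyphenated names {combine-0}
# and {in-0} are written as combine_0 and in_0 here to be valid Python identifiers;
# the XML tags simply label the retrieved context so the model can tell it apart
# from the user's question.

PROMPT_TEMPLATE = (
    "Consider this information: <information>{combine_0}</information>\n"
    "Reply to this user message: {in_0}"
)

retrieved_context = "Vacation requests are submitted through the HR portal."  # {combine-0}
user_message = "How do I request time off?"                                   # {in-0}

print(PROMPT_TEMPLATE.format(combine_0=retrieved_context, in_0=user_message))
```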
All that’s left is adding an output to show the response to the user. In the Azure LLM, drag the Completion connector to the Output node.
H2 Configure the chat assistant’s interface
The workflow part of the project is ready. Now, you’ll configure the user interface to let new employees interact with it. Click the Publish button at the top right to save all the changes.
Important: remember to always click Publish when making changes to the workflow.
At the top left side of the screen, click the Export tab.
New Stack AI projects are published with a form interface by default. We want to create a chat assistant, so click the dropdown at the top of the screen and select Chat Assistant from the available options.
The left-side menu lets you configure the interface. In the General section, you can:
- Add a custom subdomain to make the tool’s link shorter, more readable and easier to share.
- Change the name to “Staff Training Assistant” and add a description that highlights the main functionality, such as: “The Staff Training Assistant can help you with any questions you have about our company, your job role and admin topics.”
- And edit the disclaimer shown below the input field of the chat interface. By default, it reads “AI assistants might make mistakes. Check important information” but you can edit it to a variation that’s more appropriate to your company, if necessary.
The Fields section lets you activate or deactivate inputs and outputs for projects with multiple nodes, and also lets you label each of them to improve the user experience. These settings have no effect on the Chat Assistant interface, but keep them in mind when configuring other interfaces in the future.
The Style section lets you change the tool’s icon to something more illustrative of its functionality. If needed, download an icon from an online icon library or add your company’s logo for consistent branding.
Click to expand the Configuration tab. Here, you can activate or deactivate:
- User feedback adds a thumbs up or down to each message, letting users record whether a response was helpful. This can be useful for improving the tool in the long run.
- Related results shows a list of related questions about the topic, speeding up and guiding the user through the conversation.
- Show steps displays what the assistant is doing during the conversation: understanding the input, thinking, and so on.
- Conversation starters lets you add starter questions that display on top of the input field, great for helping users learn what the chat assistant can talk about.
Stack AI has high security and compliance standards, which also translate to the ways you can share this chat assistant with your team. At a basic level, you can add password protection. If you’d like more granularity, you can turn on SSO and decide which users have access to this interface. When embedding this tool in other websites or platforms, you can also whitelist the URLs of those locations, preventing unauthorized access even if someone tries to embed it somewhere else.
Every time you make a change to the interface, don’t forget to click the Save Interface button.
H2 Share it with your team
The tool is ready for sharing. With Stack AI, this is as simple as copying the link at the top of the interface and pasting it in a message or email.
If there are other team members building with Stack AI and you’d like to share your project with them, you can click the Share button at the top right of the screen.
This only shares a copy of your project, so any changes they make won’t impact your work.
H2 Staff training assistant in Slack
Instead of directing new employees to the chat assistant interface, you could integrate the assistant into your company’s Slack channels. That way, your team can ask questions and get answers directly within the app, making for a more seamless experience. Take a look at our article on how to build a Slackbot for your teams, with step-by-step guidance on implementing it in your internal communication channels.
H2 Keep track of usage and analytics
You can see the impact of your staff training assistant within Stack AI: on the top left side of the screen, click the Analytics tab.
You’ll find a summary of all the runs, users, errors and tokens consumed by the chat assistant.
Under these general statistics, you’ll find a detailed breakdown of all the runs. Click the Columns dropdown to show or hide columns and focus on the data you need.
You can also export this data for analysis or storage: at the top right of the screen, click Download Logs to export them as a JSON or CSV file. You’ll also find an option to generate an AI-powered PDF report, highlighting common inputs, common outputs and sources of errors.
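If you'd rather slice the exported logs yourself, a short script can surface the most common questions, which is useful input for overhauling your onboarding program. The sketch below is assumption-heavy: the filename and the `user_input` field are placeholders for illustration, so open your downloaded JSON and adjust the key to match its actual structure.

```python
# Count the most frequent questions in an exported Stack AI log file.
# NOTE: "stack_ai_logs.json" and the "user_input" field are assumptions for
# illustration; inspect your downloaded JSON and adjust the filename and key
# to match its real structure.
import json
from collections import Counter

with open("stack_ai_logs.json", encoding="utf-8") as f:
    runs = json.load(f)

question_counts = Counter(
    run.get("user_input", "").strip().lower()
    for run in runs
    if run.get("user_input")
)

for question, count in question_counts.most_common(10):
    print(f"{count:>4}  {question}")
```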
If you’d like a conversation-by-conversation view for troubleshooting issues, click the Manager tab at the top left.
This tab shows all conversations of all users. You can click to read any conversation and dive deep into issues your team reports.
H2 Improve your employee onboarding tool
Keep improving your staff training assistant by checking the Analytics and Manager tabs regularly. Use that information to adjust the prompts in the Routing Switch as well as the Azure LLM.
You can also easily upgrade to the latest LLMs with just a few clicks. On the Workflow tab, find the LLM nodes you want to upgrade and click the cogwheel icon at the top right.
Use the Provider and Model dropdowns to choose the new model you’d like to use.
Important: remember to click Publish to save your changes, then test the tool to make sure everything is working as expected.
H2 Wrapping up
When new hires don’t have the information they need, they waste time browsing confusing internal resources or interrupt their managers with constant questions, and their first weeks at the new job end up feeling short on transparency and care. To improve productivity, integration and retention, an AI staff training assistant is a flexible, low-cost and effective way to help new employees navigate their first challenges and ultimately succeed.
There’s a lot more you can automate with Stack AI. Create your free account here and explore our other tutorials: