How to make your tender review process 40% more efficient with AI

Paul Omenaca

@houmland

Reviewing hundreds of tender documents to find the best solution for your project brings many challenges:

  • Reading multiple vendor documents while accounting for every detail
  • Coordinating with your team to make sure every constraint is respected
  • Racing tight deadlines, where moving too fast leads to more human mistakes

According to Responsive’s RFP Research, 54% of companies still rely only on basic productivity apps to manage the bid review process, opting for the slow path.

In this step-by-step tutorial, you’ll learn how to build a tender document analysis tool that can analyze hundreds of bids with a single click, offering AI-powered assessments for every aspect—from the scope of work to financial details.

You can quickly find the best bids, share them with your team and make critical decisions faster.

How to build a tender document analysis AI assistant

  1. Create a new project in Stack AI
  2. Connect the LLM, Document, SharePoint, inputs and outputs
  3. Process hundreds of tender documents with a single click
  4. Analyze everything in the tender document
  5. Try the Chat Assistant, Form and API interfaces
  6. Share it with your team
  7. Keep track of usage with analytics
  8. Stay up to date on the latest AI models

It’ll take up to 30 minutes to build the basic analysis tool, but you can keep upgrading it to analyze every aspect of every tender document.

1. Create a new project in Stack AI

If you don’t have an account yet, click here to sign up for a free Stack AI account.

Please note: you need an Enterprise plan to use the Microsoft SharePoint connection, but you can still build a tender document analysis tool without that integration.

Once you log in, click the New Project button at the top right of the dashboard.


List of Templates

There are many templates to explore, but we’ll start from scratch: select New Project to continue.


New Project

The project opens up with an input and an output node on the canvas.

2. Connect the LLM, Document, SharePoint, inputs and outputs

2.1. Add a document input

Your tender documents are in file format, so we need to add a new input to the canvas for uploading them. On the left side of the screen, expand the Data Loaders section…


Expanding Data Loaders

… and drag a Documents node onto the canvas.


Document Node Details

We want to offer a way to upload the tender documents into the final tool: tick Expose as Input to enable this functionality.


Expose as Input

2.2. Add an LLM

Next, you’ll need to add an LLM to analyze the data. Stack AI integrates with all major AI model providers, including OpenAI, Anthropic and Google.

However, in this tutorial, we’ll explore the Microsoft Azure LLM integration. Stack AI has an agreement with Microsoft, offering access to the latest OpenAI GPT models in a containerized environment.

Click to expand the LLMs section and find the Azure node…


Drop Azure Configuration

… and drag it into the canvas, to the right of the main input and the Document node.


Azure Dropped Confirmation

You can use Stack AI’s native connection without extra configuration.

Optional: if you want to use the GPT models installed in your Azure platform, talk to your IT team to get them to configure everything on their end. Once you have the access details, click the Settings icon on the top right of the node.


Click Settings

A side tab appears. Scroll all the way down and fill out the fields API Key, API Endpoint and Deployment Name (Azure) to connect to your internal GPT models.


Fill Out API Details

You can pick which GPT model you’d like to use, provided it’s accessible via your Azure setup. Scroll up the side tab and select your favorite model. For this tutorial, we’ll use GPT-4o.


Select Model Configuration

2.3. Connect SharePoint

The response quality improves when you share more context with the LLM. We want to generate the best possible answer, so we’ll connect to SharePoint to pull key data for analyzing each document. On the left side, click to expand the Knowledge Bases section and locate the SharePoint node.


Access Knowledge Bases

The Knowledge Bases section offers nodes that help LLMs ground their responses in your data rather than relying only on their training data. Drag the SharePoint node onto the canvas, below the Documents node.


Drop SharePoint Connection

When using this tool, your team will upload a document and run the analysis. They won’t be writing prompts: they’ll just run the action, and the tool will start the workflow you’re building here. That means you can delete the main input at the top of the Documents node: click the cross icon.


Delete Input Node

Stack AI has a proprietary search algorithm that automatically finds the most relevant information in your SharePoint instance and sends it to the LLM for processing. To take advantage of this feature, connect the Documents input to the SharePoint node.


Connect Documents to SharePoint

Next, connect the SharePoint node to the LLM input box.


Connect SharePoint to LLM

Finally, connect the Documents node to the LLM.


Connect Documents to LLM

You need to connect Stack AI to SharePoint to enable search. On the SharePoint node, click the Settings icon.


Click Settings Again

The settings side tab appears on the right side of the screen. If you haven’t connected Stack AI to SharePoint before:

  1. Follow this guide on how to get the connection details from Microsoft Azure. Alternatively, contact your IT department and ask for the API information.
  2. Click the New connection button.

Initiate New Connection

A popup appears. Search for and click the SharePoint button…


Select SharePoint as Source

… and type in the connection details you got from Azure or your IT team in the input fields. Click Create connection when done.


Enter SharePoint Connection Details

In the side tab, click the Select connection dropdown and pick SharePoint. The interface updates with buttons to Test or Disconnect; click the former to check that everything is working properly. Notice that the node in the canvas also changes, with new status indicators and buttons.


Enter System Instructions

Note: this SharePoint connection will now be available for every new project you create in Stack AI. You can manage all your connections from the Stack AI dashboard: click your profile name > Settings > Connections.



2.4. Azure GPT-4o system instructions and prompt

The system instructions and the prompt tell the LLM how to process the tender documents. We’ll configure it to analyze the scope of work set by the bidder to help the reviewer understand it at a glance.

Here’s an example system prompt:

Example:

Question: What role do coral reefs play in marine biodiversity?
Answer: Coral reefs are crucial to marine biodiversity, serving as habitats for about 25% of all marine species. The document Marine Ecosystems Overview (Chapter 3, p. 45) explains that these ecosystems provide shelter, food, and breeding grounds for a wide variety of marine life. Furthermore, the section titled Biodiversity Hotspots in the book Oceanic Life (2022, p. 67) emphasizes that the loss of coral reefs due to environmental stressors could lead to a significant decline in marine biodiversity.
Reference List: Marine Ecosystems Overview. (2020). Sydney: Oceanic Press.
Oceanic Life. (2022). Miami: Marine Studies Publications.

Copy and paste the system instructions into the corresponding input field inside the Azure node.


Enter a Prompt

While the system instructions cover the big-picture guidelines, the prompt gets specific on what to analyze. Copy and paste the text below into the corresponding input field in the Azure node:

<document>{doc-0}</document>

Question: review the uploaded tender document and provide a detailed analysis of the scope of work as it pertains to the project.

Analysis Points:

1. Key Work Activities:

- Identify and list the main work activities or tasks that are explicitly mentioned in the scope of work.
- Provide details on the specific responsibilities, deliverables, or outcomes associated with each key activity.

2. Technical Requirements:

- Outline any technical specifications or standards that the work must meet.
- Detail any specific methods, materials, or processes that are required to complete the work according to the tender specifications.

3. Work Execution Guidelines:

- Summarize any guidelines, procedures, or best practices for how the work should be executed.
- Include any operational procedures, restrictions, or quality control measures mentioned in the scope.

4. Timelines and Milestones:

- Identify any deadlines, project timelines, or key milestones outlined in the scope of work.
- Note any phased work requirements, deliverable schedules, or critical path activities.

5. Resource Allocation:

- Note any requirements related to the allocation of resources, including personnel, equipment, or materials.
- Provide details on any specific resource levels, qualifications, or certifications needed to perform the work.

6. Risk and Compliance:

- Highlight any risk management strategies or compliance requirements mentioned in the scope of work.
- Include details on safety standards, environmental considerations, or regulatory obligations that must be met.

Response Format:
Present your findings in a structured, bullet-point format, starting from point 1.
Ensure that the summary is concise, focusing on the most relevant and critical details.

If any of the requirements have specific deadlines or dates that have already passed, clearly indicate those.
Use this context to evaluate the tender document:

<sharepoint>{sharepointemb-0}</sharepoint>

Prompt

Make sure the formatting (numbering, spaces, bullet points) is present and consistent. This will help the LLM interpret the instructions correctly.

The values in curly brackets are variables from other nodes:

  • {doc-0} contains the data from the uploaded tender document, present in the Documents node.
  • {sharepointemb-0} will contain the data pulled from SharePoint, represented in the canvas by the green SharePoint node.

When building more complex flows, you can add variables connected to multiple nodes, pushing dynamic data into your LLMs’ instructions and prompts. This improves the context and accuracy of the reasoning process, increasing the quality of the response.

If you’re having trouble controlling the LLM’s behavior, consider adding XML tags (like <context> docemb-0 </context>) to separate parts of the prompt. This will differentiate the purpose of each part of the instructions.
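To make the variable mechanics concrete, here’s a minimal sketch of the interpolation step. The template mirrors the prompt above (with the dashes dropped from the placeholder names, since Python format fields can’t contain them); it illustrates the concept, not Stack AI’s internal code:

```python
# Conceptual sketch of how node variables are injected into the prompt
# before it reaches the LLM. Placeholder names mirror {doc-0} and
# {sharepointemb-0} from the tutorial; the mechanics are illustrative.

PROMPT_TEMPLATE = (
    "<document>{doc0}</document>\n\n"
    "Question: review the uploaded tender document...\n\n"
    "Use this context to evaluate the tender document:\n\n"
    "<sharepoint>{sharepointemb0}</sharepoint>"
)

def build_prompt(document_text: str, sharepoint_context: str) -> str:
    # Each variable is filled from the node it is connected to.
    return PROMPT_TEMPLATE.format(
        doc0=document_text,
        sharepointemb0=sharepoint_context,
    )

prompt = build_prompt("Tender A: scope of work...", "Internal guideline: ...")
print(prompt[:40])
```

This is why connecting a node matters: an unconnected variable would simply have no data to fill its slot.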

2.5. Change the LLM settings

On the top right of the Azure node, click the Settings icon.


Click Settings Once More

The settings side tab appears again. We’ve explored the API connection settings at the bottom, but there are other important controls you can change.


Adjust Azure Settings

Hover over the information icon to reveal basic information about each setting. We don’t need conversational memory for this tutorial: the model will run once with the full context and provide a reply, and we won’t be asking any follow-up questions. Click the toggle to turn Memory off.


Turn Off Memory

The other setting we’ll change for this tutorial is Max Output Length. Even though we’re asking for a bullet-point summary of the tender document, the model may need to generate a lot of text to fully cover all the content. Slide up the Max Output Length to 2,000—you can also click the number and type this value if you prefer.


Set Maximum Output Length

That’s all you need to change if you’re following the tutorial. Here’s a walkthrough of the other major settings:

  • Provider and Model let you change the company and the AI model responsible for the reasoning in this node. Stack AI integrates with a wide range of providers, including OpenAI, Anthropic, Hugging Face and Mistral.
  • Citations offer links to the resources used to generate the answer. This will help you jump to the referenced SharePoint page.
  • You can also activate chart generation, personal information compliance and safety guardrails.
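If it helps to see what the two settings we changed map to, here’s a hypothetical chat-completion payload. The field names follow common LLM APIs, not Stack AI’s internal schema:

```python
# Hypothetical payload showing what the two settings correspond to in a
# typical chat-completion API: no conversation history (Memory off) and
# a 2,000-token output cap (Max Output Length). Field names are
# assumptions modeled on common LLM APIs, not Stack AI internals.

def build_request(system_instructions: str, prompt: str) -> dict:
    return {
        "model": "gpt-4o",
        "max_tokens": 2000,   # the Max Output Length slider
        "messages": [         # a single turn only, since Memory is off
            {"role": "system", "content": system_instructions},
            {"role": "user", "content": prompt},
        ],
    }

request = build_request("You are a tender analyst.", "Summarize the scope of work.")
print(request["max_tokens"], len(request["messages"]))
```

With Memory on, earlier exchanges would be appended to the messages list; turning it off keeps each run independent, which is exactly what batch processing needs.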

2.6. Connect the Azure LLM node to the output

The LLM needs to display its work to the user. Connect the Azure LLM node to the output node.


Connect Azure LLM to Output

3. Process hundreds of tender documents with a single click

Now we need to set up the user interface your team will use to extract insights from the tender documents.

Stack AI has a batch processing interface where you can upload hundreds of documents and click once to run the tool on each of them separately.

First, click the Publish button at the top right of the screen to save all the changes.

Important: whenever you make changes to a project, make sure to click the Publish button to save them.


Click Publish for LLM Connection

Then, click the Export tab at the top left of the screen.


Open the Export Tab

3.1. Choose the interface

This section has everything to customize the user interface for the tool. We’ll be using the batch interface: click the dropdown currently displaying Form and change it to Batch.


Select Batch Run Dropdown

3.2. Use the batch interface

The preview window updates to show an empty list. You can add tender documents for analysis by clicking the Add Run button…


Add Batch Run Task

… selecting Upload files in the first row…


Upload Batch Files

… and selecting a tender document on your computer. Make sure to add only one file to each row; uploading multiple files to a row will mix the tender documents together and create unpredictable results. Click the Add Run button to add more rows.

Once you add all the documents you want to process, click the Run Batch button once to process them all.

Please wait until the workflow is completed.


Run the Batch Process
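Conceptually, Run Batch just executes the same workflow once per row. A rough sketch, with analyze() standing in as a placeholder for the published workflow:

```python
# Conceptual sketch of what Run Batch does: run the same workflow once
# per uploaded file, one document per row. analyze() is a placeholder
# for the published Stack AI workflow, not a real API call.

def analyze(document_text: str) -> str:
    return f"Scope-of-work analysis for: {document_text[:20]}"

def run_batch(documents: dict[str, str]) -> dict[str, str]:
    # One independent run per row; each result lands in its own cell
    # of the Output 1 column.
    return {name: analyze(text) for name, text in documents.items()}

results = run_batch({
    "tender_a.pdf": "Tender A full text...",
    "tender_b.pdf": "Tender B full text...",
})
print(len(results))
```

Because the runs are independent, one malformed document only affects its own row rather than the whole batch.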

You can read the results for each row in the Output 1 column. If you’d like to take them to another app or share them with your team, click the 3-dot button and select Download CSV.


Download Batch CSV File
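Once downloaded, the CSV is easy to slice with standard tooling. A minimal sketch, assuming the default column names shown in the interface (Document 1, Output 1):

```python
# Minimal sketch of post-processing the downloaded batch CSV with the
# Python standard library. Column names are assumptions based on the
# default aliases shown in the interface (Document 1, Output 1).

import csv
import io

csv_text = """Document 1,Output 1
tender_a.pdf,"Scope covers earthworks and paving"
tender_b.pdf,"Scope covers electrical fit-out"
"""

with io.StringIO(csv_text) as f:
    rows = list(csv.DictReader(f))

# Map each uploaded document to its AI-generated analysis.
analyses = {row["Document 1"]: row["Output 1"] for row in rows}
print(analyses["tender_a.pdf"])
```

In practice you would pass the downloaded file’s path to `open()` instead of the inline `io.StringIO` sample used here.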

3.3. Customize the batch interface

On the left side of the preview screen, you’ll find a range of settings that control the visual aspects of the tool, some of its functionality and access control.


Adjust Batch Interface Settings

Here’s a breakdown of the top features:

  • Custom domain binds this interface to your company’s domain. The link to this tool will be easier to read, type and share.
  • Name and Description lets you add a few instructions to this tool, labeling its purpose so your team knows what it can do.
  • Fields lets you select which inputs and outputs show on the interface. You can also change the Alias to something more user-friendly. For example, you can change Document 1 to Tender Document and Output 1 to Scope of Work Analysis.
  • Security lets you enable password or interface access control protection, as well as requiring SSO (Single Sign-On) to use this interface. This way, even if the link to this tool is leaked, only your team will have access.

3.4. Analyze everything in the tender document

A tender document has many aspects to explore, so a general scope of work analysis won’t get you very far. We can improve this tool to pick apart every detail about the bid and place it in a corresponding column.

Click on the Workflow tab to go back to the editor canvas.


Navigate Workflow Back

3.5. Add one more Azure LLM node

We’re going to add a new LLM node to analyze the budget. Expand the LLMs section, locate the Azure node again…


Locate Azure Configuration

… and drag it into the canvas, under the first Azure LLM node.


Drop Second Azure Configuration

3.6. Copy the system instructions to the new node

Copy and paste the system instructions from the Azure node above to the new one below, as they govern the general behavior of the model.


Copy LLM System Instructions

3.7. Change the user prompt

The prompt needs to change so we can run a budget analysis. Copy and paste this example to the prompt input field of the second Azure node:

Question: review the uploaded tender document and provide a detailed financial analysis of the project.

<document>{doc-0}</document>

Analysis Points:

1. Project Cost Breakdown:

- Analyze the cost structure of the project, including materials, labor, equipment, and other relevant expenses.
- Identify any cost items that appear to be underestimated or overestimated.
- Evaluate the allocation of resources and ensure that costs are proportionate to the project scope.

2. Funding & Financing:

- Analyze the funding structure, including sources of financing, debt-to-equity ratio, and repayment terms.
- Assess the feasibility of securing the required funding based on the project’s financial projections and risk profile.
- Identify any potential issues with financing that could impact the project’s success.

Response Format:

- Present your findings in a structured, bullet-point format, starting from point 1.
- Ensure that the summary is concise, focusing on the most relevant and critical details.

Use this context to evaluate the tender document:

<sharepoint>{sharepointemb-0}</sharepoint>

Add Prompt to Second LLM

This analysis should sit in a new column in the batch interface, making it easier to separate the two assessments. To make that happen, you need to add an additional output node. On the left side menu, expand the Outputs tab…


Drag and Drop Output Configuration

… and drag an output node to the right side of the bottom-most Azure node.


Drop Output Node

Now, you need to connect these new elements. Let’s go step-by-step to make sure that everything is connected the right way.

First, connect the Documents node to the second Azure LLM node.


Connect Documents to LLM Output

Then, connect the SharePoint node to the second Azure LLM node.


Connect SharePoint to Azure LLM Output

Finally, connect the second Azure LLM to the new output node you dropped on the canvas.


Connect Azure LLM to Final Output

3.8. Update the batch interface

These changes will impact the batch interface. First, click Publish at the top right of the screen to save.


Click Publish for Final Output

To configure the user interface, click the Export tab at the top left.


Click Export for Outputs

We’re back to the Export tab. To activate the new output on the interface, scroll down to the Fields section and take a look at Outputs.


View Final Outputs

Tick out-0 to activate it on the interface. You can also change the Alias of each output to reflect the kind of AI analysis that happens in each column.


Activate Output and Labels

Click the Save Interface button at the top right to refresh the user interface preview.


Save Interface Settings

Notice that the Batch interface updates with the new column, labeled with the Alias we typed in. Now, if you click Run Batch, it will show the scope of work analysis and the financial analysis in separate columns.

Additionally, when you download the results as a CSV, these outputs will be in separate columns, making it easier to read and understand.


Add Financial Analysis Column

3.9. Add more LLM nodes for a complete analysis

You can add as many LLM nodes as you need to analyze every aspect of the document. For example, here’s an improved version of this tool with 5 LLMs and 5 outputs in total.

This tool will show 5 separate columns for the scope of work, financial analysis, project timeline, risk assessment and compliance.


Review Node Configuration

The process of adding more nodes and columns to the analysis is always the same.

Repeat the steps we went through in this section. Here’s a quick reminder:

  1. Drag and drop an LLM node onto the canvas.
  2. Connect the Documents input and the SharePoint node to the LLM.
  3. Set up the system instructions and prompt inside the LLM node.
  4. Drag and drop an output node onto the canvas.
  5. Connect the LLM node to the output.
  6. Go to the Export tab and toggle the output to display on the batch interface.

4. Try the Chat Assistant, Form and API interfaces

The batch interface is the best for processing multiple documents at once. But your tender document analysis needs may require other tools that offer either more flexibility or specialization.

  • Chat Assistant is the best for a flexible approach. You can set it up with one or multiple tender documents to ask free-form questions as you would to ChatGPT, for example.
  • Form is the best for quick analysis of a single file. Upload a document and run the analysis with or without a question and the tool will return the results on the page.
  • API is a powerful interface that lets you connect this tool to your internal systems. With it, you can integrate Stack AI with your CRM, ERP or other back-office tools to quickly send tender documents and get the AI-powered analysis results back into those apps.
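As a rough illustration of the API route, here’s how a request to the tool might be assembled from another system. The endpoint URL, header names and payload fields below are placeholders; copy the real values from the API section of the Export tab:

```python
# Hypothetical sketch of calling the tool via its API interface. The
# endpoint URL, header names, and payload fields are placeholders;
# copy the real values from the API section of the Export tab.

import json

def build_api_request(api_key: str, document_url: str) -> dict:
    return {
        "url": "https://api.example.com/v1/run/YOUR_FLOW_ID",  # placeholder
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        # Assumed payload shape: the document variable keyed by its
        # node name, matching the {doc-0} variable in this tutorial.
        "body": json.dumps({"doc-0": document_url}),
    }

req = build_api_request("sk-...", "https://example.com/tender_a.pdf")
print(req["headers"]["Content-Type"])
```

From here, your CRM or ERP integration would POST this request and write the returned analysis back into its own records.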

5. Share it with your team

Your tender document analysis tool is ready: it’s time to share it with your team. In Stack AI, it’s as easy as copying the link at the top of the preview window in the Export tab and pasting it into an email or internal chat channel.


Share Configuration Link

Remember to review the security settings for the interface. You can set up password protection and SSO requirements—that way, even if the link is leaked, the interface will still be locked to strangers.

If you have other team members building with Stack AI, share what you’re building by clicking the Share button at the top of the screen. This only shares a copy of your flows: your work can’t be changed by anyone else.

6. Keep track of usage with analytics

Keep track of the performance of your tender document analysis tool in the Analytics tab.

Click the corresponding tab on the top left to access the stats.


Open Analytics Tab

See the total number of runs, users, errors and used tokens, as well as a breakdown of the performance of the latest runs.


Select Columns Dropdown

See more data at a glance by adding more columns to the report list. Click the Columns dropdown at the middle left part of the screen and tick every property you want to see.
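If you ever export run data for your own reporting, the rollup the Analytics tab shows is straightforward to reproduce. The record fields below are hypothetical; the real export format may differ:

```python
# Sketch of the kind of rollup the Analytics tab shows (total runs,
# distinct users, errors, tokens), computed over a list of run
# records. The record fields are hypothetical.

runs = [
    {"user": "ana", "tokens": 1200, "error": False},
    {"user": "ben", "tokens": 900,  "error": True},
    {"user": "ana", "tokens": 1500, "error": False},
]

summary = {
    "runs": len(runs),
    "users": len({r["user"] for r in runs}),     # distinct users
    "errors": sum(r["error"] for r in runs),     # True counts as 1
    "tokens": sum(r["tokens"] for r in runs),
}
print(summary)
```

The same aggregation extends naturally to per-user or per-day breakdowns by grouping the records first.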

7. Stay up to date on the latest AI models

In addition to connecting more LLM nodes to analyze every aspect of each document, you can make sure your tools remain current with the latest AI models. Stack AI always has the latest models available, so upgrading your reasoning engines is as easy as picking a new provider or AI model in the LLM nodes.

In the Workflow tab, find your LLM nodes and click the Settings icon.


Update LLM Engine Settings

Adjust the Provider and Model dropdowns to browse and select the latest AI models.


Change Provider and Model

Remember to hit the Publish button whenever you make these changes, and run a test to ensure everything works smoothly.

Wrapping up

Finding the best proposal out of hundreds or thousands of submissions requires a lot of time, people and cognitive power. Don’t settle for basic productivity tools: with Stack AI, you can ease the burden on your teams, greenlighting projects faster—and, as a result, reaping the benefits much sooner.

This is just the beginning of what you can create with Stack AI. Create a free Stack AI account and explore our AI tool tutorials.