How to automate investment memo writing with AI

Paul Omenaca

@houmland

Investing in established companies can offer significant returns, whether by transforming a seasoned business for additional growth or capitalizing on existing market share. However, analyzing such opportunities typically requires synthesizing massive amounts of information—from shareholder letters and due diligence questionnaires to market outlooks and performance sheets. This process often runs 118 hours per deal, eating into resources and extending decision timelines.

That’s where AI can help. By automating first drafts and populating repetitive, structured sections of the memo, AI not only speeds up the creation of comprehensive reports but also provides an overarching view of the target company’s viability. This can reduce the memo-building workload by up to 40%, potentially saving over 42 hours per opportunity.

In this article, you’ll learn how to construct an AI investment memo generation tool that aggregates all the data about a potential private equity purchase. This will help you focus on the strategic insights and determine whether it’s worth pursuing.

How to build an AI investment memo generation tool

  • Using the AI investment memo generation tool
  • Add inputs
  • Set up dynamic vector stores
  • Plug in data sources
  • Drop and link LLMs
  • Publish the tool
  • Set up the user interface
  • Share it with your team
  • Keep track of usage and analytics
  • Improve your AI investment tool

Using the AI investment memo generation tool

Here’s how the AI investment memo tool will look once you complete this tutorial.


Finished tool

Your team will be able to:

  • Type the name of the company they’d like to research.
  • Write additional instructions to steer the generation process.
  • Add URLs that contain valuable information about the company. The tool will automatically visit those links and extract the data from them.
  • Upload internal reports with company data that should be included in the memo.

Once the process is complete, you can download an AI-generated report based on all the information added and data available from your internal sources.

1. Add inputs

Reports depend on the amount and quality of the data you ingest into the project’s AI models. Since creating an investment memo requires breadth and depth, we should add multiple inputs to let your team control how much data they’d like to include in the report.

1.1 Adding a company name

The user will enter the target company name in the first input. We’ll use the input node that’s already present on the canvas.


Use default input

Due to this project’s complexity, we’ll use Stack AI’s organization features. Click the edit node name icon at the top right of the node.


Click rename button

Change the node’s name to Company name and click Save.


Change name and save

1.2 Setting additional instructions

Your team may want to analyze a specific angle of the company data instead of creating a general report. Click the Inputs section to expand it, then drag and drop an Input node onto the canvas.


New input

Click the edit node name icon and change it to Additional instructions.


Rename additional instructions and save

1.3 Uploading reports

Some company information may already be compiled in existing documents and files. Instead of copying and pasting their contents into an input field, we can let your team upload those files directly. Click to expand the Input section, then drag and drop a Files input node onto the canvas.


Drop files input

Make sure this file input is exposed in the user interface by toggling the Expose as input setting.


Make sure expose as input is activated

The name of this node can’t be edited, but we can add a note. Click the note icon on the top right of the node.


Files click notes

This opens a text area that supports Markdown syntax, so you can use it to format text. Write # Upload reports. The hash sign turns the note into an H1 heading, making it easier to see.


Add note to files

1.4 Including URLs

Part of the research might involve searching the web for pages that contain company data. Again, instead of visiting those pages and copying and pasting the content into the tool, we can let your team paste the URLs; Stack AI will scrape the content and make it available in the report. Click to expand the Input section, then drag and drop a URL input node onto the canvas.
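Under the hood, the scraping step boils down to fetching each page and keeping only its readable text. Here’s a minimal Python sketch of that idea, assuming the requests and beautifulsoup4 packages are available; it isn’t Stack AI’s implementation, just an illustration of what happens to each link your team pastes.

```python
# Minimal sketch of URL scraping (illustrative only, not Stack AI's implementation).
# Assumes the `requests` and `beautifulsoup4` packages are installed.
import requests
from bs4 import BeautifulSoup

def extract_page_text(url: str) -> str:
    """Fetch a page and return its visible text, stripped of markup."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Drop scripts, styles, and navigation chrome before extracting text.
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

if __name__ == "__main__":
    # Hypothetical investor-relations URL; replace with a real page.
    print(extract_page_text("https://example.com/investor-relations")[:500])
```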


Drop url node

Untick the Enable URL as input checkbox. In this case, this will disable the node’s input handle within the project—we won’t need it.


Untick enable as input

As with files, the URL node’s name can’t be edited. Click the note icon on the top right of the node and write # Reference URLs.


Change note to reference url

1.5 More inputs

We’ll use 4 inputs for this tutorial, but you can add as many as you need, depending on the data types and instructions you want to pass into the AI models. Stack AI supports inputs for text, documents, URLs, audio files, images and YouTube videos.


View of all inputs


2. Set up dynamic vector stores

In simple terms, a vector store is an AI-friendly database. It can store large amounts of data and retrieve the most relevant chunks based on a user prompt, forwarding them to an LLM to ground the response. Since the inputs we just added to the project may contain a lot of data, we’ll use a Dynamic Vector Store. This node was developed by Stack AI to quickly store and return the most relevant data based on the prompt.
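If you’re curious what retrieving the most relevant chunks looks like in practice, the sketch below shows the general mechanics behind any vector store: each chunk is turned into a vector, the prompt is turned into a vector, and the chunks whose vectors are most similar to the prompt’s are returned. It uses a toy bag-of-words stand-in for embeddings purely for illustration; real vector stores, including Stack AI’s, use learned embedding models.

```python
# Toy illustration of vector-store retrieval: embed chunks, embed the query,
# return the chunks most similar to the query. Real vector stores use learned
# embedding models; word counts stand in for embeddings here to show the mechanics.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in 'embedding': a bag of word counts."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[word] * b[word] for word in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(chunks: list[str], query: str, top_k: int = 2) -> list[str]:
    """Return the top_k chunks most relevant to the query."""
    query_vec = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine_similarity(embed(c), query_vec), reverse=True)
    return ranked[:top_k]

if __name__ == "__main__":
    chunks = [
        "Revenue grew 12% year over year, driven by the enterprise segment.",  # made-up example data
        "Management highlighted supply-chain risk in its Q3 remarks.",
        "The company relocated its offices to a larger campus in 2021.",
    ]
    print(retrieve(chunks, "Extract company financial information and risk factors"))
```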

On the left side menu, click to expand the Dynamic Vector Stores section. Then, drag and drop a Basic node onto the canvas.


Drop dynamic vector store one

The company name and additional instructions inputs shouldn’t contain a lot of data, so we don’t need to connect them to the vector store. Instead, connect the Files input field to the dynamic vector store’s data loader handle.


Connect files to dvs

All uploaded files will be stored in this node. We’ll need a second one for the URLs, since the web page data can be extensive. Drag and drop another Dynamic Vector Store Basic node onto the canvas, close to the first one.


Drop dvs two

Connect the URLs input node to the second dynamic vector store’s data loader handle.


Connect url to dvs two

Both vector stores need a prompt to extract the most relevant data. To simplify the user experience, we can add a default prompt that retrieves the same kind of information every time. Click to expand the Utils section, then drag and drop a Default Message node onto the canvas.


Drop default message

Write the following in the Default Message node:

Extract company financial information, risk factors, product information and management remarks.

You can change this prompt depending on what kinds of information you want to extract from URLs and your documents. If they’re widely different, you can use two Default Message nodes with two different prompts instead.


Write prompt in default message

Connect the Default Message node to the input query handles of both Dynamic Vector Store nodes.


Connect default message to both dvs

3. Plug in data sources

Now that we’ve handled the user inputs, it’s time to connect other data sources to the project. For this tutorial, we’ll connect a web search and a Microsoft Sharepoint instance. You can connect as many data sources as needed to bring in all company information without exporting it from the platforms where it’s stored.

On the left side menu, click to expand the Knowledge Bases section. Then, drag and drop a Web Search node onto the canvas, close to the Company name input.


Drop a web search

This node needs a keyword to understand what to search for. Connect the company name input to the Web Search.


Connect main input to web search

Connecting Microsoft Sharepoint

In the Knowledge Bases section, drag and drop a Microsoft Sharepoint node onto the canvas.


Drop sharepoint

This node integrates with your Sharepoint instance, offering similar functionality to a dynamic vector store: Stack AI created a proprietary algorithm that efficiently searches for the most relevant information based on a prompt.

Read the documentation for instructions on how to connect Microsoft Sharepoint to Stack AI. When fully connected, you’ll see your Sharepoint resources inside the node.

To simplify the user experience, we’ll add a default prompt to retrieve information from Sharepoint. On the left side menu, click to expand the Utils section. Then, drag and drop a Default Message node onto the canvas.


Drop default message

Connect the Default Message to the Sharepoint node.


Connect default message to sharepoint

Here’s the prompt that we’ll use for this tutorial:

Extract market growth, inflation, macroeconomic conditions.

This prompt is highly dependent on what your data source contains. If you’re storing other kinds of information here, consider adjusting this prompt to surface them instead.

Write the prompt in the Default Message input field.


Write default message for sharepoint

Other data source connections

Stack AI offers a wide range of data connections with major platforms—you can see the full list in the Knowledge Bases and Databases sections on the left side menu.

Beyond Sharepoint, you can also bring in data from Google Drive, Dropbox, or AWS S3, for example. You can integrate all your data sources and reuse them across projects. For additional organization and security, you can create knowledge bases inside Stack AI and assign files and folders to each one, setting role-based access control for each.

All these nodes include the Stack AI proprietary search algorithm we mentioned earlier. As such, they can’t be connected to a dynamic vector store: they’re ready to be connected to LLMs directly.

4. Drop and link LLMs

In this tutorial, we’ll use a linear AI generation architecture: a chain of 5 LLMs, each tasked with writing a section of the investment memo.
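Conceptually, this chain is a sequence of model calls that share the same context: each call drafts one section, and downstream calls can also see what was drafted before them. Here’s a rough Python sketch of that architecture, with a placeholder call_llm function standing in for whatever model provider you use; it mirrors the structure we’re building on the canvas rather than Stack AI’s internals.

```python
# Rough sketch of a linear chain of section-writing LLM calls (illustrative only).
# `call_llm` is a placeholder for a real model API call (e.g., a chat-completion request);
# the point is the architecture: one call per memo section, all sharing the same context.

SECTIONS = [
    "Executive summary",
    "Company overview",
    "Market opportunity",
    "Investment strategy",
    "Summary",
]

def call_llm(system_prompt: str, user_prompt: str) -> str:
    """Placeholder: swap in your model provider's chat-completion call here."""
    return f"[draft bullet points for: {system_prompt.splitlines()[-1]}]"

def build_memo(company: str, context: str) -> str:
    drafted = []
    for section in SECTIONS:
        system_prompt = (
            "You are an AI assistant helping draft an investment memo for a PE fund.\n"
            "- Write in bullet points. Only return the bullet points.\n"
            f"- Your section is {section}. Draft just that section."
        )
        user_prompt = (
            f"Company to analyze: {company}\n"
            f"Context from data sources: {context}\n"
            "Sections drafted so far:\n" + "\n\n".join(drafted)
        )
        drafted.append(call_llm(system_prompt, user_prompt))
    # Assemble the memo with one heading per section.
    return "\n\n".join(f"## {name}\n{text}" for name, text in zip(SECTIONS, drafted))

if __name__ == "__main__":
    # Hypothetical company name and stand-in context, for illustration only.
    print(build_memo("Acme Industrial Holdings", "…retrieved chunks would go here…"))
```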

4.1 LLM for executive summary section

On the left side menu, click to expand the LLMs section. Drag and drop an OpenAI LLM node onto the canvas.


Drop openai one

The node lands on the canvas with GPT-4o mini selected by default. Click the dropdown and change it to GPT-4o for its advanced reasoning capabilities.


Choose 4o

Let’s connect the inputs and data sources to this node. Start with the company name input and connect it to the OpenAI LLM node.


Company name to llm

Connect the additional instructions input to the OpenAI LLM node.


Instructions to llm

Repeat for the Web Search node.


Web search to llm

Continue by connecting the vector store containing the files to the OpenAI LLM.


Veczero to llm

Next, connect the vector store containing the scraped URL data to the OpenAI LLM.


Vecone to llm

Finish by connecting the Microsoft Sharepoint node to the OpenAI LLM.


Sharepoint to llm

We can’t change the name of LLM nodes, but we can add a note. Click the note icon at the top right of the node and write Executive summary.


Add note to llm one

Copy and paste the following into the instructions input field:


_You are an AI assistant helping draft an investment memo for a PE fund._

_\- Write in bullet points. Only return the bullet points._

_\- You will focus on one section only._

_\- Be verbose._

_\- Your section is Executive Summary. Draft just that section._

_\- Don't mention the section you are writing as a title._

_Add citations at the end as: \[1\] www.google.com, \[2\] document.pdf_


Add instructions to llm one

Next, copy and paste the following user prompt into the corresponding input field:

_Company to analyze: {in-0}_

_Additional instructions: {in-1}_

_Market information to use: {knowledgebase-0}_

_Web scraped content: {vec-1}_

_Additional documents to be used: {vec-0}_

_Web search: {websearch-0}_

Add user prompt to llm one

The values in curly brackets are variables. They’ll be replaced with the content generated by each corresponding node. If you add more data sources in the future, remember to connect them to the LLM and then add their variables to the prompt.
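The substitution itself is plain string templating: at run time, each placeholder is swapped for the text its node produced. Stack AI handles this for you; the short sketch below just illustrates the mechanics, using hypothetical example values.

```python
# Illustration of how placeholders like {in-0} and {vec-1} are filled in at run time.
# Stack AI performs this substitution automatically; the values below are hypothetical.

prompt_template = (
    "Company to analyze: {in-0}\n"
    "Additional instructions: {in-1}\n"
    "Web scraped content: {vec-1}\n"
)

node_outputs = {
    "in-0": "Acme Industrial Holdings",           # hypothetical company name
    "in-1": "Focus on margin expansion levers",   # hypothetical instruction
    "vec-1": "…chunks retrieved from the scraped URLs…",
}

prompt = prompt_template
for node_id, output in node_outputs.items():
    prompt = prompt.replace("{" + node_id + "}", output)

print(prompt)
```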

4.2 LLM for company overview section

Drag another OpenAI LLM node and drop it to the right of the first one.


Drop llm two

Connect the first OpenAI LLM’s output to the second’s input handle.


Connect llm one to llm two

You don’t need to connect all the previous data sources and inputs to the second OpenAI LLM. They’re all available through the connection to the first LLM.

This new node will work on the company overview section. Copy and paste the following system instructions into the corresponding input field inside the node. Notice that they’re the same as the first LLM’s, except for the bolded line:

_You are an AI assistant helping draft an investment memo for a PE fund._

_\- Write in bullet points. Only return the bullet points._

_\- You will focus on one section only._

_\- Be verbose._

**_\- Your section is company overview. Draft just that section._**

_\- Don't mention the section you are writing as a title._

_Add citations at the end as: \[1\] www.google.com, \[2\] document.pdf_


Paste instructions llm two

The system instructions will already change the focus and behavior of this new OpenAI LLM. All that’s needed is to pass the variables with the data. Copy and paste the same user prompt into the corresponding input field:

_Company to analyze: {in-0}_

_Additional instructions: {in-1}_

_Market information to use: {knowledgebase-0}_

_Web scraped content: {vec-1}_

_Additional documents to be used: {vec-0}_

_Web search: {websearch-0}_


Paste user prompt into llm two


Edit the second OpenAI LLM’s note to display Company overview by clicking the corresponding icon at the top right of the node.


Add note to llm two

Finally, open the dropdown menu and select the GPT-4o model.


Select gpt4o llm two

4.3 LLMs for the remaining sections

You can repeat these steps as many times as needed to cover all the sections. As you do so, update the line in the system instructions that names the section each LLM should focus on.

Here’s how this architecture would look for an investment memo covering:

  • Executive summary
  • Company overview
  • Market opportunity
  • Investment strategy
  • Summary

Connect last llm to output

4.4 Finish with an output node

Once you complete the LLM chain to create all the sections, connect the last LLM to the output node.


Click output settings

To include the results from all the LLMs, you can configure a template for the output. Click the settings icon at the top right of the output node.


Click to add template

Click the Add template button.


Set templated output

You can structure the output here and include any variables to populate each section. Here’s an example template you can start from:

_Investment Opportunity in_

_{in-0}_

_Executive summary_

_{llm-0}_

_Company Overview_

_{llm-1}_

_Market Opportunity_

_{llm-2}_

_Investment Strategy_

_{llm-3}_

_Summary_

_{llm-4}_

This node supports Markdown syntax, so you can turn each section name into a heading. You can do so by typing a hash (#) before each heading or by using the rich-text formatting tools at the top of the input field.
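For example, a Markdown-formatted version of the template above could look like this, using the same variables with headings added:

_# Investment Opportunity in {in-0}_

_## Executive summary_

_{llm-0}_

_## Company Overview_

_{llm-1}_

…and so on for the remaining sections.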

5. Publish the tool


Click publish

On the top right of the screen, click the Publish button.

Every time you make changes to the workflow of the project, remember to click Publish to save them and make them available to your team.

6. Set up the user interface


Click export

Click the Export tab on the top left side of the screen.


Interface selection dropdown

Stack AI exports new projects as a Form interface by default, though other options, such as Chat Assistant or Batch, are also available. A form is exactly what we need for this project, so you can leave the default as it is.


Domain name description

Add a custom domain to make the tool easier to access. At the same time, you can add a name and a description of the tool to help your team understand what it can do and how to interact with it.


Fields

In the Fields section, you can show or hide input fields in the interface. For this tool, we want all of them visible, so make sure every checkbox is ticked. To improve the user experience, change the Alias of each input to reflect the kind of information it should receive. In this case:

  • in-0 as Company name
  • in-1 as Additional instructions
  • url-0 as URLs for scraping
  • doc-0 as Reports
  • out-0 as Investment memo

Security

Since this interface integrates with your internal data sources, you need to keep it secure from unauthorized access. Beyond the privacy and security standards already active at the platform level, Stack AI offers security controls to protect these tools. In the Security section, you can set a password lock, enable SSO protection for your organization or for specific email addresses, and whitelist the web pages where you plan to embed this interface.


Click save interface

When you’re done making changes to this interface, always click the Save Interface button at the top-right corner of the screen.


Copy sharing link

7. Share it with your team

Sharing this project with your team is as easy as copying the link at the top of the preview window and pasting it in an email or internal communication channel.


Share with builders

You can also share a copy of the project with other Stack AI builders on your team. Click the Share button and add their emails. They’ll be able to view and experiment with the project, but any changes they make won’t be reflected in yours.


Click analytics

8. Keep track of usage and analytics

Stack AI keeps track of usage as your team interacts with the tool. Click the Analytics tab at the top-left side of the screen.


Analytics screen

From top to bottom, you’ll be able to:

  • Filter results by date range
  • Generate an AI report or download logs as a CSV
  • See a breakdown of runs, users, errors, and tokens
  • Review a list of recent tool runs (you can hide or show columns using the dropdown at the top right of the table)

9. Improve your AI investment memo generation tool

After you share the AI investment memo tool with your team, keep an open line of communication to receive feedback and understand how to improve it. This can include refining the system and user prompts, adding more nodes, or connecting them differently.

Beyond these changes, you can keep up with the latest AI models as they’re released. Stack AI makes it easy to upgrade. Simply click the settings icon in each of the LLM nodes…


Click llm settings

… and change the provider and model dropdowns to do so.


Change provider and model

Remember to run a test of the newer model to see how it changes the tool’s functionality, and click Publish to push the changes to the user interface.

Wrapping up

Creating a thorough investment memo requires a lot of research, plenty of analysis time, and gathering all the details in a single place to help you make a decision. While AI can’t decide for you, it can help you gather, organize, and explore all the data you need. This way, you don’t have to waste time on rote research tasks, freeing your mind to weigh whether the investment at hand is a good choice.

But this is just the beginning of all you can automate with Stack AI. Create a free account and explore our other tutorials: