Notion-powered AI Agents: Use Notion as a Data Source with Flowise and Notion MCP

We can build chatflows and agentflows (AI multi-agent systems) in Flowise AI. AI Agents become especially useful when they have context and content about your specific knowledge, information, and documentation. The Agent can then use a RAG (retrieval-augmented generation) system to generate answers or take actions in a way that fits your specific preferences, needs, and overall way of being, which is embedded uniquely within each person or business: our identity.

In this essay, I explore how we can build an Agentflow in Flowise AI with Notion MCP to retrieve content from your Notion workspace. This could be part of an external or internal AI chat that you interact with (Flowise also provides a UI that can be embedded into any website or page that supports embedding JavaScript). Or it could be an internal tool that you build specifically to help you achieve something: to extend your capabilities in specific areas, or to do work on your behalf, like having a team of AI Agents to which you delegate tasks.

Store data in Notion databases and pages

The first step is to build the infrastructure that stores the data. We can create one or more Notion databases for this. For knowledge-retrieval purposes, a single database or several related databases both work: for example, databases storing internal company memos, handbooks, blog posts, documentation, guidelines, principles, and company aspirations.

Notion-style Callout
If your Notion workspace is large and you want to search only a few databases and pages, it is best to place all of those target objects into a single Page or Teamspace and use that as a filter for the Notion MCP search tool.

Create the Flowise Agentflow or Chatflow

Some time ago, I published a series of demos explaining one option for creating a Chatflow in Flowise AI and connecting it with Google Sheets and vector databases. You can check out the videos here. Flowise and the AI landscape in general have undergone updates and improvements since then, though the core logic remains the same.

In Flowise AI, we create an Agentflow (template available here) that utilizes LLMs and webhook calls to process data and take action. An Agentflow is suitable for use cases where we wish to deploy multiple AI Agents or retrieval systems. AI Agents can also take actions, not just search and output information. If we want to develop a conversational tool only, a Flowise Chatflow is enough.

There are four steps (nodes) in the Agentflow:

  1. Start the Agentflow — this is the trigger of the automated workflow. It can be a chat message or a form submission. In my example, the trigger is a chat message.

  2. Agent (e.g., OpenAI, Anthropic, etc.) — we pass in the chat input and the LLM formats it as valid JSON that we can pass to the next node. The “input message” I used is this one:

    Format this input as valid JSON. Input: {"query":{{question}}}. Output only the JSON and nothing else. Only output the final JSON, without any commentary or extra characters.

  3. HTTP — POST a webhook request to the Make scenario (details on this in the next section), including the JSON parameters defined in the output of node #2, which we can map here by using double curly brackets.

  4. Agent that interacts with the user (elaborates and sends a chat reply). Here we can use additional vector stores (e.g., Pinecone, Supabase) to enrich and customize the output even further. As the input message, we map the output of the HTTP node, which includes the Notion MCP content and context. As the System message, we define the role, objectives, and scope of the AI Agent (a practice referred to as prompt engineering, which is outside the scope of this essay).

These four steps are the foundation of the AI Agentflow in Flowise AI. We can add further nodes to customize the setup even more, but that is outside the scope of this essay.
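As a sketch of what nodes #2 and #3 do together, here is how the raw chat input gets wrapped into a JSON body and forwarded to the Make webhook. The webhook URL is a placeholder for your own Make address, and `fetch` stands in for the Flowise HTTP node:

```javascript
// Node #2: wrap the raw chat input in the JSON shape the HTTP node expects.
const question = "What is our refund policy?";
const payload = JSON.stringify({ query: question });
// payload is now: {"query":"What is our refund policy?"}

// Node #3: POST the body to the Make webhook (URL below is a placeholder).
// In Flowise the HTTP node handles this; shown here with fetch for clarity:
// await fetch("https://hook.make.com/<your-webhook-id>", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: payload,
// });
```

In the actual Agentflow, the LLM in node #2 produces this JSON from the `{{question}}` variable, and the HTTP node maps it via double curly brackets, so no code is written by hand.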

Use Notion MCP to Search Data via Make and Send the Response Back to Flowise AI

In Make, we create a scenario that gets the Flowise AI message (and, optionally, the conversation history), formats the message as a clear, single query for the Notion MCP, uses the Notion MCP server to search across our Notion workspace, and sends the output back to Flowise, so the Agent can provide a context-aware, accurate reply.

Notion-style Callout
Get the Make scenario template here.

The Make scenario is composed of four modules:

  1. Webhook — this is the URL that receives the Flowise AI HTTP request (see step #3 in the previous section), including the JSON body that carries the chat message.

  2. LLM (e.g., OpenAI, Anthropic, or the native Make AI modules) — this module processes the JSON body and outputs a clear query that we send to the Notion MCP call to retrieve the most relevant data. The prompt I used in the demo is this:

    Frame this query as a clear question optimized to search across our internal database via the Notion MCP. Query: {{1.query}}. Include source content URL

  3. MCP Client (Call tool) — this module uses the Notion MCP search tool to retrieve data from the connected Notion workspace. In the query field, we map the output of the LLM (i.e., the framed query) from module #2 above.

  4. Webhook response — this module sends a 200 (success) response to the webhook request, including the Notion MCP parsed_results output in the response body. This sends the results back to Flowise (HTTP node) for use in the AI response.
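The four Make modules above can be sketched as plain functions. This is an illustrative model, not Make's own API: `searchTool` is a hypothetical callback standing in for the Notion MCP search call (module #3), and the query framing in module #2 is a trivial stand-in for the LLM step:

```javascript
// Module #2: reframe the raw chat input as a clear search question.
// In Make this is done by an LLM; a simple template stands in here.
function frameQuery(rawQuery) {
  return `Search our internal Notion databases for: ${rawQuery.trim()}. Include source content URL.`;
}

// Modules #1 and #4: parse the incoming Flowise request body and
// return a 200 response whose body carries the MCP search results.
function handleWebhook(body, searchTool) {
  const { query } = JSON.parse(body);
  const results = searchTool(frameQuery(query)); // Module #3: MCP search call
  return { status: 200, body: JSON.stringify({ parsed_results: results }) };
}

// Example run with a fake search tool (ignores the query, returns one hit):
const fakeSearch = (q) => [{ title: "Refund policy", url: "https://notion.so/..." }];
const response = handleWebhook('{"query":"refund policy"}', fakeSearch);
```

The design point is simply that Flowise only ever sees module #1's URL and module #4's response body; everything in between can change without touching the Agentflow.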

Build a User Interface (using Bolt/Cursor and Embedded Flowise Chat)

Once all of this is live, we can use the Flowise AI chat interface, or optionally build a custom user interface. The choice depends on your specific workflow, user base, aims, and context. If you build a custom user interface, Flowise AI provides customizable HTML code to embed the Agentflow/Chatflow on any website or app.

You may embed the chat into an existing website, or build a User Interface from scratch using an AI app-building tool such as Bolt, Cursor AI, V0, or any other method you wish.
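For reference, Flowise's embed snippet looks roughly like the following (it goes inside a `<script type="module">` tag on your page). The `chatflowid` and `apiHost` values are placeholders you replace with your own, so treat this as a sketch of the shape rather than copy-paste-ready code:

```javascript
// Flowise embedded chat widget, loaded from the flowise-embed package on a CDN.
import Chatbot from "https://cdn.jsdelivr.net/npm/flowise-embed/dist/web.js";

Chatbot.init({
  chatflowid: "<your-chatflow-id>",            // from the Flowise dashboard
  apiHost: "https://your-flowise-instance.com", // placeholder host
});
```

Flowise exposes this snippet (with your real IDs filled in) directly in the Agentflow/Chatflow's Embed panel.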

Here is a post (and videos) I published some time ago regarding creating Flowise AI chatflows with custom data retrieval.

Pricing Breakdown of Tools Usage

Notion-style Table
Tool | Cost/year | Notes
Flowise AI | USD 0–400 | Pricing shown for the cloud account. Self-hosting is possible and cheaper, but requires some technical knowledge.
OpenAI / Anthropic / Other | Varies | Depends on the LLM provider and model selected; check the official documentation for current pricing.
Make | USD 900–2000 | Here is my partner link if you wish to get one month free on the Pro plan.