
FlowiseAI: Building a Node Drag-and-Drop Interface for Custom LLM Applications

General Introduction

FlowiseAI is an open-source, low-code tool designed to help developers build custom LLM (Large Language Model) applications and AI agents. With a simple drag-and-drop interface, users can quickly create and iterate on LLM applications, making the path from testing to production much more efficient. FlowiseAI also provides a rich set of templates and integration options, making it easy for developers to implement complex logic and conditional settings for a variety of application scenarios.



 

Feature List

  • Drag-and-drop interface: build custom LLM flows with simple drag-and-drop operations.
  • Template support: multiple built-in templates to quickly start building applications.
  • Integration options: supports integration with tools such as LangChain and GPT.
  • User authentication: supports username and password authentication to keep applications secure.
  • Docker support: provides Docker images for easy deployment and management.
  • Developer friendly: supports a variety of development environments and tools for further customization and extension.
  • Rich documentation: detailed documentation and tutorials help users get started quickly.

 

Usage Guide

Installation Process

  1. Download and install Node.js: make sure the Node.js version is >= 18.15.0.
  2. Install Flowise:
   npm install -g flowise
  3. Start Flowise:
   npx flowise start

If you need username and password authentication, you can use the following command:

   npx flowise start --FLOWISE_USERNAME=user --FLOWISE_PASSWORD=1234

  4. Access the application: open http://localhost:3000 in your browser.

Usage Process

  1. Create a new project: in the Flowise interface, click the "New Project" button, enter a project name, and select a template.
  2. Drag and drop components: drag the desired components from the left toolbar into the workspace and configure their properties.
  3. Connect components: drag connecting lines between components to form a complete flow.
  4. Test the application: click the "Run" button to test the application's functionality and results.
  5. Deploy the application: after testing is complete, the application can be deployed to a production environment and managed and maintained using Docker images.

Key Feature Operations

  • Integrating LangChain: in the component configuration, select the LangChain integration option and enter the relevant parameters to connect seamlessly with LangChain.
  • User authentication: add the FLOWISE_USERNAME and FLOWISE_PASSWORD variables to the .env file; user authentication will be enabled automatically when the application starts (see the sketch after this list).
  • Using templates: choose a suitable template when creating a new project to quickly build common applications, such as PDF Q&A or Excel data processing.
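
As a minimal sketch (assuming, as described above, that Flowise reads a .env file in its working directory), the authentication variables might look like the following; PORT is an extra assumption, and the exact set of supported variables can vary between Flowise versions:

   # .env
   FLOWISE_USERNAME=user
   FLOWISE_PASSWORD=1234
   # PORT is assumed to control the listening port; adjust or omit as needed
   PORT=3000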

Common Problems

  • Out of memory: if you run out of memory during the build, you can increase the Node.js heap size:
   export NODE_OPTIONS="--max-old-space-size=4096"
   pnpm build
  • Docker deployment: use the following commands to build and run a Docker image (a variant with data persistence follows below):
   docker build --no-cache -t flowise .
   docker run -d --name flowise -p 3000:3000 flowise
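
If you also want chatflows and credentials to survive container restarts, a hedged variant of the run command mounts a host directory as a volume; the /root/.flowise path is an assumption about the image's default data directory and may differ between Flowise versions:

   docker run -d --name flowise \
     -p 3000:3000 \
     -e FLOWISE_USERNAME=user -e FLOWISE_PASSWORD=1234 \
     -v ~/.flowise:/root/.flowise \
     flowise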

With the above steps, users can quickly get started with FlowiseAI, build and deploy custom LLM applications, and improve development efficiency and application performance.

 

Case Study: Building an Automated News Writing System with FlowiseAI

Flowise Multi-Agent Workflow Diagram


 

Flowise Configuration Flow

1. We use Flowise to build a news writing system. First, create a new agent flow under Agentflows in Flowise and name it "News Writing System", as shown below:


 

2. Drag a Supervisor and three Workers into the canvas, then name and connect them as shown below:


 

3. Set the prompt for each agent:

# Supervisor

You are a Supervisor responsible for managing communication between the following workers: `{team_members}`.

## Task Flow

1. **Send a task to worker1**
Instruct worker1 to search for the latest news.

2. **Wait for worker1 to return results**
Pass the latest news content returned by worker1 to worker2.

3. **Wait for worker2 to complete its task**
Instruct worker2 to write the news into an article, then pass the article content to worker3.

4. **Confirm task completion**
Ensure that worker3 successfully saves the article, then report that the task is complete.

## Notes

- Always schedule tasks in an accurate and coordinated manner.
- Ensure that each step is complete and nothing is omitted.

 

# worker1

You are a news search engine responsible for providing callers with up-to-date news information. Here are your specific task requirements:

1. **Search for the latest 10 news items**: based on the incoming request, find the latest news content that matches the criteria.
2. **Extract key information**: from the retrieved news, extract the following:
- **Title**: the news title
- **Summary**: a short summary of the news content
- **Source**: a link to the news
- **Core points**: the central point or main message of the news
3. **Return clearly structured information**: return the above information to the caller in a clear format.

### Example output:

- **Title**: [news title]
- **Summary**: [news summary]
- **Source**: [news link]
- **Core points**: [news core points]

### Notes:

- **Timeliness**: ensure that the news provided is up-to-date.
- **Accuracy**: ensure that the information extracted is accurate.

 

# worker2

### Task Description
1. **Write a complete, fluent article based on the news title, summary, and source provided**: ensure the article is logically clear, sticks to the information provided, and reads naturally.
2. **Language requirements**: be concise, avoid lengthy expressions, and get to the point.
3. **Formatting requirements**:
- The title is on a separate line and stands out.
- The text should be reasonably segmented and hierarchically organized for ease of reading.

### Sample Outputs
The following is the basic structure and sample format of the article:

```markdown
# News headline (centered or on a separate line)

The first paragraph of the body: introduce the topic of the news and state the background or core of the event.

The second paragraph of the body: describe the main content of the news in detail, adding the necessary details to make the content more substantial.

The third paragraph: analyze or comment on the significance of the news event, its possible impact, or the next stage of development.

The fourth paragraph (optional): summarize the whole article, echo the opening, and leave an impression on the reader.
```
 

# worker3

Your task is:

1. Receive the complete article content, including the title and body.
2. Name the file according to the title, making sure the file name is concise and meaningful (e.g., use the first few words of the title and remove special characters).
3. Save the file in TXT format to a specified path on your computer.
4. Return the path of the saved file and the success status to the caller. Example:
- File path: [save path]
- Status: saved successfully

 

4. Set the Supervisor's Tool Calling Chat Model and Agent Memory. Choose an appropriate large model for your situation, as shown below:


 

5. Select an appropriate search tool for worker1 according to your environment, as shown below:


 

6. Select the appropriate file saving tool for worker3, as shown below:


 

7. The final overall configuration is shown below:


 

8. After the configuration is complete, open the chat dialog in the upper right corner and enter the keyword "large model", as shown below:


We can see the workers executing in sequence, completing the tasks we configured.

 

9. Clicking the code icon in the upper right corner shows how to call this system's API from external code:

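The exact snippet Flowise shows depends on your deployment, but as a rough sketch, a call against the standard Flowise prediction endpoint typically looks like the following; the chatflow ID and the question text are placeholders to replace with your own values:

   # If the chatflow or instance is protected, an extra Authorization header may be required
   curl -X POST http://localhost:3000/api/v1/prediction/<your-chatflow-id> \
     -H "Content-Type: application/json" \
     -d '{"question": "large model"}'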
