
Deploying FastGPT on a NAS: a knowledge base question answering system based on an LLM (large language model)

Author: Panda is not a cat QAQ

This content comes from the What's Worth Buying (SMZDM) app; the views expressed are the author's own.

preface

FastGPT is a knowledge base question answering system built on LLMs (large language models). It provides out-of-the-box data processing and model invocation capabilities, and supports visual workflow orchestration through Flow, making complex Q&A scenarios possible. The project was shared by a group member who deployed an instance himself; it feels pretty good to use, and there is also an official trial version. If you are interested, take a look: https://fastgpt.run/

deploy

FastGPT provides a docker-compose file, so we can deploy it quickly using the official project files. The official documentation offers two variants: a non-host-network version and a host-network version. Choose whichever suits your setup.

[Image: FastGPT]

I chose the non-host version here. You only need to check the ports and change them to ones that do not conflict on your machine. The compose file is as follows:

# Non-host version, does not use a local proxy
version: '3.3'
services:
  pg:
    image: ankane/pgvector:v0.4.2 # git
    # image: registry.cn-hangzhou.aliyuncs.com/fastgpt/pgvector:v0.4.2 # Alibaba Cloud
    container_name: pg
    restart: always
    ports: # do not expose in production environments
      - 5432:5432
    networks:
      - fastgpt
    environment:
      # This configuration only takes effect on the first run. Changing it later and restarting the container has no effect; the persisted data must be deleted first.
      - POSTGRES_USER=username
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=postgres
    volumes:
      - ./pg/data:/var/lib/postgresql/data
  mongo:
    image: mongo:5.0.18
    # image: registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # Alibaba Cloud
    container_name: mongo
    restart: always
    ports: # do not expose in production environments
      - 27017:27017
    networks:
      - fastgpt
    environment:
      # This configuration only takes effect on the first run. Changing it later and restarting the container has no effect; the persisted data must be deleted first.
      - MONGO_INITDB_ROOT_USERNAME=username
      - MONGO_INITDB_ROOT_PASSWORD=password
    volumes:
      - ./mongo/data:/data/db
  fastgpt:
    container_name: fastgpt
    # image: c121914yu/fast-gpt:latest # Docker Hub
    image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:latest # Alibaba Cloud
    ports:
      - 3000:3000
    networks:
      - fastgpt
    depends_on:
      - mongo
      - pg
    restart: always
    environment:
      # root password; the login username is root
      - DEFAULT_ROOT_PSW=1234
      # API relay address; leave as is if you use the official OpenAI endpoint
      - OPENAI_BASE_URL=https://api.openai.com/v1
      - CHAT_API_KEY=sk-xxxx
      - DB_MAX_LINK=5 # maximum number of database connections
      - TOKEN_KEY=any
      - ROOT_KEY=root_key
      # MongoDB configuration; no changes needed
      - MONGODB_URI=mongodb://username:password@mongo:27017/?authSource=admin
      - MONGODB_NAME=fastgpt
      # PostgreSQL configuration
      - PG_HOST=pg
      - PG_PORT=5432
      - PG_USER=username
      - PG_PASSWORD=password
      - PG_DB_NAME=postgres
networks:
  fastgpt:

The compose file references a few host directories (./pg/data and ./mongo/data for the persistent data), so we need to create them in advance. You can create them according to the paths written in the docker-compose file, or simply copy my layout.
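If you prefer to prepare the directories from the command line, here is a minimal sketch; it assumes everything lives in a folder named fastgpt in your home directory on the NAS (a hypothetical path, adjust to taste):

mkdir -p ~/fastgpt/pg/data      # PostgreSQL (pgvector) data directory
mkdir -p ~/fastgpt/mongo/data   # MongoDB data directory
# Save the docker-compose.yml shown above into ~/fastgpt as well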

[Image: directory tree]

Finally, enable Synology's SSH service, connect to the NAS with an SSH client, cd into the fastGPT directory, and run docker-compose up -d to start the project (watch the file encoding: save it as UTF-8). The commands are sketched below.
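A minimal sketch of that session, assuming the NAS answers at 192.168.1.100 (a placeholder IP, substitute your own) and the files sit in ~/fastgpt:

ssh your-user@192.168.1.100      # hypothetical user and IP; use your own
cd ~/fastgpt                     # the directory containing docker-compose.yml
sudo docker-compose up -d        # pull the images and start pg, mongo and fastgpt in the background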

[Image: starting the project]

After the project starts successfully, output like the following appears, and the three containers show up in Docker's UI.

[Image: the project started successfully]
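If you would rather check from the same SSH session instead of the Docker UI, something along these lines should work (a sketch, not output captured from my NAS):

sudo docker-compose ps           # should list pg, mongo and fastgpt in the "Up" state
sudo docker logs -f fastgpt      # optionally follow the fastgpt container's startup log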

experience

Enter http://<NAS IP>:<port> in your browser to see the project interface, then click Login in the upper right corner.

[Image: main interface]
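If the page does not load, a quick sanity check from the SSH session (a sketch, assuming curl is available on the NAS; 3000 is the port mapped in the compose file above):

curl -I http://localhost:3000    # any HTTP response headers here mean FastGPT is listening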

If you have not changed the default configuration, the username is root and the password is 1234.

[Image: login example]

An OpenAI key needs to be configured before FastGPT can really be used. It can also run in simple mode, but that uses the basic library, which is not as smart as GPT-3.5 or GPT-4.0.
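If you only fill in the key later, one way to apply it (a sketch, assuming you edit docker-compose.yml in place) is to update the two environment variables from the compose file above and recreate the container:

# In docker-compose.yml, under the fastgpt service:
#   - OPENAI_BASE_URL=https://api.openai.com/v1   # or your relay address
#   - CHAT_API_KEY=sk-xxxx                        # replace with your real key
# Then recreate just the fastgpt container so the new values take effect:
sudo docker-compose up -d fastgpt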

[Image: experience]

As for the other features, you can explore them yourself; Panda is not that interested in GPT-style tools.

summary

The tool itself is good, but you have to build the knowledge base yourself, and while GPT's fees look very low, with daily use you will find the balance disappears surprisingly fast, almost before you notice. So I still stick with Bing AI!

That is all for this issue. Original content is not easy to produce, so feel free to like and bookmark, and I hope to earn your follow. See you next time!

