General Introduction
Minima is an open-source project that provides a fully local Retrieval-Augmented Generation (RAG) tool. Users can chat with their local files through Minima, which supports two modes: a fully local installation and a custom GPT mode. The project is deployed with Docker and supports indexing and querying a wide range of file formats, including PDF, XLS, DOCX, TXT, MD, and CSV. Minima is released under the Mozilla Public License v2.0 (MPLv2), which guarantees users the freedom to use and modify the code.
Feature List
- Local document chat: Chat with local files via the fully local installation.
- Custom GPT mode: Query local files using a custom GPT.
- Multi-file format support: Supports indexing and querying of PDF, XLS, DOCX, TXT, MD and CSV files.
- Docker Deployment: Rapid deployment and management via Docker.
- Environment variable configuration: Configure environment variables through the .env file to flexibly set file paths and model parameters.
- Recursive indexing: Recursively indexes all subfolders and files within the specified folder.
Usage Guide
Installation Process
- Clone the project: Clone the Minima repository from GitHub to your local machine.
git clone https://github.com/dmayboroda/minima.git
cd minima
- Configure environment variables: Create a .env file in the project root, copy the contents of .env.sample into it, and set the relevant variables.
cp .env.sample .env
Variables to be configured include:
- LOCAL_FILES_PATH: Specifies the path to the folder to be indexed.
- EMBEDDING_MODEL_ID: Specifies the embedding model to use.
- EMBEDDING_SIZE: Sets the embedding dimension.
- START_INDEXING: Set to true on the initial startup to begin indexing.
- USER_ID and PASSWORD: User credentials for the custom GPT mode.
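As an illustration, a .env for the fully local installation might look like the sketch below; the model ID, embedding size, path, and credentials are placeholder values, so take the actual options from .env.sample.
```
# Folder to index (indexed recursively, including subfolders)
LOCAL_FILES_PATH=/path/to/your/documents
# Placeholder embedding model; use one supported by your setup
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2
# Embedding dimension; must match the chosen model (768 for the model above)
EMBEDDING_SIZE=768
# Set to true on the first run so the index gets built
START_INDEXING=true
# Only required for the custom GPT mode
USER_ID=you@example.com
PASSWORD=change-me
```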
- Start the Docker container:
- Fully local installation:
docker compose -f docker-compose-ollama.yml --env-file .env up --build
- Custom GPT mode:
docker compose --env-file .env up --build
Usage Process
- Local document chat:
- Connect to the local server:
ws://localhost:8003/llm/
- Start a conversation with your local files to query their contents (see the client sketch after this list).
- Custom GPT mode:
- After starting the Docker container, copy the one-time password (OTP) from the terminal output and use it to authenticate with Minima GPT.
- Ask a question, and Minima will answer it based on the contents of your local files.
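As a rough sketch of the query flow in the fully local mode: the snippet below connects to the ws://localhost:8003/llm/ endpoint with the Python websockets library and sends a question as plain text. The payload format is an assumption here, so adjust it to whatever the Minima server actually expects.
```python
# Minimal sketch of querying the local Minima server over WebSocket.
# Assumption: the endpoint accepts a plain-text question and replies with
# text; check the project documentation for the exact message format.
import asyncio
import websockets

async def ask(question: str) -> None:
    async with websockets.connect("ws://localhost:8003/llm/") as ws:
        await ws.send(question)   # send the query
        answer = await ws.recv()  # wait for the generated answer
        print(answer)

if __name__ == "__main__":
    asyncio.run(ask("What do my notes say about the project deadline?"))
```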
Feature Details
- Document indexing: After the container starts, Minima automatically indexes all files in the specified folder. Indexing is recursive, so files in all subfolders are indexed as well.
- Querying document content: Connect to the local server over WebSocket and send a query; Minima returns an answer based on the contents of the indexed files.
- Environment variable configuration: Adjust Minima's behavior by editing the variables in the .env file, for example to change the file path or the embedding model.