
Model Context Provider CLI: a command-line tool for using MCP services with any large model, with no dependency on Claude.

General Introduction

The Model Context Provider CLI (mcp-cli) is a protocol-level command-line tool for interacting with model context provider servers. It lets users send commands, query data, and interact with the variety of resources a server provides. mcp-cli supports multiple providers and models, including OpenAI and Ollama, with default models gpt-4o-mini and qwen2.5-coder respectively. The tool requires Python 3.8 or later, and the appropriate dependencies must be installed. To use it, clone the GitHub repository and install the necessary dependencies.


Function List

  • Protocol-level communication with model context provider servers
  • Dynamic exploration of tools and resources
  • Support for multiple providers and models (OpenAI and Ollama)
  • An interactive mode that lets users execute commands dynamically
  • Supported commands: ping, list-tools, list-resources, list-prompts, chat, clear, help, quit/exit
  • Supported command-line parameters: --server, --config-file, --provider, --model (see the combined example below)
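
For example, the parameters above can be combined in a single invocation. The file name server_config.json here is an assumption for illustration; check the repository for the actual default:

   uv run main.py --server sqlite --config-file server_config.json --provider openai --model gpt-4o-mini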

 

Usage Guide

Installation process

  1. Clone the repository:
   git clone https://github.com/chrishayuk/mcp-cli
   cd mcp-cli
  2. Install uv:
   pip install uv
  3. Synchronize dependencies:
   uv sync --reinstall
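
After synchronizing, you can sanity-check the prerequisites (version checks only; exact output varies by system):

   python --version   # mcp-cli requires Python 3.8 or later
   uv --version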

Usage

  1. Start the client and interact with the SQLite server:
   uv run main.py --server sqlite
  2. Run the client with the default OpenAI provider and model (OpenAI is the default, so no extra flags are needed):
   uv run main.py --server sqlite
  3. Run the client with the Ollama provider and a specific model:
   uv run main.py --server sqlite --provider ollama --model llama3.2
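
The --config-file parameter points the client at a server configuration file. A minimal sketch of what such a file might contain is shown below; the schema (the mcpServers, command, and args fields) is assumed here by analogy with other MCP clients, so consult the configuration file bundled with the repository for the authoritative format:

   {
     "mcpServers": {
       "sqlite": {
         "command": "uvx",
         "args": ["mcp-server-sqlite", "--db-path", "test.db"]
       }
     }
   }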

Interactive mode

Enter interactive mode and interact with the server:

uv run main.py --server sqlite

In interactive mode, you can use the supported commands to interact with the server. The provider and model specified at startup are displayed on entry, for example:

Entering chat mode using provider 'ollama' and model 'llama3.2'...

Supported commands

  • ping: Check whether the server is responding
  • list-tools: Show available tools
  • list-resources: Show available resources
  • list-prompts: Show available prompts
  • chat: Enter interactive chat mode
  • clear: Clear the terminal screen
  • help: Display a list of supported commands
  • quit / exit: Exit the client
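
Commands are typed directly at the interactive prompt. An illustrative sequence follows (input only; server responses are omitted here):

   ping
   list-tools
   list-resources
   list-prompts
   chat
   exit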

Using the OpenAI Provider

To use an OpenAI model, set the OPENAI_API_KEY environment variable. It can be placed in a .env file or exported in your shell, as shown below.
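
A minimal sketch, assuming a POSIX shell; the key value is a placeholder:

   # Export in the shell for the current session:
   export OPENAI_API_KEY=sk-...

   # Or persist it in a .env file in the project root:
   OPENAI_API_KEY=sk-...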
