
Ollama + LangGraph: A Locally Deployed Academic Research Report Generation Assistant

General Introduction

Ollama Deep Researcher is a fully local web research and report generation assistant developed by the LangChain team. It works with any Large Language Model (LLM) hosted by Ollama: the user enters a research topic, and the tool automatically generates web search queries, gathers information, summarizes the content, and produces a Markdown report with sources. The model runs entirely on your machine, so no external LLM API is ever called, which protects privacy and incurs no additional cost. It supports search tools such as DuckDuckGo, Tavily, and Perplexity, and the number of research cycles is customizable, making it suitable for anyone who needs in-depth research and structured reports. It is easy to install, open source, and free.


Feature List

  • Local language model: runs a local LLM through Ollama, with no external API required.
  • Automatic search query generation: produces accurate web search terms from the user's topic.
  • Web information gathering: supports DuckDuckGo (default), Tavily, or Perplexity searches.
  • Content summarization and refinement: analyzes search results, identifies gaps, and improves the summary.
  • Markdown report generation: outputs a structured report with all source citations.
  • Customizable research depth: the user sets the number of cycles to control the level of detail.
  • Visualized workflow: view each step of the run in LangGraph Studio (a sketch of the underlying loop follows this list).
  • Multi-model support: compatible with DeepSeek R1, Llama 3.2, and other models.
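
Conceptually, the tool runs a small LangGraph state machine that loops through query generation, web search, and summarization until the configured depth is reached. The sketch below is illustrative only and is not the project's actual source code: the node bodies are stubs, and the loop limit is hard-coded where the real tool reads MAX_WEB_RESEARCH_LOOPS.

  from typing import TypedDict

  from langgraph.graph import StateGraph, START, END

  class ResearchState(TypedDict):
      research_topic: str   # the topic the user enters
      search_query: str     # the generated web search query
      running_summary: str  # the evolving report draft
      loop_count: int       # completed research cycles

  def generate_query(state: ResearchState) -> dict:
      # In the real tool, the local LLM turns the topic into a search query.
      return {"search_query": f"latest findings on {state['research_topic']}"}

  def web_search(state: ResearchState) -> dict:
      # Placeholder for a DuckDuckGo / Tavily / Perplexity call.
      return {"running_summary": state.get("running_summary", "") + " ...results..."}

  def reflect(state: ResearchState) -> dict:
      # The LLM spots gaps in the summary; here we just count the cycle.
      return {"loop_count": state.get("loop_count", 0) + 1}

  def route(state: ResearchState) -> str:
      # Loop until the configured depth (MAX_WEB_RESEARCH_LOOPS) is reached.
      return "generate_query" if state["loop_count"] < 3 else END

  builder = StateGraph(ResearchState)
  builder.add_node("generate_query", generate_query)
  builder.add_node("web_search", web_search)
  builder.add_node("reflect", reflect)
  builder.add_edge(START, "generate_query")
  builder.add_edge("generate_query", "web_search")
  builder.add_edge("web_search", "reflect")
  builder.add_conditional_edges("reflect", route)
  graph = builder.compile()

  print(graph.invoke({"research_topic": "neural network optimization", "loop_count": 0}))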

 

Usage Guide

Installation Process

Ollama Deep Researcher runs in your local environment. Below are detailed steps for Mac and Windows users.

Mac users

  1. Install Ollama
    • Visit the official Ollama website and download the Mac installer.
    • After installation, run ollama --version in the terminal to check the version.
  2. Pull a model
    • In the terminal, run ollama pull deepseek-r1:8b to download the recommended model.
    • Alternatively, run ollama pull llama3.2.
  3. Clone the project
    • Run the following commands:
      git clone https://github.com/langchain-ai/ollama-deep-researcher.git
      cd ollama-deep-researcher
      
  4. Create a virtual environment (recommended)
    • Make sure Python 3.9+ is installed, then run:
      python -m venv .venv
      source .venv/bin/activate
      
  5. Install dependencies and start
    • Run:
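      # Install the uv package manager: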
      curl -LsSf https://astral.sh/uv/install.sh | sh
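      # Use uvx to run the LangGraph CLI (with the project installed editable) and start the dev server under Python 3.11: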
      uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.11 langgraph dev
      
    • Once launched, LangGraph Studio opens in the browser (default address: http://127.0.0.1:2024).

Windows users

  1. Install Ollama
    • Download the Windows installer from the Ollama website.
    • After installation, run ollama --version on the command line to verify.
  2. Pull a model
    • Run ollama pull deepseek-r1:8b.
  3. Clone the project
    • Run:
      git clone https://github.com/langchain-ai/ollama-deep-researcher.git
      cd ollama-deep-researcher
      
  4. Create a virtual environment
    • Install Python 3.11 (with "Add to PATH" checked), then run:
      python -m venv .venv
      .venv\Scripts\Activate.ps1
      
  5. Install dependencies and start
    • Run:
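      # Install the project in editable mode and the LangGraph CLI, then start the dev server: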
      pip install -e .
      pip install -U "langgraph-cli[inmem]"
      langgraph dev
      
    • After launch, visit http://127.0.0.1:2024.

Configure a search tool (optional)

  • DuckDuckGo is used by default and no API key is required.
  • If you use Tavily or Perplexity:
    1. Copy .env.example to .env.
    2. Edit .env and add your key(s):
      TAVILY_API_KEY=your_tavily_key
      PERPLEXITY_API_KEY=your_perplexity_key
      
    3. Optional configuration:
      • OLLAMA_BASE_URL (default http://localhost:11434).
      • MAX_WEB_RESEARCH_LOOPS (default 3).
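
Putting these together, a complete .env might look like the following (placeholder values; set only the key for the search tool you actually use):

      TAVILY_API_KEY=your_tavily_key
      PERPLEXITY_API_KEY=your_perplexity_key
      OLLAMA_BASE_URL=http://localhost:11434
      MAX_WEB_RESEARCH_LOOPS=3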

How to use

  1. Open LangGraph Studio
    • After starting the service, visit http://127.0.0.1:2024.
    • The interface is divided into left and right columns: configuration on the left, and input and results on the right.
  2. Configure parameters
    • Search tool: choose DuckDuckGo, Tavily, or Perplexity.
    • Model: enter the name of a downloaded model (e.g. deepseek-r1:8b).
    • Number of cycles: sets the research depth; the default is 3.
    • Save the configuration.
  3. Enter a topic
    • Type a research topic, such as "The Future of Machine Learning", into the input box on the right.
    • Click "Run" to start the research.
  4. View the process and results
    • Studio displays each step: query generation, searching, summarizing, and so on.
    • When finished, the Markdown report is saved in the project's graph state. You can also drive the same run from Python, as shown below.
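
The Studio UI is the intended workflow, but the langgraph dev server also exposes an API, so the same run can be driven from Python with the langgraph_sdk client. This sketch makes assumptions worth checking against the repo: that the graph is registered in langgraph.json under the name "ollama_deep_researcher", and that the input and output state keys are "research_topic" and "running_summary".

  from langgraph_sdk import get_sync_client

  # Connect to the local langgraph dev server started above.
  client = get_sync_client(url="http://127.0.0.1:2024")

  # Stateless run: no pre-created thread; block until the graph finishes.
  final_state = client.runs.wait(
      None,
      "ollama_deep_researcher",  # assumed graph name; check langgraph.json
      input={"research_topic": "The Future of Machine Learning"},
  )

  # "running_summary" is the assumed key holding the finished report.
  print(final_state.get("running_summary", final_state))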

Working with Key Features

  • Adjusting research depth
    • Change MAX_WEB_RESEARCH_LOOPS in the configuration. Setting it to 5 makes the results more comprehensive but more time-consuming.
  • Switching search tools
    • DuckDuckGo is free but returns limited results; Tavily is more detailed (key required). Restart the service after switching.
  • Checking model compatibility
    • An error such as KeyError: 'query' indicates that the model does not support JSON output. Switch to DeepSeek R1 (8B) or Llama 3.2. A quick check is sketched below.
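
To test a model's JSON capability directly, you can call Ollama's documented /api/generate endpoint with "format": "json" and check that the reply parses. This is a generic check, not part of the project; the model name is whichever one you pulled.

  import json

  import requests

  # Ask the local Ollama server for JSON-formatted output.
  resp = requests.post(
      "http://localhost:11434/api/generate",
      json={
          "model": "deepseek-r1:8b",  # whichever model you pulled
          "prompt": 'Reply with a JSON object containing a single key "query".',
          "format": "json",           # Ollama's structured-output switch
          "stream": False,
      },
      timeout=300,
  )
  reply = resp.json()["response"]
  json.loads(reply)  # raises ValueError if the model cannot produce valid JSON
  print(reply)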

Caveats

  • Hardware requirements: the 8B model needs 8 GB of memory; a 13B model needs 16 GB.
  • Browser compatibility: Firefox is recommended; Safari may show security warnings.
  • Troubleshooting: if a run gets stuck, check the terminal logs; you may need to update dependencies or switch models.

 

Application Scenarios

  1. Academic research
    • Scenario: a student enters "neural network optimization methods"; the tool searches and generates a report with citations, saving time spent gathering sources.
  2. Industry analysis
    • Scenario: enter "AI Market Trends 2025"; the tool provides a detailed summary to support decision making.
  3. Technical learning
    • Scenario: a developer enters "Python Asynchronous Programming"; the tool generates a tutorial-style report for self-study.

 

FAQ

  1. Do I need an internet connection?
    • The local model does not require one, but web searches do. Offline, only existing data can be used.
  2. Does it support Chinese?
    • Yes. Enter a Chinese topic and the tool generates results in Chinese, though quality varies with the search tool.
  3. Can the report be modified?
    • Yes; the Markdown file can be edited directly.
  4. What should I do about a JSON error?
    • It means the model does not support structured output. Retry with DeepSeek R1 (8B) or Llama 3.2.