
Open Deep Research: LangChain's Open Source Intelligent Assistant for Deep Research

General Introduction

Open Deep Research is a web-based research assistant that generates comprehensive research reports on any topic. It follows a plan-and-execute workflow: users plan and review the report structure before moving on to the time-consuming research phase. Users can choose among different planning models, search APIs, and writing models (such as Tavily, Perplexity, Anthropic, and OpenAI) to meet their needs. Open Deep Research supports multiple iterations of reflection and search to ensure the depth and accuracy of the report, and can be deployed and used quickly through simple configuration files and command-line operations.


 

Function List

  • Provides an outline of the report structure
  • Configurable planning models (e.g. DeepSeek, OpenAI reasoning models)
  • Accepts feedback on the plan for each report section and iterates until the user is satisfied
  • Configurable search APIs (e.g. Tavily, Perplexity) and number of search queries per research iteration
  • Configurable search depth (number of iterations) per section
  • Customizable writing models (e.g. Anthropic)
  • Runs the LangGraph Studio UI locally
  • Automatically generates structured research reports
  • Supports multiple searches and reflective iterations to improve report quality
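
The reflective search-and-write loop described above can be sketched, purely illustratively, as follows. The callables `search`, `write`, and `reflect` are hypothetical stand-ins, not the library's actual API:

```python
# Illustrative sketch of the per-section research loop: search, write,
# then reflect on whether another iteration is needed.
def research_section(section, search, write, reflect, max_search_depth=2):
    draft = ""
    for _ in range(max_search_depth):
        results = search(section, draft)        # gather sources for this section
        draft = write(section, results, draft)  # write or revise the section draft
        if not reflect(draft):                  # stop once reflection is satisfied
            break
    return draft
```

The `max_search_depth` cap mirrors the configuration option of the same name: each extra iteration trades time and API calls for potentially deeper coverage.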

 

Using Help

Quick Start

  1. Ensure that the API keys for the required tools have been set.
  2. Select a web search tool (Tavily is used by default).
  3. Select a writing model (Anthropic Claude 3.5 Sonnet is used by default).
  4. Select a planning model (OpenAI o3-mini is used by default):
    • OpenAI
    • Groq
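
A minimal way to check step 1 from inside Python, assuming the environment-variable names used later in this guide (the placeholder value is obviously fake and must be replaced with real keys):

```python
import os

# These names match the export commands shown later in this guide.
REQUIRED_KEYS = ("TAVILY_API_KEY", "ANTHROPIC_API_KEY", "OPENAI_API_KEY")

# setdefault only fills in a value if the variable is not already set
# in your shell; "replace-me" is a placeholder, not a real key.
for name in REQUIRED_KEYS:
    os.environ.setdefault(name, "replace-me")

missing = [name for name in REQUIRED_KEYS if not os.environ.get(name)]
assert not missing, f"Missing API keys: {missing}"
```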

Usage

Virtual environment

  1. Create a virtual environment:
    python -m venv open_deep_research
    source open_deep_research/bin/activate
    

  2. Install the package:
    pip install open-deep-research
    

Using the Jupyter Notebook

  1. Import and compile the graph:
    from langgraph.checkpoint.memory import MemorySaver
    from open_deep_research.graph import builder
    memory = MemorySaver()
    graph = builder.compile(checkpointer=memory)
    
  2. Visualize the graph:
    from IPython.display import Image, display
    display(Image(graph.get_graph(xray=1).draw_mermaid_png()))
    
  3. Run the graph:
    import uuid

    thread = {"configurable": {
        "thread_id": str(uuid.uuid4()),
        "search_api": "tavily",
        "planner_provider": "openai",
        "planner_model": "o3-mini",
        "writer_provider": "anthropic",
        "writer_model": "claude-3-5-sonnet-latest",
        "max_search_depth": 1,
    }}
    topic = "Overview of the AI inference market with focus on Fireworks, Together.ai, Groq"
    async for event in graph.astream({"topic": topic}, thread, stream_mode="updates"):
        print(event)
        print("\n")
    
  4. After the report plan is generated, submit feedback to update it:
    from langgraph.types import Command
    async for event in graph.astream(Command(resume="Include a revenue estimate (ARR) in the sections"), thread, stream_mode="updates"):
        print(event)
        print("\n")
    
  5. When satisfied with the report plan, submit True to generate the report:
    async for event in graph.astream(Command(resume=True), thread, stream_mode="updates"):
        print(event)
        print("\n")
    

Running LangGraph Studio UI locally

  1. Clone the repository:
    git clone https://github.com/langchain-ai/open_deep_research.git
    cd open_deep_research
    
  2. Copy the example .env file to hold your API keys:
    cp .env.example .env
    
  3. Set the environment variables:
    export TAVILY_API_KEY=<your_tavily_api_key>
    export ANTHROPIC_API_KEY=<your_anthropic_api_key>
    export OPENAI_API_KEY=<your_openai_api_key>
    
  4. Start the LangGraph server:
    • Mac:
      curl -LsSf https://astral.sh/uv/install.sh | sh
      uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.11 langgraph dev
      
    • Windows:
      pip install -e .
      pip install langgraph-cli[inmem]
      langgraph dev
      
  5. Open the Studio UI:
    - 🚀 API: http://127.0.0.1:2024
    - 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
    - 📚 API Docs: http://127.0.0.1:2024/docs
    

Customized Reports

  • report_structure: Define a custom report structure (a standard research report format is used by default)
  • number_of_queries: Number of search queries generated per section (default: 2)
  • max_search_depth: Maximum search depth, i.e. reflection and search iterations per section (default: 2)
  • planner_provider: Model provider for the planning phase (default: "openai"; "groq" is also supported)
  • planner_model: Specific model used for planning (default: "o3-mini"; e.g. "deepseek-r1-distill-llama-70b" with Groq)
  • writer_model: Model used to write the report (default: "claude-3-5-sonnet-latest")
  • search_api: Search API to use (default: Tavily)
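
These options are passed through the "configurable" dict shown in the Jupyter Notebook steps. One way to assemble it is sketched below; make_thread_config is a hypothetical helper written for this guide, not part of the library, and the default values mirror the list above:

```python
import uuid

def make_thread_config(**overrides):
    """Build a thread config dict with the documented defaults,
    letting keyword arguments override individual options."""
    config = {
        "thread_id": str(uuid.uuid4()),
        "search_api": "tavily",
        "planner_provider": "openai",
        "planner_model": "o3-mini",
        "writer_model": "claude-3-5-sonnet-latest",
        "number_of_queries": 2,
        "max_search_depth": 2,
    }
    config.update(overrides)
    return {"configurable": config}

# Example: switch planning to Groq's DeepSeek distill while keeping defaults.
thread = make_thread_config(planner_provider="groq",
                            planner_model="deepseek-r1-distill-llama-70b")
```

The resulting thread dict can then be passed to graph.astream exactly as in the earlier examples.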