General Introduction
Open Deep Research is a web-based research assistant that generates comprehensive research reports on any topic. The system uses a plan-and-execute workflow that lets users review and refine the report structure before the time-consuming research phase begins. Users can choose among different planning models, writing models, and search APIs, such as Tavily, Perplexity, Anthropic, and OpenAI, to meet individual needs. Open Deep Research supports multiple iterations of reflection and search to ensure the depth and accuracy of the report, and can be deployed and used quickly through simple configuration files and command-line operations.
Function List
- Generate an outline of the report structure
- Set the planning model (e.g. DeepSeek, OpenAI reasoning models)
- Give feedback on the plan for each report section and iterate until satisfied
- Set the search API (e.g. Tavily, Perplexity) and the number of searches per research iteration
- Set the search depth (number of iterations) for each section
- Customize the writing model (e.g. Anthropic)
- Run the LangGraph Studio UI locally
- Automatically generate structured research reports
- Support multiple searches and reflective iterations to improve report quality
Usage Instructions
Quick Start
- Ensure that the API keys for the required tools have been set.
- Select a web search tool (Tavily is used by default):
  - Tavily API
  - Perplexity API
- Select a writing model (Anthropic Claude 3.5 Sonnet is used by default).
- Select a planning model (OpenAI o3-mini is used by default):
  - OpenAI
  - Groq
Usage
Virtual environment
- Create and activate a virtual environment:

```shell
python -m venv open_deep_research
source open_deep_research/bin/activate
```

- Install the package:

```shell
pip install open-deep-research
```
Using the Jupyter Notebook
- Import and compile the graph:

```python
from langgraph.checkpoint.memory import MemorySaver
from open_deep_research.graph import builder

memory = MemorySaver()
graph = builder.compile(checkpointer=memory)
```
- View the graph:

```python
from IPython.display import Image, display

display(Image(graph.get_graph(xray=1).draw_mermaid_png()))
```
- Run the graph:

```python
import uuid

thread = {"configurable": {
    "thread_id": str(uuid.uuid4()),
    "search_api": "tavily",
    "planner_provider": "openai",
    "planner_model": "o3-mini",
    "writer_provider": "anthropic",
    "writer_model": "claude-3-5-sonnet-latest",
    "max_search_depth": 1,
}}

topic = "Overview of the AI inference market with focus on Fireworks, Together.ai, Groq"

async for event in graph.astream({"topic": topic}, thread, stream_mode="updates"):
    print(event)
    print("\n")
```
- After the report plan is generated, submit feedback to update it:

```python
from langgraph.types import Command

async for event in graph.astream(
    Command(resume="Include a revenue estimate (ARR) in the sections"),
    thread,
    stream_mode="updates",
):
    print(event)
    print("\n")
```
- When satisfied with the report plan, submit `True` to generate the report:

```python
async for event in graph.astream(Command(resume=True), thread, stream_mode="updates"):
    print(event)
    print("\n")
```
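The snippets above rely on the notebook's support for top-level `await`. In a plain Python script, the same streaming loop can be wrapped in a coroutine and driven with `asyncio.run` — a minimal sketch, assuming `graph`, `topic`, and `thread` are defined as above (the helper name `run_report` is illustrative, not part of the library):

```python
import asyncio

async def run_report(graph, topic, thread):
    # Stream graph updates and print each event as it arrives.
    async for event in graph.astream({"topic": topic}, thread, stream_mode="updates"):
        print(event)
        print()

# In a script: asyncio.run(run_report(graph, topic, thread))
```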
Running LangGraph Studio UI locally
- Clone the repository:

```shell
git clone https://github.com/langchain-ai/open_deep_research.git
cd open_deep_research
```
- Copy the example file, then edit `.env` to set your API keys:

```shell
cp .env.example .env
```
- Set environment variables:

```shell
export TAVILY_API_KEY=<your_tavily_api_key>
export ANTHROPIC_API_KEY=<your_anthropic_api_key>
export OPENAI_API_KEY=<your_openai_api_key>
```
- Start the LangGraph server:
  - Mac:

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.11 langgraph dev
```

  - Windows:

```shell
pip install -e .
pip install langgraph-cli[inmem]
langgraph dev
```
- Open the Studio UI:
  - 🚀 API: http://127.0.0.1:2024
  - 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
  - 📚 API Docs: http://127.0.0.1:2024/docs
Customized Reports
- `report_structure`: define a custom report structure (a standard research report format is used by default)
- `number_of_queries`: number of search queries generated per section (default: 2)
- `max_search_depth`: maximum search depth (default: 2)
- `planner_provider`: model provider for the planning phase (default: "openai"; "groq" is also available)
- `planner_model`: specific model used for planning (default: "o3-mini"; e.g. "deepseek-r1-distill-llama-70b")
- `writer_model`: model used to write the report (default: "claude-3-5-sonnet-latest")
- `search_api`: search API used (default: Tavily)
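These options are passed through the same `configurable` dict used in the Jupyter example. A sketch of a customized configuration; all values below are illustrative, not recommendations:

```python
import uuid

# Illustrative configuration combining the customization options above.
thread = {"configurable": {
    "thread_id": str(uuid.uuid4()),
    "report_structure": "Introduction, key findings, conclusion",  # example custom outline
    "number_of_queries": 3,   # search queries generated per section
    "max_search_depth": 2,    # reflection/search iterations per section
    "planner_provider": "groq",
    "planner_model": "deepseek-r1-distill-llama-70b",
    "writer_model": "claude-3-5-sonnet-latest",
    "search_api": "tavily",
}}
```

The dict would then be passed as the second argument to `graph.astream(...)`, as shown in the Jupyter Notebook section.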