
Curiosity: building a Perplexity-like AI search tool using LangGraph

General Introduction

Curiosity is a project for exploration and experimentation, built primarily on the LangGraph and FastHTML technology stacks, with the goal of creating a search product similar to Perplexity AI. At its core is a simple ReAct agent that uses Tavily search to enhance text generation. Curiosity supports a variety of Large Language Models (LLMs), including OpenAI's gpt-4o-mini, Groq's llama3-groq-8b-8192-tool-use-preview, and llama3.1 served locally through Ollama. The project focuses not only on the technical implementation but also invests considerable effort in the front-end design to ensure a high-quality visual and interactive experience.
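
The following is a minimal, hypothetical sketch of how such a ReAct agent with Tavily search can be wired up using LangGraph's prebuilt helpers; it is illustrative only and not taken from the Curiosity repository (the model name and tool settings are assumptions):

    # Minimal ReAct agent sketch: an LLM plus the Tavily search tool.
    # Illustrative only; not Curiosity's actual code.
    from langchain_openai import ChatOpenAI
    from langchain_community.tools.tavily_search import TavilySearchResults
    from langgraph.prebuilt import create_react_agent

    # Assumes OPENAI_API_KEY and TAVILY_API_KEY are set in the environment.
    llm = ChatOpenAI(model="gpt-4o-mini")
    search = TavilySearchResults(max_results=5)  # Tavily web-search tool

    # create_react_agent wires the model and tools into a ReAct-style graph.
    agent = create_react_agent(llm, tools=[search])

    result = agent.invoke({"messages": [("user", "What is LangGraph?")]})
    print(result["messages"][-1].content)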


Feature List

  • Built on the LangGraph and FastHTML technology stacks
  • Integrates Tavily search to enhance text generation
  • Supports multiple LLMs, including gpt-4o-mini, llama3-groq, and llama3.1
  • Provides flexible back-end switching (a sketch of what such switching might look like follows this list)
  • Front-end built with FastHTML, with support for WebSocket streaming
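
As a rough illustration of the back-end switching mentioned above, the following hypothetical helper shows how the supported models could be selected; the function name and wiring are assumptions for illustration, not Curiosity's actual API:

    # Hypothetical back-end switcher; names are illustrative, not Curiosity's API.
    from langchain_groq import ChatGroq
    from langchain_ollama import ChatOllama
    from langchain_openai import ChatOpenAI

    def get_llm(backend: str):
        """Return a chat model for the chosen back-end."""
        if backend == "openai":
            return ChatOpenAI(model="gpt-4o-mini")
        if backend == "groq":
            return ChatGroq(model="llama3-groq-8b-8192-tool-use-preview")
        if backend == "ollama":
            return ChatOllama(model="llama3.1")  # requires a local Ollama server
        raise ValueError(f"Unknown back-end: {backend}")

    llm = get_llm("openai")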

 

How to Use

Installation steps

  1. Clone the repository:
    git clone https://github.com/jank/curiosity
    
  2. Make sure you have an up-to-date Python 3 interpreter.
  3. Set up a virtual environment and install dependencies:
    python3 -m venv venv
    source venv/bin/activate
    pip install -r requirements.txt
    
  4. Create a .env file and set the following variables (an optional sanity-check sketch follows these steps):
    OPENAI_API_KEY=
    GROQ_API_KEY=
    TAVILY_API_KEY=
    LANGCHAIN_TRACING_V2=true
    LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
    LANGSMITH_API_KEY=
    LANGCHAIN_PROJECT="Curiosity"
    
  5. Run the project:
    python curiosity.py
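
For step 4, the sketch below is a small, optional sanity check that the keys in .env are visible to the process before you start the app; it assumes the python-dotenv package, which may or may not be part of requirements.txt:

    # Optional sanity check for the .env file; assumes python-dotenv is installed.
    import os
    from dotenv import load_dotenv

    load_dotenv()  # reads .env from the current directory

    required = ["OPENAI_API_KEY", "GROQ_API_KEY", "TAVILY_API_KEY", "LANGSMITH_API_KEY"]
    missing = [key for key in required if not os.getenv(key)]
    if missing:
        raise SystemExit(f"Missing keys in .env: {', '.join(missing)}")
    print("Environment looks complete; now run: python curiosity.py")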
    

Usage Guidelines

  1. Start the project: after running python curiosity.py, the project starts on a local server.
  2. Select an LLM: choose the appropriate LLM (e.g., gpt-4o-mini, llama3-groq, or llama3.1) according to your needs.
  3. Search with Tavily: enter a query in the dialog box and the ReAct agent enhances text generation with Tavily search.
  4. Front-end interaction: the front-end is built with FastHTML and supports WebSocket streaming for real-time responses (a sketch of how the streamed tokens could be produced follows this list).
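
As referenced in item 4, the following hedged sketch shows one way the streamed tokens could be produced on the server side using LangGraph's streaming API; the agent construction and streaming mode are assumptions for illustration, not the project's actual WebSocket handler:

    # Hedged sketch of token streaming; not Curiosity's actual front-end code.
    from langchain_community.tools.tavily_search import TavilySearchResults
    from langchain_openai import ChatOpenAI
    from langgraph.prebuilt import create_react_agent

    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"),
                               tools=[TavilySearchResults(max_results=5)])

    # stream_mode="messages" yields (message_chunk, metadata) pairs as tokens
    # are generated; a WebSocket handler would forward each chunk to the browser.
    for chunk, metadata in agent.stream(
            {"messages": [("user", "What is Perplexity AI?")]},
            stream_mode="messages"):
        if getattr(chunk, "content", ""):
            print(chunk.content, end="", flush=True)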

Common Problems

  • How to switch LLMs: configure the appropriate API key in the .env file and select the desired LLM when starting the project.
  • WebSocket issues: if WebSocket connections close for no apparent reason, check your network connection and server configuration.