
TimesFM 2.0: Google Open Sources Pre-Trained Model for Time Series Forecasting

General Introduction

TimesFM 2.0 - 500M PyTorch is a pre-trained time series foundation model developed by Google Research for time series forecasting. The model handles context lengths of up to 2048 time points and supports arbitrary forecast horizons. TimesFM 2.0 performs well on several leading benchmarks, improving on its predecessor by 25%. The model also provides 10 experimental quantile heads, although these heads are not calibrated after pre-training. Users can download the model from the Hugging Face platform and use it for time series forecasting.

The model can be used in scenarios such as forecasting retail sales, stock movements, and website traffic. TimesFM 2.0 ranks first on the GIFT-Eval leaderboard and supports fine-tuning on your own data. It performs univariate time series forecasting with context lengths of up to 2048 time points and arbitrary forecast horizons, taking an optional frequency indicator as input.
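
To make this concrete, here is a minimal quick-start sketch using the official timesfm Python package (installation is covered in the Using Help section below). The hyperparameter values follow those published for the 2.0 - 500M checkpoint, and the sine-wave input is a placeholder for real data:

    import numpy as np
    import timesfm

    # Load the TimesFM 2.0 - 500M PyTorch checkpoint from Hugging Face.
    # num_layers=50, use_positional_embedding=False and context_len=2048 are
    # the values published for this checkpoint; set backend="gpu" if available.
    model = timesfm.TimesFm(
        hparams=timesfm.TimesFmHparams(
            backend="cpu",
            per_core_batch_size=32,
            horizon_len=128,
            num_layers=50,
            use_positional_embedding=False,
            context_len=2048,
        ),
        checkpoint=timesfm.TimesFmCheckpoint(
            huggingface_repo_id="google/timesfm-2.0-500m-pytorch"),
    )

    # One univariate series (placeholder data) plus the optional frequency
    # indicator: 0 = high frequency (e.g. daily), 1 = medium, 2 = low.
    forecast_input = [np.sin(np.linspace(0, 20, 512))]
    frequency_input = [0]

    point_forecast, quantile_forecast = model.forecast(
        forecast_input,
        freq=frequency_input,
    )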


Function List

  • Time series forecasting: supports context lengths of up to 2048 time points and arbitrary forecast horizons.
  • Quantile forecasting: provides 10 experimental quantile heads (uncalibrated after pre-training); see the sketch after this list.
  • Model fine-tuning: supports fine-tuning the model on your own data.
  • Zero-shot covariate support: supports zero-shot forecasting with external regressors.
  • High performance: performs strongly on multiple benchmarks, with a 25% improvement over its predecessor.
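
Continuing from the quick-start sketch above (reusing model, point_forecast and quantile_forecast), the snippet below inspects the two outputs. The layout of the quantile output (mean plus 10 quantile heads stacked along the last axis) follows the timesfm README and should be verified against your installed version:

    # Mean forecast per input series: shape (num_series, horizon).
    print(point_forecast.shape)

    # Experimental, uncalibrated quantile heads stacked with the mean along
    # the last axis: shape (num_series, horizon, 1 + 10).
    print(quantile_forecast.shape)

    # Quantile tracks for the first series, dropping the mean slot.
    quantile_tracks = quantile_forecast[0, :, 1:]
    print(quantile_tracks.shape)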

 

Using Help

Installation process

  1. Install the dependencies:
    • Use pyenv and poetry for a local installation.
    • Clone the timesfm repository (https://github.com/google-research/timesfm) and run the commands below from its root directory.
    • Make sure the Python version is 3.10.x (for the JAX version) or >=3.11.x (for the PyTorch version).
    • Run the following commands to install the dependencies:
      pyenv install 3.11.x
      pyenv virtualenv 3.11.x timesfm-env
      pyenv activate timesfm-env
      poetry install
    
  2. Download the model:
    • Visit the Hugging Face platform to download the TimesFM 2.0 - 500M PyTorch model checkpoint.
    • Use the following commands to download the checkpoint (a Python-based alternative is sketched right after this list):
      git clone https://huggingface.co/google/timesfm-2.0-500m-pytorch
      cd timesfm-2.0-500m-pytorch
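
As an optional alternative to git, the checkpoint can also be fetched from Python with the huggingface_hub library; note that the timesfm package used in the steps below can also download the checkpoint automatically when given the repo id, so this is a convenience rather than a requirement:

    from huggingface_hub import snapshot_download

    # Download the checkpoint files into the local Hugging Face cache and
    # print the path they were stored under.
    local_path = snapshot_download(repo_id="google/timesfm-2.0-500m-pytorch")
    print(local_path)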

Usage Process

  1. Load the model:
    • Load the checkpoint in Python through the timesfm package, using the same call as in the quick-start sketch earlier in this article:
      import timesfm
      hparams = timesfm.TimesFmHparams(num_layers=50, use_positional_embedding=False, context_len=2048)
      checkpoint = timesfm.TimesFmCheckpoint(huggingface_repo_id="google/timesfm-2.0-500m-pytorch")
      model = timesfm.TimesFm(hparams=hparams, checkpoint=checkpoint)
    
  2. Run a forecast:
    • Prepare the input data and call forecast (a DataFrame-based variant is sketched after this list):
      import numpy as np

      forecast_input = [np.sin(np.linspace(0, 20, 100))]  # replace with your actual time series
      frequency_input = [0]  # optional frequency indicator: 0 = high, 1 = medium, 2 = low frequency
      point_forecast, quantile_forecast = model.forecast(forecast_input, freq=frequency_input)
    
  3. Fine-tune the model:
    • Fine-tune the model on your own data (the timesfm repository provides fine-tuning examples). The Trainer-based sketch below assumes the checkpoint has been wrapped in a Hugging Face transformers-compatible model object; hf_model and your_dataset are placeholders:
      from transformers import Trainer, TrainingArguments

      training_args = TrainingArguments(output_dir="./results", num_train_epochs=3, per_device_train_batch_size=4)
      trainer = Trainer(model=hf_model, args=training_args, train_dataset=your_dataset)  # hf_model: placeholder wrapper
      trainer.train()
    
  4. Use external regressors:
    • Zero-shot covariate support is exposed through forecast_with_covariates; the argument names below follow the covariates notebook in the timesfm repository and should be checked against your installed version:
      external_regressors = {"temperature": [[...]]}  # replace with actual regressor values, one list per input series
      cov_forecast, xreg_forecast = model.forecast_with_covariates(
          inputs=forecast_input, dynamic_numerical_covariates=external_regressors, freq=frequency_input)
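
For the retail-sales and web-traffic style scenarios mentioned in the introduction, the timesfm package also exposes a DataFrame-based entry point, forecast_on_df, on top of forecast. The sketch below assumes a long-format DataFrame with unique_id, ds and y columns and reuses the model object from step 1; the column names and freq string follow the timesfm README and should be checked against your installed version:

    import pandas as pd

    # A tiny long-format example: one series id, daily timestamps, placeholder values.
    df = pd.DataFrame({
        "unique_id": ["site_traffic"] * 60,
        "ds": pd.date_range("2024-01-01", periods=60, freq="D"),
        "y": range(60),
    })

    forecast_df = model.forecast_on_df(
        inputs=df,
        freq="D",        # pandas-style frequency of the series
        value_name="y",  # column containing the values to forecast
        num_jobs=-1,
    )
    print(forecast_df.head())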

