AutoCTS Series Overview

AutoCTS is a series of automated Neural Architecture Search (NAS) frameworks designed for Correlated Time Series (CTS) forecasting. The AutoCTS series progressively addresses the following challenges in NAS for CTS forecasting.

  1. Lack of a search space for CTS: Existing NAS methods lack a search space tailored to CTS forecasting, and naively including all S/T-operators results in an excessively large search space. Additionally, most existing NAS methods focus on identifying a single optimal cell while assuming a fixed overall topology.
  2. Lack of a joint search space for hyperparameters and architectures: Existing automated CTS forecasting methods often rely on predefined hyperparameter settings, which can lead to suboptimal performance.
  3. Lack of zero-shot search for unseen tasks: Existing automated CTS forecasting methods rely on fully supervised, task-specific models, which makes it costly to run a new search for every dataset and forecasting setting.
  4. Lack of automated design of the search space: The search space is still designed manually, which may yield suboptimal performance and runs counter to the goal of AutoML, namely automating the entire process.

AutoCTS

AutoCTS proposes an innovative search space designed specifically for CTS forecasting tasks, featuring both a variety of ST-blocks and diverse topological structures.

  • The search space of AutoCTS consists of a micro search space and a macro search space. The micro search space constructs ST-blocks by combining temporal and spatial operators, while the macro search space determines the topological connections between ST-blocks.
  • AutoCTS employs a bi-level optimization algorithm for end-to-end training of the macro-DAG and multiple heterogeneous micro-DAGs, where the search itself focuses on optimizing the architecture parameters (see the sketch after this list).
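
The sketch below illustrates the micro search space idea in a DARTS-like form, assuming a continuous relaxation over candidate operators: each edge of a tiny micro-DAG mixes candidate operators with softmax-weighted architecture parameters. The operator set, channel sizes, and topology are illustrative stand-ins, not AutoCTS's actual S/T-operators; in the bi-level procedure the `alphas` would typically be updated on validation data while the operator weights are updated on training data.

```python
# Hypothetical sketch: a micro-DAG edge mixes candidate S/T-operators with
# softmax-weighted architecture parameters (DARTS-style continuous relaxation).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative candidate operators; the actual S/T-operator set differs.
CANDIDATE_OPS = {
    "identity": lambda c: nn.Identity(),
    "conv_1x1": lambda c: nn.Conv1d(c, c, kernel_size=1),
    "tconv_3":  lambda c: nn.Conv1d(c, c, kernel_size=3, padding=1),  # temporal-op stand-in
}

class MixedOp(nn.Module):
    """One edge of a micro-DAG: weighted sum over candidate operators."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(build(channels) for build in CANDIDATE_OPS.values())

    def forward(self, x, alpha):
        weights = F.softmax(alpha, dim=-1)          # architecture parameters -> mixing weights
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class STBlock(nn.Module):
    """Tiny micro-DAG with two sequential mixed edges (illustrative topology)."""
    def __init__(self, channels, n_edges=2):
        super().__init__()
        self.edges = nn.ModuleList(MixedOp(channels) for _ in range(n_edges))
        # one alpha vector per edge; these are the searchable architecture parameters
        self.alphas = nn.Parameter(torch.zeros(n_edges, len(CANDIDATE_OPS)))

    def forward(self, x):
        for edge, alpha in zip(self.edges, self.alphas):
            x = edge(x, alpha)
        return x

x = torch.randn(8, 16, 12)                          # (batch, channels, time steps)
block = STBlock(channels=16)
print(block(x).shape)                               # torch.Size([8, 16, 12])
```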

AutoCTS+

With the architecture search space of AutoCTS established, AutoCTS+ further implements a joint search space for hyperparameters and neural architectures, and modifies the optimization strategy to reduce both time and memory consumption.

  • Building on the AutoCTS architecture search space, AutoCTS+ designs a joint search space for architectures and hyperparameters, using a dual graph encoding to represent both in a unified way.
  • Instead of gradient-based optimization, AutoCTS+ adopts a comparator-based approach: it collects <model, performance> metadata using approximate metrics and trains a comparator to rank candidates, significantly reducing time consumption (a minimal sketch follows this list).
  • AutoCTS+ employs transfer learning to minimize time cost when transferring to unseen datasets.
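
Below is a minimal sketch of the comparator idea, under the assumption that each candidate has already been encoded as a fixed-length vector (random vectors stand in for the dual graph encodings, and the MLP and dimensions are illustrative): the comparator is trained on <model, performance> metadata to predict which of two candidates performs better, and at search time candidates are ranked by comparator preferences rather than trained with gradients.

```python
# Minimal sketch (illustrative, not AutoCTS+'s exact comparator): a pairwise comparator
# trained on <model, performance> metadata to predict which of two candidates is better.
import random
import torch
import torch.nn as nn

class Comparator(nn.Module):
    def __init__(self, enc_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * enc_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, enc_a, enc_b):
        # probability that candidate A outperforms candidate B
        return torch.sigmoid(self.net(torch.cat([enc_a, enc_b], dim=-1)))

comparator = Comparator()
# <model, performance> metadata: (candidate encoding, validation error from a cheap proxy)
meta = [(torch.randn(32), float(err)) for err in torch.rand(100)]

opt = torch.optim.Adam(comparator.parameters(), lr=1e-3)
for _ in range(200):                                    # train on random candidate pairs
    (ea, pa), (eb, pb) = random.sample(meta, 2)
    label = torch.tensor([1.0 if pa < pb else 0.0])     # lower error => A is better
    loss = nn.functional.binary_cross_entropy(comparator(ea, eb), label)
    opt.zero_grad()
    loss.backward()
    opt.step()
# At search time, unseen candidates are ranked by how often the comparator prefers them.
```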

AutoCTS++

While AutoCTS+ uses transfer learning to reduce the cost of adapting to unseen datasets, it still lacks a zero-cost transfer mechanism. AutoCTS++ proposes a task-aware module that achieves zero-cost generalization to unseen tasks.

  • AutoCTS++ features a novel Task-aware Architecture-Hyperparameter Comparator (T-AHC) that encodes task-specific representations. Pre-trained on upstream tasks, it learns which architectures suit which tasks, enabling zero-cost generalization to downstream tasks (sketched below).
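
The sketch below assumes a task can be summarized as a fixed-length embedding (produced by some task encoder; the dimensions and MLP are illustrative stand-ins for T-AHC). Conditioning the comparator on the task representation is what lets a single pre-trained comparator rank candidates for an unseen task without further training.

```python
# Hypothetical sketch of a task-aware comparator: candidate encodings are concatenated
# with a task representation so one pre-trained comparator can rank architectures for
# unseen tasks without task-specific training. Dimensions are illustrative.
import torch
import torch.nn as nn

class TaskAwareComparator(nn.Module):
    def __init__(self, enc_dim=32, task_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * enc_dim + task_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, enc_a, enc_b, task_emb):
        x = torch.cat([enc_a, enc_b, task_emb], dim=-1)
        return torch.sigmoid(self.net(x))           # P(candidate A beats B on this task)

# Zero-cost transfer to an unseen task: embed the new task and rank candidates directly
# with the pre-trained comparator, without any additional training.
comparator = TaskAwareComparator()
task_emb = torch.randn(16)                          # stand-in for the task encoder's output
score = comparator(torch.randn(32), torch.randn(32), task_emb)
print(score.item())
```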

FACTS: Fully Automated Correlated Time Series Forecasting

Building on AutoCTS++, FACTS further automates the design of the search space itself and accelerates the training of the architectures found for downstream tasks, yielding a fully automated architecture search process that completes in minutes.

  • FACTS automatically designs the search space for the task at hand, gradually refining it through an iterative pruning process (see the pruning sketch after this list); this replaces manually defined rules and increases the potential to discover better architectures.
  • FACTS introduces a novel parameter inheritance mechanism that lets architectures discovered for downstream tasks inherit parameters from previously trained, similar architectures (see the inheritance sketch after this list), thereby accelerating and improving training on those tasks.
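
A rough sketch of the iterative pruning idea, with hypothetical operator names and a random stand-in for the proxy scoring that a real system would use: candidates are sampled from the current operator pool, scored cheaply, and the operator whose candidates score worst on average is removed each round.

```python
# Hypothetical sketch of iterative search-space pruning: sample candidate ST-blocks from
# the current operator pool, score them with a cheap proxy, and drop the operator whose
# candidates score worst on average. Operator names and the proxy are illustrative.
import random
from statistics import mean

operators = {"gcn", "informer_attn", "dilated_conv", "lstm", "identity", "skip"}

def proxy_score(candidate):
    # stand-in for a fast proxy evaluation (e.g. a few training steps or a zero-cost metric)
    return random.random()

for _ in range(3):                                   # a few pruning rounds
    candidates = [random.sample(sorted(operators), 3) for _ in range(200)]
    scores = [proxy_score(c) for c in candidates]
    avg = {
        op: mean(s for c, s in zip(candidates, scores) if op in c)
        for op in operators
    }
    worst = min(avg, key=avg.get)                    # assume higher proxy score is better
    operators.discard(worst)
    print(f"pruned {worst}; remaining: {sorted(operators)}")
```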
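
A minimal sketch of parameter inheritance, assuming inheritance is keyed on matching parameter names and shapes (the actual matching criterion in FACTS may differ): the new architecture copies every compatible tensor from a previously trained donor and trains only the remaining parameters from scratch.

```python
# Hypothetical sketch of parameter inheritance: a newly discovered architecture copies
# weights from a previously trained, similar architecture wherever parameter names and
# shapes match, and trains only the rest from scratch.
import torch
import torch.nn as nn

def inherit_parameters(new_model: nn.Module, donor_state: dict) -> int:
    """Copy compatible tensors from a donor state_dict; return how many were inherited."""
    own_state = new_model.state_dict()
    inherited = {
        name: tensor for name, tensor in donor_state.items()
        if name in own_state and own_state[name].shape == tensor.shape
    }
    own_state.update(inherited)
    new_model.load_state_dict(own_state)
    return len(inherited)

# Example: the new architecture shares its first layer with the donor but has a wider head.
donor = nn.Sequential(nn.Linear(12, 32), nn.ReLU(), nn.Linear(32, 1))
new = nn.Sequential(nn.Linear(12, 32), nn.ReLU(), nn.Linear(32, 8))
n = inherit_parameters(new, donor.state_dict())
print(f"inherited {n} parameter tensors")            # the shared Linear(12, 32) weight and bias
```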