WWW25 Keynote | The AI Revolution in Time Series - Challenges and Opportunities, by Yan Liu from USC
(Btw, all the WWW keynotes can be found here.)

Foundation models are large, pre-trained neural networks trained on broad, diverse time series data, with the goal of supporting many downstream tasks (forecasting, anomaly detection, causal inference, generation) across different domains. It’s like GPT-4, but for time series analysis.

🔹 Core Capabilities of a Time Series Foundation Model:
- Prediction — short-term, long-term, multivariate
- Analysis — pattern discovery, representation learning
- Causal Inference — learning and modeling causal relationships over time
- Generation — synthetic time series for simulation or augmentation
- Cross-domain transfer — one model works across finance, medicine, climate, etc.

There are mainly two ways to use a foundation model: prompt-based learning vs. fine-tuning. Prompting is fast (like asking ChatGPT a question); fine-tuning can be costly but is better suited to narrow, high-accuracy needs. ...
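To make the prompt-based usage pattern concrete, here is a minimal sketch of the interface such models expose: the raw history is the "prompt" and the forecast is the "completion." The function name `zero_shot_forecast` and the seasonal-naive baseline standing in for a pretrained model are illustrative assumptions, not part of the talk or any real library.

```python
import numpy as np

def zero_shot_forecast(context: np.ndarray, horizon: int, season: int = 24) -> np.ndarray:
    """Stand-in for a foundation model's zero-shot ("prompt-based") call:
    feed in raw history, get back a forecast, no training step.
    A seasonal-naive baseline substitutes for the pretrained model here."""
    season = min(season, len(context))
    last_cycle = context[-season:]                 # repeat the most recent cycle
    reps = int(np.ceil(horizon / season))
    return np.tile(last_cycle, reps)[:horizon]

# Usage: 7 days of hourly data in, a 24-hour forecast out.
history = np.sin(np.arange(7 * 24) * 2 * np.pi / 24)
forecast = zero_shot_forecast(history, horizon=24)
print(forecast.shape)  # (24,)
```

The appeal of this pattern is that one call serves any domain; fine-tuning would instead update the model's weights on domain data before forecasting.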