Welcome to the Tsinghua Sanya International Mathematics Forum


Learning spatial and temporal dependence in the age of AI


Conference No.: M260103

Dates: 2026-01-05 ~ 2026-01-09


Organizers: Oliver Linton, Yundong Tu, Yuhong Yang, Qiwei Yao

About the Conference

    Abstract

    Data with temporal and spatial dependencies are ubiquitous in every domain. The age of AI has introduced unprecedented challenges, as well as opportunities, for research on statistical learning with dependent data. While ideas from machine learning and deep neural networks (DNNs)/AI have demonstrated remarkable empirical performance across a range of tasks, there is huge potential to apply these ideas, together with novel approaches, to learning dependence, especially beyond linear correlations, in high-dimensional settings, with complex data structures, or under nonstationary scenarios. On the other hand, there is an urgent need to understand when, how, and why DNNs and various learning techniques work for dependent data, as results on understanding DNNs with independent data are starting to emerge.

    This workshop aims to draw energy and inspiration from recent developments in machine learning (ML), including DNNs, to tackle important challenges in complex time series analysis and spatial statistics, and conversely to use tools for analyzing dependent data to aid ML. It will gather leading scholars as well as young researchers in this broad area to reflect on and exchange new advances in the analysis of complex dependent data, with a balanced focus on methodology, theory, and algorithms. The key topics include (i) modelling complex time series, including tensor processes, dynamic networks, and spatial-temporal processes; (ii) nonlinear PCA/ICA for time series; (iii) nonstationary spatial processes and spatial cointegration; (iv) change-point/anomaly detection in time and/or over space; and (v) applications of tools for analyzing dependent data in ML/AI.

    The workshop will feature oral presentations, poster sessions, and topic/panel discussions.


    Description of the Aim

    The workshop will cover five inter-correlated areas under the umbrella of learning dependence in the age of AI.

    1. Modelling complex time series including tensor processes, dynamic networks, and spatial-temporal processes

    Modelling complex time series involving tensor processes, dynamic networks, and spatial-temporal processes presents multifaceted challenges. Tensor processes, with their multi-dimensional nature, not only increase computational costs and the risk of overfitting but also pose interpretability issues. Dynamic networks require capturing their constantly evolving structure, yet the driving factors are often complex and unobservable, and scaling models for large-scale data is computationally demanding. Spatial-temporal processes need to account for nonlinear and variable spatial and temporal dependencies, while also contending with data sparsity that can bias results. Integrating these components into a unified framework is arduous, as their complex interactions can lead to excessive model complexity, necessitating a delicate balance between accuracy and computational feasibility.
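    To make the parameter-economy point concrete, the sketch below simulates a first-order matrix (order-2 tensor) autoregression X_t = A X_{t-1} B' + E_t, a common low-parameter model for matrix-valued time series. The dimensions, coefficient matrices, and noise scale are illustrative assumptions, not values from the workshop description.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, T = 3, 4, 200                  # rows, columns, series length

# Spectral radii below 1 keep the simulated process stable.
A = 0.5 * np.eye(p)
B = 0.6 * np.eye(q)

X = np.zeros((T, p, q))
for t in range(1, T):
    X[t] = A @ X[t - 1] @ B.T + 0.1 * rng.standard_normal((p, q))

# The bilinear form uses p*p + q*q parameters instead of (p*q)**2 for a
# fully vectorized VAR(1), illustrating how tensor structure curbs the
# overfitting risk mentioned above.
print(p * p + q * q, (p * q) ** 2)   # 25 vs 144
```

    Vectorizing each matrix observation and fitting an unrestricted VAR(1) would already need 144 coefficients at these tiny dimensions, which is why structured tensor models matter.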

    2. Nonlinear PCA/ICA for time series

    While principal/independent component analysis (PCA/ICA) is effective at transforming a multi- or high-dimensional data analysis problem into a number of univariate problems, its application to time series or spatial data is less than ideal, as the dependence across different times or over space is ignored. Progress has been made in extending PCA to time series, in the sense that one can derive a contemporaneous linear transformation for a p-variate time series such that the transformed series is segmented into several lower-dimensional subseries, and those subseries are uncorrelated with each other both contemporaneously and serially. Those lower-dimensional series can therefore be analyzed separately as far as the linear dynamic structure is concerned. However, such an approach ignores dependence beyond linear correlation. The DNN architecture known as the autoencoder can be viewed as a nonlinear generalization of PCA. However, it is not suitable for time series PCA/ICA for three reasons: (i) the transformation it induces is not necessarily invertible, (ii) it does not require the 'components' to be uncorrelated or independent, and (iii) it ignores the dependence across different lags. The workshop will provide a forum to explore particular classes of invertible DNNs (such as Real NVP) for nonlinear PCA/ICA for time series.
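    The invertibility point can be sketched with a single Real NVP-style affine coupling layer. The "scale" and "shift" networks are replaced here by fixed random linear maps for brevity; everything in this snippet is an illustrative assumption, not the workshop's proposed method.

```python
import numpy as np

rng = np.random.default_rng(1)
d, half = 4, 2                       # split a 4-dim vector into two halves
W_s = 0.1 * rng.standard_normal((half, half))   # stand-in "scale" network
W_t = 0.1 * rng.standard_normal((half, half))   # stand-in "shift" network

def forward(x):
    x1, x2 = x[:half], x[half:]
    s, t = W_s @ x1, W_t @ x1        # s, t depend only on the first half
    return np.concatenate([x1, x2 * np.exp(s) + t])

def inverse(y):
    y1, y2 = y[:half], y[half:]
    s, t = W_s @ y1, W_t @ y1        # recomputable because y1 equals x1
    return np.concatenate([y1, (y2 - t) * np.exp(-s)])

x = rng.standard_normal(d)
assert np.allclose(inverse(forward(x)), x)   # exact invertibility
```

    Because the first half passes through unchanged, the inverse can recompute s and t exactly, so invertibility holds by construction regardless of how complicated the scale/shift networks are, unlike a generic autoencoder.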

    3. Nonstationary spatial processes and spatial cointegration

    Spatial data analysis has become increasingly relevant in fields such as economics, environmental sciences, epidemiology, and the social sciences. Most real-world spatial processes are nonstationary. Simultaneously, spatial cointegration has emerged as an important concept when dealing with nonstationary spatial processes. Cointegration, in a time series context, refers to a situation where two or more nonstationary series are linked by a long-term equilibrium relationship. When extended to spatial data, spatial cointegration explores whether spatial units (such as geographic regions or neighboring locations) exhibit long-term relationships that account for spatial dependencies and common trends. There are two approaches to dealing with nonstationarity: (i) econometric spatial autoregression, in which the nonstationarities are reflected in weight matrices, and (ii) intrinsic processes, which can be viewed as a generalization of differenced time series to spatial processes sampled over irregular grids. Both the inference methods and the associated theory for cointegration under the two settings have yet to be developed.
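    A toy illustration of the classical time-series cointegration concept defined above: two random walks driven by a common stochastic trend, so one linear combination of them is stationary. The coefficient 2.0 and the noise scales are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 5000
trend = np.cumsum(rng.standard_normal(T))     # shared nonstationary trend
x = trend + 0.3 * rng.standard_normal(T)
y = 2.0 * trend + 0.3 * rng.standard_normal(T)

# x and y individually wander (their sample variance grows with T), but
# the cointegrating combination y - 2x is just stationary noise.
resid = y - 2.0 * x
```

    In practice the cointegrating coefficient would be estimated (e.g. by least squares) and the residual tested for stationarity; carrying this logic over to spatial processes sampled on irregular grids is precisely the open problem described above.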

    4. Change-point/anomaly detection in time and/or over space

    Ideas from machine learning and AI have demonstrated remarkable empirical performance across a range of tasks. There is huge potential to use these ideas with novel approaches to detect change-points/anomalies, albeit with a simultaneous need to better understand theoretically why and when these methods work reliably at detecting and localizing change-points/anomalies. Deep learning offers an effective framework for understanding the characteristics of “normal” data, creating a baseline for detecting change points. These methods adapt to various types of latent structure in the data, thanks to their ability to learn complex patterns and representations. Another area to be explored in the workshop is transfer learning, which allows models pre-trained on large datasets to be fine-tuned for specific change-point/anomaly detection tasks. This is particularly suited to condition-monitoring applications, where one can learn a model from data across many similar machines/sites and then refine it with much more limited data specific to each individual machine/site.
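    As a classical baseline for this theme, the snippet below implements a CUSUM-type statistic for a single change in mean (the deep-learning detectors discussed above are not sketched here). The simulated jump location and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.concatenate([rng.standard_normal(100),         # mean 0 before t = 100
                    3.0 + rng.standard_normal(100)])  # mean 3 afterwards
n = len(x)

# CUSUM statistic |S_k - (k/n) S_n| scanned over candidate split points k;
# its maximizer estimates the change point.
S = np.cumsum(x)
k = np.arange(1, n)
cusum = np.abs(S[:-1] - k / n * S[-1])
est = int(np.argmax(cusum)) + 1      # estimated change point, near 100
```

    For a single mean shift this maximizer is a consistent estimator; detecting multiple changes, changes over space, or changes in richer latent structure is where the learned representations discussed above come in.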


    5. Applications of tools of analyzing dependent data in ML/AI

    Recent developments in AI intend to take advantage of dependencies. Examples include attention weights in transformers capturing dependence between tokens, and CNNs and RNNs leveraging structured dependencies for better modelling performance. In recent studies on AI model protection, dependent perturbations are used to prevent an adversary from stealing an AI model in service.
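    The attention example cited above can be sketched as scaled dot-product attention in a few lines. The sequence length, embedding dimension, and random inputs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
T, d = 5, 8                        # sequence length, embedding dimension
Q = rng.standard_normal((T, d))    # queries
K = rng.standard_normal((T, d))    # keys
V = rng.standard_normal((T, d))    # values

scores = Q @ K.T / np.sqrt(d)      # pairwise token-to-token dependence
# Row-wise softmax turns scores into attention weights.
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
out = weights @ V                  # each output mixes all tokens
```

    Each row of `weights` is a probability distribution over tokens, so the model's learned dependence structure is directly readable off the weight matrix, which is one reason tools for analyzing dependent data are natural aids for ML/AI.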

Organizers

Oliver Linton, University of Cambridge
Yundong Tu, Peking University
Yuhong Yang, Tsinghua University
Qiwei Yao, London School of Economics and Political Science

