A Method for Forecasting the Error and Training Time of Neural Networks for Multivariate Time Series Imputation


The article presents a neural network-based method called tsGAP2, designed to predict the error and training time of neural network models used for imputing missing values in multivariate time series. The input to the method is a neural network model represented as a directed acyclic graph, where nodes correspond to layers and edges to the connections between them. The method comprises three components: an Autoencoder, which transforms the graph-based representation of the model into a compact vector form; an Encoder, which encodes the training hyperparameters and the characteristics of the computational device; and an Aggregator, which combines the vector representations to produce the prediction. The tsGAP2 model is trained with a composite loss function defined as a weighted sum of several components, each evaluating a different aspect of the model's output: the correctness of the neural network model decoded from the vector representation, the prediction of the target model's error, and the prediction of its training time. For the study, a search space of 200 different architectures was constructed, and 12,000 training runs were conducted on time series from various application domains. The experimental results show that the proposed method predicts the target model's error with high accuracy: the average error, measured with SMAPE, is 4.4 %, significantly outperforming existing approaches, whose average error is 27.6 %. The average error of training-time prediction is 8.8 %, which is also significantly better than the 61.6 % shown by existing methods.
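To make the three-component structure concrete, the minimal PyTorch sketch below mirrors the description above: an autoencoder compresses a (pre-computed) graph-level embedding of the target model into a compact vector, an encoder handles hyperparameters and device characteristics, an aggregator predicts error and training time, and a composite loss is a weighted sum of reconstruction and prediction terms. All class names, dimensions, and loss weights are illustrative assumptions; the published tsGAP2 operates on graph representations with graph neural networks and an attention mechanism, which are not shown here.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the three-component predictor described in the abstract.
# Names, sizes, and weights are assumptions, not the authors' implementation.

class GraphAutoencoder(nn.Module):
    """Compresses a graph-level embedding of the target model into a compact
    vector and reconstructs it, so a reconstruction loss can be applied."""
    def __init__(self, graph_dim=128, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(graph_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, graph_dim))

    def forward(self, g):
        z = self.encoder(g)
        return z, self.decoder(z)

class ContextEncoder(nn.Module):
    """Encodes training hyperparameters and device characteristics."""
    def __init__(self, ctx_dim=16, latent_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ctx_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))

    def forward(self, c):
        return self.net(c)

class Aggregator(nn.Module):
    """Combines the two latent vectors and predicts [error, training time]."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(2 * latent_dim, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, z_graph, z_ctx):
        return self.head(torch.cat([z_graph, z_ctx], dim=-1))

def composite_loss(g, g_rec, pred, target, w=(1.0, 1.0, 1.0)):
    """Weighted sum of reconstruction, error-prediction, and time-prediction terms."""
    mse = nn.functional.mse_loss
    return (w[0] * mse(g_rec, g)
            + w[1] * mse(pred[:, 0], target[:, 0])   # target model's error
            + w[2] * mse(pred[:, 1], target[:, 1]))  # target model's training time

# Example forward pass with random tensors standing in for real inputs.
g, c, y = torch.randn(4, 128), torch.randn(4, 16), torch.randn(4, 2)
ae, enc, agg = GraphAutoencoder(), ContextEncoder(), Aggregator()
z, g_rec = ae(g)
loss = composite_loss(g, g_rec, agg(z, enc(c)), y)
```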


Time series, missing value imputation, neural network models, autoencoder, graph neural networks, attention mechanism, performance prediction, neural architecture search

Short address: https://sciup.org/143185313

IDR: 143185313   |   UDC: 004.032.26, 004.048   |   DOI: 10.24412/2073-0667-2025-3-72-95