Assessment of the applicability of graph neural networks for extending numerical optimization methods

Free access

The article surveys existing scientific studies on modern graph neural networks, such as the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), GraphSAGE, and Physics-Informed Neural Networks (PINN), as well as numerical optimization methods, including derivative-based computations, gradient descent, stochastic gradient descent, Newton's method, Adam, AdamW, AdaGrad, particle swarm optimization, and the quasi-Newton L-BFGS method. Various architectural approaches to modeling these neural networks are considered, such as fully connected neural networks (FCNN), convolutional neural networks (CNN), and neural networks based on the Deep Operator Network (DeepONet) architecture. Their advantages and disadvantages are discussed, along with their fields of application, such as recommendation systems and combinatorial optimization problems. Additionally, key aspects of the numerical optimization methods used to train the developed models are identified, and their strengths, weaknesses, and variations aimed at improving the quality of specific methods are highlighted. The aim of this study is to clarify, systematize, and analyze the existing scientific literature in the selected areas in order to determine whether existing numerical optimization methods can be rethought and advanced using models based on graph neural networks.

Keywords: PINN, Physics-Informed Neural Network, Graph Attention Network, graph neural networks with attention mechanisms, GCN, Graph Convolutional Network, graph convolutional neural network, GraphSAGE, numerical optimization, gradient

Short address: https://sciup.org/148331170

IDR: 148331170   |   DOI: 10.18137/RNU.V9187.25.02.P.33

Scientific article