
Optimization and acceleration of convolutional neural networks…
Jul 1, 2022 · Three types of strategies have been explained to enhance the computational speed of CNNs at the algorithmic and implementation levels.
CNN Optimization: Strategies for Enhancing Deep Learning …
Apr 7, 2025 · Optimizing CNNs involves improving both their accuracy and speed through various techniques. Techniques like Stochastic Gradient Descent (SGD), Adam optimization, and fast convolution methods can significantly accelerate computation speed while maintaining or even improving model performance.
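As an illustrative sketch of the optimizer family the snippet names (not code from the cited article), plain SGD updates each parameter by stepping against the gradient of the loss; the quadratic loss below is a made-up toy example:

```python
import numpy as np

# Minimal SGD step on a toy quadratic loss L(w) = 0.5 * ||w||^2,
# whose gradient at w is simply w. Learning rate lr is illustrative.
def sgd_step(w, grad, lr=0.1):
    return w - lr * grad

w = np.array([1.0, -2.0])
grad = w.copy()            # gradient of 0.5*||w||^2 at w
w_new = sgd_step(w, grad)  # each coordinate shrinks by a factor of 0.9
```

Adam, mentioned in the same snippet, replaces the raw gradient with bias-corrected moment estimates but follows the same update pattern.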
Optimization Rule in Deep Neural Networks - GeeksforGeeks
Mar 3, 2025 · Unconstrained optimization plays a crucial role in the training of neural networks. Unlike constrained optimization, where the solution must satisfy certain constraints, unconstrained optimization seeks to minimize (or maximize) an objective function without any …
Hyperparameter Optimization in CNN: A Review - IEEE Xplore
Abstract: Hyperparameter optimization is an important issue in convolutional neural networks (CNNs), which are widely used deep learning models for image classification. Several classical and metaheuristic algorithms are often employed to optimize hyperparameters.
As an attempt to explain the practical success of deep CNNs and mitigate the gap between theory and practice, this work aims to provide tighter data-independent generalization error bounds and algorithmic optimization guarantees for the deep CNN models commonly used in practice.
Gradient-Sensitive Optimization for Convolutional Neural Networks
Mar 23, 2021 · Convolutional neural networks (CNNs) are effective models for image classification and recognition. Gradient descent optimization (GD) is the basic algorithm for CNN model optimization. Since GD appeared, a series of improved algorithms have been derived. Among these algorithms, adaptive moment estimation (Adam) has been widely recognized.
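The Adam update referred to above can be sketched as follows. This is a hedged sketch of the standard Adam rule (Kingma and Ba, 2015), not code from the cited paper; the hyperparameter names (lr, beta1, beta2, eps) follow common convention:

```python
import numpy as np

# One Adam step: exponential moving averages of the gradient (m) and
# squared gradient (v), bias-corrected by the step count t.
def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1 - beta1**t)                # bias correction
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# At the first step (t=1) the update magnitude is roughly lr per parameter,
# regardless of the gradient's scale.
w, m, v = adam_step(np.array([1.0]), np.array([2.0]), np.zeros(1), np.zeros(1), t=1)
```

The adaptive per-parameter step size is what distinguishes Adam from the plain gradient descent (GD) baseline the snippet mentions.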
Abstract—Optimizing hyperparameters in Convolutional Neural Networks (CNNs) is a tedious problem for many researchers and practitioners. To obtain better-performing hyperparameters, experts must manually configure a set of hyperparameter choices.
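A common automated alternative to the manual configuration described above is random search over a hyperparameter space. The sketch below is purely illustrative; the search space and the surrogate score function are made up for demonstration:

```python
import random

# Random search: sample configurations from a discrete space and keep
# the one with the highest score (e.g. validation accuracy).
def random_search(score_fn, space, n_trials=20, seed=0):
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}
        s = score_fn(cfg)
        if s > best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score

# Toy space and toy score standing in for "train a CNN and validate".
space = {"lr": [1e-1, 1e-2, 1e-3], "batch_size": [32, 64, 128]}
score = lambda c: -abs(c["lr"] - 1e-2) - abs(c["batch_size"] - 64) / 1000.0
best, best_s = random_search(score, space)
```

Metaheuristic optimizers (genetic algorithms, differential evolution, Bayesian optimization) replace the uniform sampling with guided proposals but keep the same evaluate-and-keep-best loop.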
(PDF) Optimizing Convolutional Neural Network Architectures
Sep 28, 2024 · Motivated by an interest in optimizing Machine Learning models, in this paper, we propose Optimizing Convolutional Neural Network Architectures (OCNNA). It is a novel CNN optimization and...
Survey of Optimization Algorithms in Modern Neural Networks
May 26, 2023 · In this review, we consider the existing optimization algorithms used in neural networks. We present modifications of optimization algorithms of first, second, and information-geometric order, the latter based on information geometry for Fisher–Rao and Bregman metrics.
MODE-CNN: A fast converging multi-objective optimization algorithm …
Sep 1, 2021 · In this study, we first develop an algorithm called MODE-CNN, based on the multi-objective differential evolution (MODE) algorithm for parameter optimization of CNN or CNN-based methods. MODE-CNN is then compared with …
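For context on the building block MODE-CNN extends, one standard differential-evolution (DE) step combines three population members into a mutant and crosses it with the current member. This is a hedged sketch of textbook DE, not the MODE-CNN algorithm itself; the parameter names F (scale factor) and CR (crossover rate) follow standard DE convention:

```python
import random

# One DE mutation + binomial crossover step for population member i.
# Each member is a list of real-valued parameters.
def de_trial_vector(pop, i, F=0.5, CR=0.9, rng=random):
    # Pick three distinct members other than i.
    a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
    mutant = [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]
    # Binomial crossover with the current member.
    return [m if rng.random() < CR else x for m, x in zip(mutant, pop[i])]

pop = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
trial = de_trial_vector(pop, 0, F=0.5, CR=1.0, rng=random.Random(1))
```

A multi-objective variant such as MODE keeps this step but selects survivors by Pareto dominance over several objectives (e.g. accuracy and runtime) instead of a single score.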