A smoothing gradient-based neural network strategy for solving semidefinite programming problems. 2022

Asiye Nikseresht and Alireza Nazemi
Faculty of Mathematical Sciences, Shahrood University of Technology, Shahrood, Iran.

Linear semidefinite programming problems have received considerable attention because of their wide variety of applications. This paper deals with a smooth gradient neural network scheme for solving semidefinite programming problems. Using some properties of convex analysis and a merit function in matrix form, a neural network model is constructed. It is shown that the proposed neural network is asymptotically stable and converges to an exact optimal solution of the semidefinite programming problem. Numerical simulations show that the numerical behaviour is in good agreement with the theoretical results.
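The abstract does not spell out the authors' merit function or network dynamics, so the following is only a minimal sketch of the general idea: a gradient flow (a "neural network" in the dynamical-systems sense) that drives a candidate matrix toward the solution of a standard-form SDP by descending a penalty-type merit function. The merit function, the PSD-cone projection, the penalty weight rho, the Euler step size, and the toy instance are all assumptions made for illustration; they are not the construction proposed in the paper.

import numpy as np

def psd_projection(X):
    """Project a symmetric matrix onto the cone of positive semidefinite matrices."""
    w, V = np.linalg.eigh((X + X.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T

def sdp_gradient_flow(C, A_list, b, rho=100.0, step=1e-3, iters=20000):
    """Forward-Euler integration of the gradient flow dX/dt = -grad E(X) for the
    (assumed, illustrative) penalty merit function
        E(X) = <C, X> + (rho/2) * sum_i (<A_i, X> - b_i)^2
               + (rho/2) * ||X - P_PSD(X)||_F^2,
    applied to the standard-form SDP:  min <C, X>  s.t.  <A_i, X> = b_i,  X PSD."""
    n = C.shape[0]
    X = np.eye(n)                                  # arbitrary symmetric starting state
    for _ in range(iters):
        grad = np.array(C, dtype=float)            # gradient of the linear objective
        for Ai, bi in zip(A_list, b):
            # gradient of the quadratic penalty on the equality constraints
            grad = grad + rho * (np.tensordot(Ai, X) - bi) * Ai
        # gradient of (1/2) * squared distance to the PSD cone is X - P_PSD(X)
        grad = grad + rho * (X - psd_projection(X))
        X = X - step * grad                        # one Euler step of the flow
        X = (X + X.T) / 2.0                        # keep the state symmetric
    return X

# Toy instance: minimize trace(C X) subject to trace(X) = 1, X PSD.
# The exact optimum is diag(0, 1, 0), the projector onto the eigenvector of the
# smallest eigenvalue of C; most of the mass ends up on that entry.
C = np.diag([3.0, 1.0, 2.0])
X_star = sdp_gradient_flow(C, [np.eye(3)], np.array([1.0]))
print(np.round(X_star, 3))
print(np.round(psd_projection(X_star), 3))         # approximately diag(0, 1, 0)

Note that a fixed finite penalty of this kind only approximates the optimum, whereas the smoothing, merit-function-based network analyzed in the paper is shown to converge to an exact optimal solution.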

MeSH Terms

UI: D011340 | MeSH Term: Problem Solving
Description: A learning situation involving more than one alternative from which a selection is made in order to attain a specific goal.

UI: D011382 | MeSH Term: Programming, Linear
Description: A technique of operations research for solving certain kinds of problems involving many variables where a best value or set of best values is to be found. It is most likely to be feasible when the quantity to be optimized, sometimes called the objective function, can be stated as a mathematical expression in terms of the various activities within the system, and when this expression is simply proportional to the measure of the activities, i.e., is linear, and when all the restrictions are also linear. It is different from computer programming, although problems using linear programming techniques may be programmed on a computer.
Entries: Linear Programming

UI: D016571 | MeSH Term: Neural Networks, Computer
Description: A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.
Entries: Computational Neural Networks; Connectionist Models; Models, Neural Network; Neural Network Models; Neural Networks (Computer); Perceptrons; Computational Neural Network; Computer Neural Network; Computer Neural Networks; Connectionist Model; Model, Connectionist; Model, Neural Network; Models, Connectionist; Network Model, Neural; Network Models, Neural; Network, Computational Neural; Network, Computer Neural; Network, Neural (Computer); Networks, Computational Neural; Networks, Computer Neural; Networks, Neural (Computer); Neural Network (Computer); Neural Network Model; Neural Network, Computational; Neural Network, Computer; Neural Networks, Computational; Perceptron
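For context, the "Programming, Linear" entry above concerns a linear objective optimized under linear constraints; the linear semidefinite programs studied in the paper are their matrix-valued generalization. The standard forms below are textbook material, not taken from the paper.

\[
\text{LP:}\quad \min_{x \in \mathbb{R}^n} c^{\top} x \;\; \text{s.t.}\;\; Ax = b,\; x \ge 0
\qquad
\text{SDP:}\quad \min_{X \in \mathbb{S}^n} \langle C, X \rangle \;\; \text{s.t.}\;\; \langle A_i, X \rangle = b_i,\; i = 1, \dots, m,\; X \succeq 0,
\]

where \(\langle C, X \rangle = \operatorname{trace}(CX)\) and \(X \succeq 0\) means \(X\) is symmetric positive semidefinite; the LP is recovered when all the matrices involved are diagonal.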
