Recurrent neural networks for solving second-order cone programs

Chun-Hsu Ko, Jein-Shan Chen, and Ching-Yu Yang

Machine Learning mathscidoc:1910.43881

Neurocomputing, 74(17), 3646-3653, 2011.10
This paper proposes using neural networks to efficiently solve second-order cone programs (SOCPs). To establish the neural networks, the SOCP is first reformulated as a second-order cone complementarity problem (SOCCP) via the Karush-Kuhn-Tucker conditions of the SOCP. SOCCP functions, which transform the SOCCP into a set of nonlinear equations, are then utilized to design the neural networks. We propose two kinds of neural networks based on different SOCCP functions. The first neural network uses the Fischer-Burmeister function to achieve an unconstrained minimization of a merit function. We show that the merit function is a Lyapunov function and that this neural network is asymptotically stable. The second neural network utilizes the natural residual function together with the cone projection function to achieve low computational complexity. It is shown to be Lyapunov stable and to converge globally.
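
As a concrete illustration of the two SOCCP functions named in the abstract, the following NumPy sketch implements the Euclidean projection onto a single second-order cone, the natural residual function, and the Fischer-Burmeister function (with squares and square roots taken in the Jordan algebra). It is reconstructed from the standard definitions of these functions, not taken from the paper, and all function names are ours.

    import numpy as np

    def proj_soc(z):
        # Euclidean projection of z = (z1, z2) onto the second-order cone
        # K = {(x1, x2) in R x R^{n-1} : ||x2|| <= x1}.
        z1, z2 = z[0], z[1:]
        nz2 = np.linalg.norm(z2)
        if nz2 <= z1:                      # z already lies in K
            return z.copy()
        if nz2 <= -z1:                     # z lies in -K, so the projection is the origin
            return np.zeros_like(z)
        alpha = 0.5 * (z1 + nz2)           # otherwise project onto the boundary of K
        return np.concatenate(([alpha], alpha * z2 / nz2))

    def natural_residual(x, y):
        # Natural residual SOCCP function: phi_NR(x, y) = x - Pi_K(x - y).
        # phi_NR(x, y) = 0 iff x in K, y in K, and <x, y> = 0.
        return x - proj_soc(x - y)

    def jordan_sqrt(z):
        # Square root in the Jordan algebra of the SOC, via the spectral
        # decomposition z = lam1*u1 + lam2*u2 (meaningful for z in K).
        z1, z2 = z[0], z[1:]
        nz2 = np.linalg.norm(z2)
        lam1, lam2 = z1 - nz2, z1 + nz2
        w = z2 / nz2 if nz2 > 0 else np.zeros_like(z2)
        u1 = 0.5 * np.concatenate(([1.0], -w))
        u2 = 0.5 * np.concatenate(([1.0], w))
        # clamping guards against tiny negative eigenvalues caused by round-off
        return np.sqrt(max(lam1, 0.0)) * u1 + np.sqrt(max(lam2, 0.0)) * u2

    def fischer_burmeister(x, y):
        # Fischer-Burmeister SOC function: phi_FB(x, y) = (x^2 + y^2)^{1/2} - x - y,
        # where x^2 = x o x = (||x||^2, 2*x1*x2) is the Jordan product square.
        x_sq = np.concatenate(([x @ x], 2.0 * x[0] * x[1:]))
        y_sq = np.concatenate(([y @ y], 2.0 * y[0] * y[1:]))
        return jordan_sqrt(x_sq + y_sq) - x - y

    # Sanity check with a complementary pair: x and y both lie on the boundary
    # of K and <x, y> = 0, so both SOCCP functions should return the zero vector.
    x = np.array([1.0, 0.6, 0.8])
    y = np.array([1.0, -0.6, -0.8])
    print(natural_residual(x, y))      # ~ [0. 0. 0.]
    print(fischer_burmeister(x, y))    # ~ [0. 0. 0.]

The recurrent networks studied in the paper are continuous-time dynamical systems built from these functions (a gradient flow of the Fischer-Burmeister merit function, and projection-based dynamics for the natural residual); simulating them would additionally require an ODE integrator, which is omitted here.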
@article{chun-hsu2011recurrent,
  title={Recurrent neural networks for solving second-order cone programs},
  author={Chun-Hsu Ko and Jein-Shan Chen and Ching-Yu Yang},
  url={http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20191020223557660338410},
  journal={Neurocomputing},
  volume={74},
  number={17},
  pages={3646--3653},
  year={2011},
}
Chun-Hsu Ko, Jein-Shan Chen, and Ching-Yu Yang. Recurrent neural networks for solving second-order cone programs. Neurocomputing, 74(17):3646-3653, 2011. http://archive.ymsc.tsinghua.edu.cn/pacm_paperurl/20191020223557660338410.