In this paper, we consider a family of neural networks for solving nonlinear complementarity problems (NCP). The neural networks are constructed from merit functions based on three classes of NCP-functions: the generalized natural residual function and its two symmetrizations. We first characterize the stationary points of the induced merit functions. The growth behavior of these NCP-functions is also described, as it plays an important role in characterizing the level sets of the merit functions. In addition, the stability of the steepest descent-based neural network model for the NCP is analyzed. We provide numerical simulations to illustrate the theoretical results, and we also compare the proposed neural networks with existing neural networks based on other well-known NCP-functions. Numerical results indicate that the performance of the neural network is better when the parameter <i>p</i> associated
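For context, the underlying problem and the NCP-function family can be sketched as follows; these are the standard definitions, and the exact parameterization used in the paper may differ:

```latex
% Nonlinear complementarity problem: given F : \mathbb{R}^n \to \mathbb{R}^n,
% find x \in \mathbb{R}^n such that
%   x \ge 0, \quad F(x) \ge 0, \quad \langle x, F(x) \rangle = 0.
%
% Generalized natural residual NCP-function (p > 1 an odd integer):
\phi_{NR}^{p}(a,b) = a^{p} - (a-b)_{+}^{p},
\qquad (t)_{+} := \max\{t, 0\},
% which recovers the classical natural residual
% \phi_{NR}(a,b) = a - (a-b)_{+} = \min\{a,b\} when p = 1.
%
% Induced merit function:
\Psi(x) = \sum_{i=1}^{n} \tfrac{1}{2}\,\phi_{NR}^{p}\bigl(x_i, F_i(x)\bigr)^{2},
% so that x solves the NCP if and only if \Psi(x) = 0.
```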
1. Motivation and basic concepts In this note, we consider two signomial functions whose convexity plays an important role in several recent papers [4, 6, 7, 8, 9] dealing with geometric programming problems. However, the verifications therein contain certain flaws, and these incorrect arguments have been repeatedly reproduced and cited. From the viewpoint of scientific research, we hereby provide correct proofs for them.
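For reference, a signomial function on the positive orthant has the following standard form (a terminology sketch; the two specific functions examined in the note are not reproduced here):

```latex
% A signomial in variables x = (x_1, \dots, x_n) with each x_j > 0:
f(x) = \sum_{i=1}^{m} c_i \prod_{j=1}^{n} x_j^{a_{ij}},
\qquad c_i \in \mathbb{R}, \; a_{ij} \in \mathbb{R}.
% When every coefficient satisfies c_i > 0, f is a posynomial, the
% building block of geometric programming; signomials admit c_i < 0,
% which is why their convexity must be verified case by case.
```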
In this article, we consider the Lorentz cone complementarity problem in infinite-dimensional real Hilbert spaces. We establish several results that are standard and important when dealing with complementarity problems. These include proving that the Fischer-Burmeister merit function and the natural residual merit function have the same growth, establishing the boundedness of level sets under mild conditions via different merit functions, and providing global error bounds through the proposed merit functions. Such results are helpful for the further design of solution methods for Lorentz cone complementarity problems in Hilbert spaces.
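To fix notation, the Lorentz cone and the two merit functions mentioned above can be sketched as follows (standard definitions in a real Hilbert space $H$; the precise setting in the article may differ):

```latex
% Lorentz cone (second-order cone) in \mathcal{H} = \mathbb{R} \times H:
\mathcal{K} = \{ (r, x') \in \mathbb{R} \times H : \|x'\| \le r \}.
%
% Complementarity problem: find x \in \mathcal{K} such that
%   F(x) \in \mathcal{K}, \quad \langle x, F(x) \rangle = 0.
%
% Natural residual function, with \Pi_{\mathcal{K}} the metric
% projection onto \mathcal{K}:
\phi_{NR}(x, y) = x - \Pi_{\mathcal{K}}(x - y).
%
% Fischer--Burmeister function, with \circ the Jordan product
% associated with \mathcal{K} and x^2 = x \circ x:
\phi_{FB}(x, y) = (x^2 + y^2)^{1/2} - (x + y).
```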