Connected hidden neurons (CHNNet): an artificial neural network for rapid convergence
Abstract
Although artificial neural networks are inspired by the functionalities of biological
neural networks, conventional artificial neural networks are, unlike their biological
counterparts, typically structured hierarchically, which can impede the flow of
information between neurons, as neurons in the same layer have no connections
between them. Hence, we propose a more robust model of artificial neural networks
in which the hidden neurons of the same hidden layer are interconnected, leading
to rapid convergence. Through an experimental study of the proposed model used as
fully connected layers in deep networks, we demonstrate that it achieves a
noticeably higher convergence rate than the conventional feed-forward neural
network.
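
To make the architectural idea concrete, the following is a minimal sketch, in PyTorch, of a hidden layer whose neurons are connected to one another through an additional intra-layer weight matrix. The class name `ConnectedHiddenLayer`, the choice of ReLU, and the specific way the intra-layer term is combined with the feed-forward term are assumptions made for illustration only; the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn


class ConnectedHiddenLayer(nn.Module):
    """Rough sketch of a hidden layer with interconnected hidden neurons.

    Assumption: intra-layer connectivity is modeled as one extra weight
    matrix applied to the layer's own activations.
    """

    def __init__(self, in_features: int, hidden_features: int):
        super().__init__()
        # Standard feed-forward weights from the previous layer.
        self.input_proj = nn.Linear(in_features, hidden_features)
        # Hypothetical intra-layer connections among the hidden neurons.
        self.hidden_proj = nn.Linear(hidden_features, hidden_features, bias=False)
        self.activation = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Usual feed-forward activation of the hidden layer.
        h = self.activation(self.input_proj(x))
        # Let hidden neurons exchange information within the layer
        # before the result is passed on to the next layer.
        return self.activation(self.hidden_proj(h) + h)


if __name__ == "__main__":
    layer = ConnectedHiddenLayer(in_features=8, hidden_features=16)
    out = layer(torch.randn(4, 8))
    print(out.shape)  # torch.Size([4, 16])
```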