Neural networks : the official journal of the International Neural Network Society
-
In this paper, we study global uniform asymptotic fixed deviation stability and stability for a wide class of memristive neural networks with time-varying delays. First, a new mathematical expression of the generic memductance (memristance) is proposed according to the features of the memristor and its general current-voltage characteristic, and a new class of neural networks is designed. Next, a new concept of stability (fixed deviation stability) is proposed in order to accurately describe the stability characteristics of the discontinuous system, and sufficient conditions are given to guarantee the global uniform asymptotic fixed deviation stability and stability of the new system. Finally, two numerical examples are provided to show the applicability and effectiveness of our main results.
-
Deep learning has received significant attention recently as a promising solution to many problems in the area of artificial intelligence. Among several deep learning architectures, convolutional neural networks (CNNs) demonstrate performance superior to other machine learning methods in the applications of object detection and recognition. We use a CNN for image enhancement and the detection of driving lanes on motorways. ⋯ A conventional extreme learning machine (ELM) can be applied only to networks with a single hidden layer; as such, we propose a stacked ELM architecture in the CNN framework. Further, we modify the backpropagation algorithm to find the targets of hidden layers and effectively learn network weights while maintaining performance. Experimental results confirm that the proposed method is effective in reducing learning time and improving performance.
-
The success of myoelectric pattern recognition (M-PR) mostly relies on the features extracted and the classifier employed. This paper proposes and evaluates a fast classifier, the extreme learning machine (ELM), to classify individual and combined finger movements in amputees and non-amputees. ELM is a single hidden layer feed-forward network (SLFN) that avoids iterative learning by determining input weights randomly and output weights analytically. ⋯ The experimental results show that the most accurate ELM classifier is the radial basis function ELM (RBF-ELM). A comparison of RBF-ELM with other well-known classifiers shows that RBF-ELM is as accurate as SVM and LS-SVM but faster than the SVM family, and it is superior to LDA and kNN. The experimental results also indicate that the accuracy gap of the M-PR between amputees and non-amputees is small: 98.55% on amputees versus 99.5% on non-amputees using six electromyography (EMG) channels.
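The ELM training scheme described above (input weights set randomly, output weights determined analytically in a single least-squares step) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' RBF-ELM implementation; the tanh hidden layer, hidden-unit count, and function names are assumptions.

```python
import numpy as np

def elm_train(X, Y, n_hidden=30, seed=None):
    """Train a single-hidden-layer ELM: random input weights, analytic output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never updated)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never updated)
    H = np.tanh(X @ W + b)                           # hidden-layer activation matrix
    beta = np.linalg.pinv(H) @ Y                     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass: hidden activations times the learned output weights."""
    return np.tanh(X @ W + b) @ beta
```

Because only the output weights `beta` are solved for, in one pseudoinverse step, training avoids the iterative weight updates of backpropagation, which is the source of the speed advantage noted in the abstract.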
-
Automatic Speaker Recognition (ASR) and related issues are continuously evolving as inseparable elements of Human Computer Interaction (HCI). With the assimilation of emerging concepts like big data and the Internet of Things (IoT) as extended elements of HCI, ASR techniques are passing through a paradigm shift. Of late, learning-based techniques have started to receive greater attention from research communities related to ASR, owing to the fact that they possess a natural ability to mimic biological behavior and thereby aid ASR modeling and processing. ⋯ By using an ANN in feed-forward (FF) form as a feature extractor, the performance of the system is evaluated and a comparison is made. Experimental results show that the application of big data samples has enhanced the learning of the ASR system. Further, the ANN-based sample and feature extraction techniques are found to be efficient enough to enable the application of ML techniques to big data aspects of ASR systems.
-
We investigated the organization of a recurrent network under ongoing synaptic plasticity using a model of neural oscillators coupled by dynamic synapses. In this model, the coupling weights changed dynamically, depending on the timing between the oscillators. ⋯ Heterogeneous layered clusters with different frequencies emerged from homogeneous populations as the Fourier zero mode increased. Our findings may provide new insights into the self-assembly mechanisms of neural networks related to synaptic plasticity.