Monday 23 December 2013

Neural Networks, Volume 50, Pages 1-182, February 2014

Neural Networks Letters

1. Existence and global exponential stability of periodic solution for high-order discrete-time BAM neural networks 
Pages: 98-109
Author(s): Ancai Zhang, Jianlong Qiu, Jinhua She

2. Cellular computational networks—A scalable architecture for learning the dynamics of large networked systems 
Pages: 120-123
Author(s): Bipul Luitel, Ganesh Kumar Venayagamoorthy

Cognitive Science

3. Supervised orthogonal discriminant subspace projects learning for face recognition  
Pages: 33-46
Author(s): Yu Chen, Xiao-Hong Xu

Learning Systems

4. Direct Kernel Perceptron (DKP): Ultra-fast kernel ELM-based classification with non-iterative closed-form weight calculation 
Pages: 60-71
Author(s): Manuel Fernández-Delgado, Eva Cernadas, Senén Barro, Jorge Ribeiro, José Neves

5. Batch gradient method with smoothing L1/2 regularization for training of feedforward neural networks
Pages: 72-78
Author(s): Wei Wu, Qinwei Fan, Jacek M. Zurada, Jian Wang, Dakun Yang, Yan Liu

6. Compressed classification learning with Markov chain samples 
Pages: 90-97
Author(s): Feilong Cao, Tenghui Dai, Yongquan Zhang, Yuanpeng Tan

7. Semi-supervised learning of class balance under class-prior change by distribution matching  
Pages: 110-119
Author(s): Marthinus Christoffel du Plessis, Masashi Sugiyama

8. Robust support vector machine-trained fuzzy system  
Pages: 154-165
Author(s): Yahya Forghani, Hadi Sadoghi Yazdi

9. Large-scale linear nonparallel support vector machine solver  
Pages: 166-174
Author(s): Yingjie Tian, Yuan Ping

10. Finite time convergent learning law for continuous neural networks  
Pages: 175-182
Author(s): Isaac Chairez
   

Mathematical and Computational Analysis

11. A Bayesian inverse solution using independent component analysis  
Pages: 47-59
Author(s): Jouni Puuronen, Aapo Hyvärinen
   
12. A one-layer recurrent neural network for constrained nonsmooth invex optimization  
Pages: 79-89
Author(s): Guocheng Li, Zheng Yan, Jun Wang
   
13. Pointwise probability reinforcements for robust statistical inference  
Pages: 124-141
Author(s): Benoît Frénay, Michel Verleysen
   
14. A linear recurrent kernel online learning algorithm with sparse updates  
Pages: 142-153
Author(s): Haijin Fan, Qing Song

Engineering and Applications

15. Correcting and combining time series forecasters  
Pages: 1-11
Author(s): Paulo Renato A. Firmino, Paulo S.G. de Mattos Neto, Tiago A.E. Ferreira

16. Hybrid fault diagnosis of nonlinear systems using neural parameter estimators 
Pages: 12-32
Author(s): E. Sobhani-Tehrani, H.A. Talebi, K. Khorasani

Friday 6 December 2013

Neural Networks, Volume 48, Pages 1-208, December 2013

Neural Networks Letters

1. Is mutual information adequate for feature selection in regression?  
Pages: 1-7
Author(s): Benoît Frénay, Gauthier Doquire, Michel Verleysen
   
2. Neural architecture design based on extreme learning machine  
Pages: 19-24
Author(s): Andrés Bueno-Crespo, Pedro J. García-Laencina, José-Luis Sancho-Gómez
   
3. Comments on the “No-Prop” algorithm  
Pages: 59-60
Author(s): Meng-Hiot Lim

4. Exponential stabilization of delayed recurrent neural networks: A state estimation based approach  
Pages: 153-157
Author(s): He Huang, Tingwen Huang, Xiaoping Chen, Chunjiang Qian

Neuroscience

5. A model of task-specific focal dystonia  
Pages: 25-31
Author(s): Eckart Altenmüller, Dieter Müller

6. Global exponential synchronization of memristor-based recurrent neural networks with time-varying delays  
Pages: 195-203
Author(s): Shiping Wen, Gang Bao, Zhigang Zeng, Yiran Chen, Tingwen Huang
   

Learning Systems

7. An efficient matrix bi-factorization alternative optimization method for low-rank matrix recovery and completion 
Pages: 8-18
Author(s): Yuanyuan Liu, L.C. Jiao, Fanhua Shang, Fei Yin, F. Liu

8. Analysis of programming properties and the row–column generation method for 1-norm support vector machines 
Pages: 32-43
Author(s): Li Zhang, WeiDa Zhou

9. Fully corrective boosting with arbitrary loss and regularization  
Pages: 44-58
Author(s): Chunhua Shen, Hanxi Li, Anton van den Hengel
   
10. Fixed-final-time optimal control of nonlinear systems with terminal constraints  
Pages: 61-71
Author(s): Ali Heydari, S.N. Balakrishnan
   
11. Solving graph data issues using a layered architecture approach with applications to web spam detection
Pages: 78-90
Author(s): Franco Scarselli, Ah Chung Tsoi, Markus Hagenbuchner, Lucia Di Noi
   
12. Fuzzy rough sets, and a granular neural network for unsupervised feature selection 
Pages: 91-108
Author(s): Avatharam Ganivada, Shubhra Sankar Ray, Sankar K. Pal
   
13. Categorization and decision-making in a neurobiologically plausible spiking network using a STDP-like learning rule  
Pages: 109-124
Author(s): Michael Beyeler, Nikil D. Dutt, Jeffrey L. Krichmar
   
14. 1-norm support vector novelty detection and its sparseness  
Pages: 125-132
Author(s): Li Zhang, WeiDa Zhou
   
15. On the construction of the relevance vector machine based on Bayesian Ying-Yang harmony learning  
Pages: 173-179
Author(s): Dansong Cheng, Minh Nhut Nguyen, Junbin Gao, Daming Shi
   

Mathematical and Computational Analysis

16. Multivariate neural network operators with sigmoidal activation functions 
Pages: 72-77
Author(s): Danilo Costarelli, Renato Spigler
   
17. Self-Organizing Hidden Markov Model Map (SOHMMM) 
Pages: 133-147
Author(s): Christos Ferles, Andreas Stafylopatis
   
18. The breaking of a delayed ring neural network contributes to stability: The rule and exceptions 
Pages: 148-152
Author(s): T.N. Khokhlova, M.M. Kipnis
   
19. Global exponential dissipativity and stabilization of memristor-based recurrent neural networks with time-varying delays  
Pages: 158-172
Author(s): Zhenyuan Guo, Jun Wang, Zheng Yan
   
20. Dynamical behaviors for discontinuous and delayed neural networks in the framework of Filippov differential inclusions  
Pages: 180-194
Author(s): Lihong Huang, Zuowei Cai, Lingling Zhang, Lian Duan
   

Letter to the Editor

21. Reply to the Comments on the “No-Prop” algorithm  
Pages: 204
Author(s): Bernard Widrow
   
22. A comment on “α-stability and α-synchronization for fractional-order neural networks”
Pages: 207-208
Author(s): Li Kexue, Peng Jigen, Gao Jinghuai
   

Erratum

23. Erratum to “Comments on the ‘No-Prop’ algorithm” [Neural Netw. 48 (2013) 59–60]  
Pages: 205
Author(s): Meng-Hiot Lim
   

Corrigendum

24. Corrigendum to “Noise enhanced clustering and competitive learning algorithms” [Neural Netw. 37 (2013) 132–140]  
Pages: 206
Author(s): Osonde Osoba, Bart Kosko

Tuesday 17 September 2013

Neural Networks, Volume 47, Pages 1-150, November 2013

1. Computation in the Cerebellum  
Author(s): Dieter Jaeger, Henrik Jörntell, Mitsuo Kawato
Pages: 1-2

2. The importance of stochastic signaling processes in the induction of long-term synaptic plasticity  
Author(s): Erik De Schutter
Pages: 3-10

3. Dendritic calcium signaling in cerebellar Purkinje cell  
Author(s): Kazuo Kitamura, Masanobu Kano
Pages: 11-17

4. Bistability in Purkinje neurons: Ups and downs in cerebellar research  
Author(s): Jordan D.T. Engbers, Fernando R. Fernandez, Ray W. Turner
Pages: 18-31

5. Mechanisms producing time course of cerebellar long-term depression  
Author(s): Taegon Kim, Keiko Tanaka-Yamamoto
Pages: 32-35

6. Cerebellar LTD vs. motor learning—Lessons learned from studying GluD2  
Author(s): Michisuke Yuzaki
Pages: 36-41

7. Adaptive coupling of inferior olive neurons in cerebellar learning  
Author(s): Isao T. Tokuda, Huu Hoang, Nicolas Schweighofer, Mitsuo Kawato
Pages: 42-50

8. Solution to the inverse problem of estimating gap-junctional and inhibitory conductance in inferior olive neurons from spike trains by network model simulation  
Author(s): Miho Onizuka, Huu Hoang, Mitsuo Kawato, Isao T. Tokuda, Nicolas Schweighofer, Yuichi Katori, Kazuyuki Aihara, Eric J. Lang, Keisuke Toyama
Pages: 51-63

9. Nucleo-olivary inhibition balances the interaction between the reactive and adaptive layers in motor control  
Author(s): Ivan Herreros, Paul F.M.J. Verschure
Pages: 64-71

10. Transfer of memory trace of cerebellum-dependent motor learning in human prism adaptation: A model study  
Author(s): Soichi Nagao, Takeru Honda, Tadashi Yamazaki
Pages: 72-80
   
11. Classical conditioning of motor responses: What is the learning mechanism?  
Author(s): Germund Hesslow, Dan-Anders Jirenhed, Anders Rasmussen, Fredrik Johansson
Pages: 81-87

12. Cross-correlations between pairs of neurons in cerebellar cortex in vivo  
Author(s): Fredrik Bengtsson, Pontus Geborek, Henrik Jörntell
Pages: 88-94

13. Using a million cell simulation of the cerebellum: Network scaling and task generality  
Author(s): Wen-Ke Li, Matthew J. Hausknecht, Peter Stone, Michael D. Mauk
Pages: 95-102
   
14. Realtime cerebellum: A large-scale spiking network model of the cerebellum that runs in realtime using a graphics processing unit  
Author(s): Tadashi Yamazaki, Jun Igarashi
Pages: 103-111

15. Modeling the generation of output by the cerebellar nuclei  
Author(s): Volker Steuber, Dieter Jaeger
Pages: 112-119

16. Modeling cancelation of periodic inputs with burst-STDP and feedback  
Author(s): K. Bol, G. Marsat, J.F. Mejias, L. Maler, A. Longtin
Pages: 120-133

17. Adaptive filters and internal models: Multilevel description of cerebellar function  
Author(s): John Porrill, Paul Dean, Sean R. Anderson
Pages: 134-149

Saturday 1 June 2013

Registration for IJCNN 2013 is now open

Registration for IJCNN 2013 is now open. The conference will be held in Dallas, Texas, 4-9 August 2013.

The deadline for early registration is 28 June 2013.


Friday 31 May 2013

Reminder: Nominations for INNS Awards 2014

The deadline for these nominations is very soon!


The International Neural Network Society's Awards Program was established to recognize individuals who have made outstanding contributions to the field of Neural Networks. Up to three awards of $1,000 each, at most one in each category, are presented annually to senior, highly accomplished researchers.

The Hebb, Helmholtz and Gabor Awards:

The Hebb Award - recognizes achievement in biological learning.

The Helmholtz Award - recognizes achievement in sensation/perception.

The Gabor Award - recognizes achievement in engineering/application.

Young Investigator Awards:

Up to two awards of $500 each are presented annually, for significant contributions to the field of Neural Networks, to individuals with no more than five years of postdoctoral experience who are under forty years of age.


Nominations:

1. Nominations of no more than two pages in length should be submitted to the Awards Committee, specifying:

  • The award category (Hebb, Helmholtz, Gabor, or Young Investigator) for which the candidate is being nominated.
  • The reasons for which the nominee should be considered for the award.
  • A list of at least five of the nominee's most important published papers. 
2. The curricula vitae of both the nominee and the nominator must be included with the nomination, giving the name, address, position/title, phone, fax, and e-mail address of each.

3. The nominator must be an INNS member in good standing. Nominees do not have to be INNS members. If an award recipient is not an INNS member, they shall receive a free one-year INNS membership.

4. Nominators may not nominate themselves or their family members.

5. Individuals may not receive the same INNS Award more than once.

All nominations will be considered by the Awards Committee, and selected nominations will be forwarded to the INNS Board of Governors along with the Committee's recommendations for award recipients. Voting shall be performed by the entire Board of Governors.

The Awards Committee:

The INNS Awards Committee consists of the chair (Prof. Leonid Perlovsky) and two other members. All members must be INNS Governors in the year that they are appointed.

Please email 2014 nominations, along with their attachments, directly to the chair of the Awards Committee at leonid@seas.harvard.edu, with a copy to the Secretary of the Society at hava@cs.umass.edu, by June 1, 2013. Please use the subject line “INNS award nomination” in the email.

Neural Networks: New articles 22-28 May 2013

1. The learning problem of multi-layer neural networks
Author(s): Jung-Chao Ban, Chih-Hung Chang

2. Active learning for noisy oracle via density power divergence 
Author(s): Yasuhiro Sogawa, Tsuyoshi Ueno, Yoshinobu Kawahara, Takashi Washio
   
3. Integer sparse distributed memory: Analysis and results  
Author(s): Javier Snaider, Stan Franklin, Steve Strain, E. Olusegun George
   
4. Characterization of seizure-like events recorded in vivo in a mouse model of Rett syndrome  
Author(s): Sinisa Colic, Robert G. Wither, Liang Zhang, James H. Eubanks, Berj L. Bardakjian

Monday 6 May 2013

Nominations for INNS Awards 2014

This call is identical to the reminder posted above on Friday 31 May 2013: up to three $1,000 awards (the Hebb, Helmholtz, and Gabor Awards) and up to two $500 Young Investigator Awards are presented annually, and nominations, due by June 1, 2013, should be emailed to the chair of the Awards Committee at leonid@seas.harvard.edu with a copy to the Secretary of the Society at hava@cs.umass.edu.

Thursday 18 April 2013

Neural Networks Special Issue CFP

Affective and Cognitive Learning Systems for Big Social Data Analysis

 
Guest Editors
Amir Hussain (Lead Guest Editor), University of Stirling, United Kingdom (ahu@cs.stir.ac.uk)
Erik Cambria, National University of Singapore, Singapore (cambria@nus.edu.sg)
Björn Schuller, Technische Universität München, Germany (schuller@tum.de)
Newton Howard, MIT Media Laboratory, USA (nhmit@mit.edu)

Background and Motivation
As the Web rapidly evolves, its users are evolving with it. In an era of social connectedness, people are becoming more and more enthusiastic about interacting, sharing, and collaborating through social networks, online communities, blogs, Wikis, and other online collaborative media. In recent years, this collective intelligence has spread to many different areas, with particular focus on fields related to everyday life such as commerce, tourism, education, and health, causing the size of the Web to expand exponentially. The distillation of knowledge from such a large amount of unstructured information, however, is an extremely difficult task, as the contents of today’s Web are perfectly suitable for human consumption but remain hardly accessible to machines. The opportunity to capture the opinions of the general public about social events, political movements, company strategies, marketing campaigns, and product preferences has raised growing interest both within the scientific community, where it has led to many exciting open challenges, and in the business world, owing to the remarkable benefits to be had in marketing and financial market prediction.
 
Existing approaches to opinion mining mainly rely on parts of text in which sentiment is explicitly expressed, e.g., through polarity terms or affect words (and their co-occurrence frequencies). However, opinions and sentiments are often conveyed implicitly through latent semantics, which makes purely syntactic approaches ineffective. In this light, this Special Issue focuses on the introduction, presentation, and discussion of novel techniques that further develop and apply big data analysis tools to sentiment analysis. A key motivation for this Special Issue, in particular, is to explore the adoption of novel affective and cognitive learning systems that go beyond a mere word-level analysis of natural language text and provide concept-level tools and techniques allowing a more efficient passage from (unstructured) natural language to (structured) machine-processable data, in potentially any domain.
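To make the limitation concrete, here is a minimal Python sketch (not taken from the call; the toy POLARITY lexicon and function name are invented for illustration) of the word-level, polarity-term approach described above. It scores explicitly expressed sentiment correctly but assigns a neutral score to text whose sentiment is only implicit, which is exactly the gap that concept-level analysis aims to close:

    # Toy word-level polarity scorer; POLARITY is a hypothetical
    # stand-in for a real affect-word dictionary.
    POLARITY = {"good": 1.0, "great": 1.0, "love": 1.0,
                "bad": -1.0, "awful": -1.0, "hate": -1.0}

    def word_level_polarity(text):
        """Sum the lexicon polarities of the whitespace-separated tokens."""
        return sum(POLARITY.get(tok, 0.0) for tok in text.lower().split())

    print(word_level_polarity("i love this great phone"))       # 2.0: explicit sentiment found
    print(word_level_polarity("the battery died after a day"))  # 0.0: implicit sentiment missed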
 
Articles are thus invited in areas such as machine learning, weakly supervised learning, active learning, transfer learning, deep neural networks, novel neural and cognitive models, data mining, pattern recognition, knowledge-based systems, information retrieval, natural language processing, and big data computing. Topics include, but are not limited to:

• Machine learning for big social data analysis
• Biologically inspired opinion mining
• Semantic multi-dimensional scaling for sentiment analysis
• Social media marketing
• Social media analysis, representation, and retrieval
• Social network modeling, simulation, and visualization
• Concept-level opinion and sentiment analysis
• Patient opinion mining
• Sentic computing
• Multilingual sentiment analysis
• Time-evolving sentiment tracking
• Cross-domain evaluation
• Domain adaptation for sentiment classification
• Multimodal sentiment analysis
• Multimodal fusion for continuous interpretation of semantics
• Human-agent, -computer, and -robot interaction
• Affective common-sense reasoning
• Cognitive agent-based computing
• Image analysis and understanding
• User profiling and personalization
• Affective knowledge acquisition for sentiment analysis

The Special Issue also welcomes papers on specific application domains of big social data analysis, e.g., influence networks, customer experience management, intelligent user interfaces, multimedia management, computer-mediated human-human communication, enterprise feedback management, surveillance, and art. Authors are required to follow the Author’s Guide for manuscript submission to Elsevier Neural Networks.

Timeframe
Call for Papers out: April 2013
Submission Deadline: August 1, 2013
Notification of Acceptance: November 1, 2013
Final Manuscripts Due: December 1, 2013
Date of Publication: March 2014

Composition and Review Procedures
The Elsevier Neural Networks Special Issue on Affective and Cognitive Learning Systems for Big Social Data Analysis will consist of papers on novel methods that further develop and apply big data analysis tools and techniques to opinion mining and sentiment analysis; some papers may instead survey aspects of the topic, and the balance between the two will be adjusted to maximize the issue’s impact. All articles must pass the standard review procedures for Elsevier Neural Networks. Authors are required to use the Elsevier Neural Networks templates and to submit their manuscripts at http://ees.elsevier.com/neunet.

Monday 8 April 2013

Neural Networks: New articles 2-8 April 2013

1. Discriminant subspace learning constrained by locally statistical uncorrelation for face recognition
Author(s): Yu Chen, Wei-Shi Zheng, Xiao-Hong Xu, Jian-Huang Lai
Pages: 28-43

2. Probabilistic DHP adaptive critic for nonlinear stochastic control systems
Author(s): Randa Herzallah
Pages: 74-82

3. Learning in compressed space
Author(s): Alexander Fabisch, Yohannes Kassahun, Hendrik Wöhrle, Frank Kirchner
Pages: 83-93

4. Wavelet neural networks: A practical guide
Author(s): Antonios K. Alexandridis, Achilleas D. Zapranis
Pages: 1-27

5. A model of analogue winners-take-all neural circuit
Author(s): Pavlo V. Tymoshchuk
Pages: 44-61

6. Synthesis of high-complexity rhythmic signals for closed-loop electrical neuromodulation
Author(s): Osbert C. Zalay, Berj L. Bardakjian
Pages: 62-73

Monday 18 February 2013

Deadline extended for IJCNN 2013

The deadline for submitting papers to the International Joint Conference on Neural Networks (IJCNN) 2013 has been extended to 1 March 2013. Proposals for tutorials, workshops and panel sessions will also be accepted until this date. This conference will be held in Dallas, Texas, 4-9 August 2013.