IEEE Computational Intelligence Magazine
Special Issue on

Evolutionary Neural Architecture Search and Applications

Background:

Deep neural networks have shown highly promising performance in addressing real-world problems. The achievements of such algorithms are largely owed to their deep architectures. However, designing an optimal deep architecture for a particular problem requires rich domain knowledge of both the investigated data and neural networks, which end-users do not necessarily possess. Neural Architecture Search (NAS), an emerging technique for automatically designing optimal deep architectures without such expertise, is drawing increasing attention from industry and academia. However, NAS is in principle a non-convex and non-differentiable optimization problem, and existing exact methods are incapable of addressing it well.

Evolutionary computation (EC) approaches have shown superiority in addressing real-world problems, due largely to their powerful abilities to search for global optima, to deal with non-convex/non-differentiable problems, and to operate without rich domain knowledge. EC has been successfully applied to NAS for shallow and medium-scale neural networks, while only limited work exists on deep neural networks. The goal of this Special Issue is to investigate both new methods and applications showing how EC can promote deep neural architecture search with little or no human expertise.
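To make the EC-for-NAS idea concrete, the following is a minimal, purely illustrative sketch of an evolutionary architecture search loop. It is not drawn from any specific method: the variable-length encoding (a list of layer widths), the surrogate fitness function, and all parameter values are hypothetical stand-ins for a real search over trained networks.

```python
import random

random.seed(0)

# Hypothetical surrogate fitness: a cheap stand-in for validation
# accuracy that rewards depth and width but penalizes parameter count.
def fitness(arch):
    params = sum(a * b for a, b in zip(arch, arch[1:]))
    return len(arch) * sum(arch) / (1.0 + 0.001 * params)

# Mutation for variable-length individuals: widen a layer,
# insert a new layer, or delete one.
def mutate(arch):
    arch = arch[:]
    op = random.choice(["widen", "add", "remove"])
    if op == "widen":
        i = random.randrange(len(arch))
        arch[i] = max(1, arch[i] + random.choice([-8, 8]))
    elif op == "add":
        arch.insert(random.randrange(len(arch) + 1), random.choice([16, 32, 64]))
    elif op == "remove" and len(arch) > 1:
        arch.pop(random.randrange(len(arch)))
    return arch

# One-point crossover that tolerates parents of different lengths.
def crossover(a, b):
    ia = random.randrange(1, len(a) + 1)
    ib = random.randrange(1, len(b) + 1)
    return a[:ia] + b[ib:]

def evolve(pop_size=20, generations=30):
    # Random initial population of variable-length architectures.
    pop = [[random.choice([16, 32, 64]) for _ in range(random.randint(1, 4))]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection followed by crossover and mutation.
        def select():
            return max(random.sample(pop, 3), key=fitness)
        children = [mutate(crossover(select(), select())) for _ in range(pop_size)]
        # Elitist replacement: keep the best of parents plus children.
        pop = sorted(pop + children, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)

best = evolve()
print("best architecture:", best)
```

In a real evolutionary NAS system, `fitness` would involve (partially) training each candidate network, which is exactly why the topics below include surrogate-assisted search and fitness acceleration methods.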

Topics:

  • Encoding strategies for both supervised and unsupervised deep neural networks, such as CNNs, autoencoders, DBNs, GANs, LSTMs, etc.

  • Novel EC methods for automatically designing encoding strategies

  • Efficient genetic operators for variable-length individuals

  • Multi- and many-objective evolutionary neural architecture search

  • Evolutionary constrained neural architecture search

  • Evolutionary bi-level neural architecture search

  • Evolutionary methods for differentiable neural architecture search

  • Evolutionary transfer learning for neural architecture search

  • Surrogate-assisted evolutionary neural architecture search

  • Fitness evaluation acceleration methods for evolutionary neural architecture search

  • EC-based interpretable neural architecture search

  • Evolutionary fuzzy neural architecture search

  • Small-sample evolutionary neural architecture search

  • Computational efficiency and scalability of evolutionary neural architecture search algorithms

  • Real-world applications of evolutionary neural architecture search, e.g., image sequences, image analysis, face recognition, speech/audio/signal recognition and processing, big data and data analytics, pattern recognition, text mining and natural language processing, health and medical data analysis, network/software/cyber security, renewable/sustainable energy, engineering problems, and financial and business data analysis, etc.

Timeline:

  • Deadline for paper submission: August 30, 2020.

  • First notification for authors: November 15, 2020.

  • Deadline for revised paper submission: December 15, 2020.

  • Final notification for authors: January 31, 2021.

Guest Editors:

  • Dr. Yanan Sun, Sichuan University, China (Email: ysun@scu.edu.cn)

  • Prof. Mengjie Zhang, Victoria University of Wellington, New Zealand (Email: mengjie.zhang@ecs.vuw.ac.nz)

  • Prof. Gary G. Yen, Oklahoma State University, USA (Email: gyen@okstate.edu)