Neuroevolution: in the shadows, but not giving up

Jun 13, 2016 23:20

Several interesting papers on neuroevolution have come out:
-- Convolution by Evolution: Differentiable Pattern Producing Networks, http://arxiv.org/abs/1606.02580. Our main result is that DPPNs can be evolved/trained to compress the weights of a denoising autoencoder from 157684 to roughly 200 parameters, while achieving a reconstruction accuracy comparable to a fully connected network with more than two orders of magnitude more parameters. (A sketch of this weight-compression idea appears after this list.)
-- Simple Evolutionary Optimization Can Rival Stochastic Gradient Descent in Neural Networks, http://eplex.cs.ucf.edu/publications/2016/morse-gecco16 (though I have already shared this link). Using this approach, even a simple evolutionary algorithm (called the limited evaluation EA, or LEEA) is competitive with the state-of-the-art SGD variant RMSProp on several benchmarks with neural networks of over 1,000 weights. (A sketch of a minibatch-evaluated EA in this spirit also appears after this list.)
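
To make the compression result more concrete, here is a minimal sketch of the idea behind it, assuming nothing about the actual DPPN architecture from the paper: a tiny "pattern" network maps the coordinates of each weight in a large layer to that weight's value, so only the pattern network's few dozen parameters need to be stored or evolved. All sizes and names below are illustrative.

```python
# Minimal sketch of coordinate-based weight generation (CPPN/DPPN-style idea):
# instead of storing every weight of a large layer, a tiny pattern network
# maps normalized (row, col) coordinates to weight values. Only the pattern
# network's parameters (~65 here) need to be kept, however large the layer is.
# This illustrates the concept only, not the paper's DPPN architecture.
import numpy as np

rng = np.random.default_rng(0)

# Pattern network: 2 inputs -> 16 hidden -> 1 output.
# Parameter count: 2*16 + 16 + 16*1 + 1 = 65, independent of the target layer size.
W1 = rng.normal(size=(2, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)) * 0.5
b2 = np.zeros(1)

def generate_weights(n_in, n_out):
    """Generate an (n_in, n_out) weight matrix from the tiny pattern network."""
    rows = np.linspace(-1.0, 1.0, n_in)
    cols = np.linspace(-1.0, 1.0, n_out)
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    coords = np.stack([rr.ravel(), cc.ravel()], axis=1)   # (n_in*n_out, 2)
    hidden = np.tanh(coords @ W1 + b1)
    values = np.tanh(hidden @ W2 + b2)
    return values.reshape(n_in, n_out)

# ~65 pattern parameters describe a 784x200 = 156,800-weight layer.
big_layer = generate_weights(784, 200)
print(big_layer.shape)   # (784, 200)
```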

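And a minimal sketch of evolving network weights with a plain evolutionary algorithm, in the spirit of (but much simpler than) LEEA: fitness is estimated on a small random minibatch each generation rather than on the full dataset. The task, network size and hyperparameters are arbitrary choices for illustration, not the paper's setup.

```python
# Minimal mutation-only EA over the flat weight vector of a tiny network.
# The "limited evaluation" flavor: fitness per individual is estimated on a
# small random minibatch, not the whole dataset. Illustration only, not LEEA.
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task: learn y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, size=(512, 1))
Y = np.sin(X)

HIDDEN = 16
N_PARAMS = 1 * HIDDEN + HIDDEN + HIDDEN * 1 + 1   # weights + biases = 49

def forward(params, x):
    """Tiny 1-16-1 tanh network read out of a flat parameter vector."""
    i = 0
    W1 = params[i:i + HIDDEN].reshape(1, HIDDEN); i += HIDDEN
    b1 = params[i:i + HIDDEN];                    i += HIDDEN
    W2 = params[i:i + HIDDEN].reshape(HIDDEN, 1); i += HIDDEN
    b2 = params[i:i + 1]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(params, batch_size=32):
    """Negative MSE on a small random minibatch ("limited evaluation")."""
    idx = rng.integers(0, len(X), size=batch_size)
    pred = forward(params, X[idx])
    return -np.mean((pred - Y[idx]) ** 2)

POP, ELITE, SIGMA = 50, 10, 0.05
population = [rng.normal(0, 0.5, N_PARAMS) for _ in range(POP)]

for gen in range(300):
    scored = sorted(population, key=fitness, reverse=True)
    elite = scored[:ELITE]
    # Next generation: keep the elite, fill the rest with mutated copies of them.
    population = list(elite) + [
        elite[rng.integers(ELITE)] + rng.normal(0, SIGMA, N_PARAMS)
        for _ in range(POP - ELITE)
    ]

best = max(population, key=fitness)
pred = forward(best, X)
print("full-dataset MSE:", float(np.mean((pred - Y) ** 2)))
```
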
The harvest of this year's papers will, of course, be gathered by December 8-9, 2016, at ICERN 2016: 18th International Conference on Evolutionary Robotics and Neuroevolution, https://www.waset.org/conference/2016/12/rome/ICERN/home.