A study found that artificial neural networks can evolve to perform tasks without weight training.
Such networks usually learn by adjusting the weights, or strengths, of the connections between computing elements (neurons); the new technique instead searches for network architectures whose performance does not depend on the particular weight values.
Starting with a population of simple networks that connect inputs to behavioral outputs, the process assesses each network's performance on a given task, retains the best performers, and mutates them by adding a neuron or a link, or by adjusting a neuron's sensitivity to the sum of its inputs.
A single shared random value is then assigned to all of a network's weights, yielding Weight Agnostic Neural Networks.
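The shared-weight evaluation step can be illustrated with a minimal sketch. The network representation, the weight values sampled, and the simplicity penalty below are all illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch of shared-weight evaluation: every connection in the
# network uses the SAME weight value, so only the topology matters.

def forward(links, n_nodes, inputs, weight):
    """Feed-forward pass over a list of (src, dst) links, assumed to be
    in topological order; the last node is treated as the output."""
    vals = [0.0] * n_nodes
    for i, x in enumerate(inputs):
        vals[i] = x
    for src, dst in links:
        vals[dst] += weight * vals[src]  # one shared weight for all links
    return vals[-1]

def score(links, n_nodes, task_cases, shared_weights=(-2, -1, -0.5, 0.5, 1, 2)):
    """Score a topology by average performance over several shared weight
    values (so no single weight can be tuned), minus a simplicity penalty
    that rewards smaller networks."""
    perf = 0.0
    for w in shared_weights:
        for inputs, target in task_cases:
            perf -= abs(forward(links, n_nodes, inputs, w) - target)
    perf /= len(shared_weights) * len(task_cases)
    return perf - 0.01 * len(links)  # illustrative simplicity penalty
```

A topology that scores well here does so across all sampled weight values at once, which is the sense in which the resulting network is weight-agnostic.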
These networks earn points for task execution and simplicity, and their performance on three simulated tasks—driving a racecar, making a two-legged robot walk, and controlling a wheeled cart to balance a pole—was comparable to that of standard networks trained through experience.
From IEEE Spectrum
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA