re: Is Deep Learning a Dead End?


The question is whether such huge computing resources are required to achieve artificial general intelligence. Simple observation of natural systems suggests that a neural net can be trained without huge data centers and megawatts of power. There is no doubt that backpropagation reaches the same end point, a trained neural network, but it is not a fast learner.

When I talk about learning speed I mean how many observations or experiences it takes to learn something. For example, the unit of experience might be a single game of Go. For a human, learning Go might take perhaps 500 games. For AlphaZero it took hundreds of millions of games. Because machines operate faster they can play vastly more games and thus accumulate far more experience, but their ability to learn from each game is minuscule.
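To put rough numbers on that gap, here is a back-of-the-envelope sketch. The figures are only the illustrative estimates used above (500 games for a human, a few hundred million for machine self-play), not measurements:

```python
# Illustrative sample-efficiency comparison (assumed, order-of-magnitude numbers).
human_games = 500               # estimate from the text: games for a human to learn Go
machine_games = 300_000_000     # assumed stand-in for "hundreds of millions"

# Per-experience learning gap: how many games the machine needs
# for every one game the human needs.
ratio = machine_games / human_games
print(f"~{ratio:,.0f}x more games per comparable level of skill")
```

With these assumptions the machine consumes on the order of 600,000 times as many experiences, which is the sense in which its learning per game is minuscule even though its total experience is vastly larger.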

My observation is simply that natural systems point toward a better learning solution: one which, once bootstrapped, can learn from a single experience versus the hundreds or thousands required by current ANNs.

Deep Learning works, but it is the Model T Ford of cars, or the first aircraft at Kitty Hawk. My article argued that many people have taken to heart the idea that machine learning requires big data and huge data centers, simply because the only implementations we have to date share those features.

There is a weakness in my appeal to nature: engineered solutions often surpass natural ones, just as modern aircraft vastly surpass birds. But until we achieve comparable learning performance, there is something to be gained by taking inspiration from nature.
