 

Can't approximate simple multiplication function in neural network with 1 hidden layer

I just wanted to test how well a neural network can approximate a multiplication function (regression task). I am using Azure Machine Learning Studio. I have 6500 samples, 1 hidden layer (I have tested 5 / 30 / 100 neurons per hidden layer), no normalization, and the default parameters: learning rate 0.005, number of learning iterations 200, initial learning weight 0.1, momentum 0 [description]. I got extremely bad accuracy, close to 0. At the same time, boosted decision forest regression shows a very good approximation.

What am I doing wrong? This task should be very easy for NN.
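For reference, the setup is easy to reproduce outside Azure ML Studio. Below is a minimal sketch using scikit-learn's `MLPRegressor` as a stand-in (an assumption; the question uses Azure's GUI modules): one hidden layer, the question's learning rate of 0.005, but with inputs kept in a normalized range and the iteration budget raised well past 200.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# 6500 samples of x1 * x2, with inputs already in [-1, 1] (i.e. normalized)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(6500, 2))
y = X[:, 0] * X[:, 1]

# One hidden layer of 100 neurons and learning rate 0.005, as in the question;
# max_iter is raised well beyond 200 so training can actually converge
model = MLPRegressor(hidden_layer_sizes=(100,), activation="relu",
                     learning_rate_init=0.005, max_iter=3000, random_state=0)
model.fit(X, y)
print(f"R^2 on training data: {model.score(X, y):.3f}")
```

Under these conditions the fit is good, which suggests the problem lies in the configuration (normalization, iteration count, activations) rather than the task itself.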

asked Oct 27 '25 by Brans Ds

1 Answer

Some things to check:

  1. Your output layer should have a linear activation function. If it's sigmoidal, it won't be able to represent values outside its range (e.g. -1 to 1).
  2. You should use a loss function that's appropriate for regression (e.g. squared error)
  3. If your hidden layer uses sigmoidal activation functions, check that you're not saturating them. Multiplication can act on arbitrarily small/large values, and passing a large number as input can cause saturation, which loses gradient information. If using ReLUs, make sure they're not getting stuck at 0 on all examples (although activations will generally be sparse on any given example).
  4. Check that your training procedure is working as intended. Plot the error over time during training. How does it look? Are your gradients well behaved or are they blowing up? One source of problems can be the learning rate being set too high (unstable error, exploding gradients) or too low (very slow progress, error doesn't decrease quickly enough).
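To illustrate the saturation issue from point 3: the sigmoid's derivative, sigmoid(z) * (1 - sigmoid(z)), is what backpropagation multiplies by, and it collapses toward zero as |z| grows. A quick numpy check (the input values here are arbitrary, chosen just to show the scale of the effect):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Once |z| is large the unit is saturated: the output is pinned near 0 or 1
# and almost no gradient flows back through it during training
for z in [0.0, 2.0, 10.0, 50.0]:
    s = sigmoid(z)
    print(f"z={z:5.1f}  sigmoid={s:.6f}  gradient={s * (1 - s):.2e}")
```

This is why unnormalized inputs (large raw products) can stall a sigmoid network, while rescaling inputs to a small range keeps the units in the responsive part of the curve.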
answered Oct 30 '25 by user20160

