Neural Network Initialization - Nguyen Widrow Implementation?

I've had a go at implementing the Nguyen Widrow algorithm (below) and it appears to function correctly, but I have some follow-on questions:

  • Does this look like a correct implementation?

  • Does Nguyen Widrow initialization apply to any network topology/size? (e.g. a 5-layer autoencoder)

  • Is Nguyen Widrow initialization valid for any input range? (0/1, -1/+1, etc.)

  • Is Nguyen Widrow initialization valid for any activation function? (e.g. logistic, tanh, linear)

The code below assumes that the network's weights have already been randomized to the range -1/+1:

        ' Calculate the number of hidden neurons (total minus input and output neurons)
        Dim HiddenNeuronsCount As Integer = Me.TotalNeuronsCount - (Me.InputsCount + Me.OutputsCount)

        ' Calculate the Beta value for all hidden layers
        Dim Beta As Double = (0.7 * Math.Pow(HiddenNeuronsCount, (1.0 / Me.InputsCount)))

        ' Loop through each layer in neural network, skipping input layer
        For i As Integer = 1 To Layers.GetUpperBound(0)

            ' Loop through each neuron in layer
            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

                Dim InputsNorm As Double = 0

                ' Loop through each weight in neuron inputs, add weight value to InputsNorm
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    InputsNorm += Layers(i).Neurons(j).ConnectionWeights(k) * Layers(i).Neurons(j).ConnectionWeights(k)
                Next

                ' Add bias value to InputsNorm
                InputsNorm += Layers(i).Neurons(j).Bias * Layers(i).Neurons(j).Bias

                ' Finalize euclidean norm calculation
                InputsNorm = Math.Sqrt(InputsNorm)

                ' Loop through each weight in neuron inputs, scale the weight based on euclidean norm and beta
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) = (Beta * Layers(i).Neurons(j).ConnectionWeights(k)) / InputsNorm
                Next

                ' Scale the bias based on euclidean norm and beta
                Layers(i).Neurons(j).Bias = (Beta * Layers(i).Neurons(j).Bias) / InputsNorm

            Next

        Next
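For comparison, here is a minimal language-agnostic sketch of the same per-layer scaling in Python/NumPy (names and array shapes are illustrative, not from the VB.NET code above). The key invariant it shows: after scaling, the Euclidean norm of each neuron's weight-plus-bias vector equals beta.

```python
import numpy as np

def nguyen_widrow(weights, biases, n_inputs, n_hidden):
    """Apply Nguyen-Widrow scaling in place to one layer.

    weights: (n_neurons, n_inputs) array, pre-randomized in [-1, +1]
    biases:  (n_neurons,) array, pre-randomized in [-1, +1]
    """
    # Beta depends on the hidden-neuron count and the number of inputs
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)
    for j in range(weights.shape[0]):
        # Euclidean norm over this neuron's weights and its bias
        norm = np.sqrt(np.sum(weights[j] ** 2) + biases[j] ** 2)
        # Scale so the combined weight/bias vector has norm beta
        weights[j] *= beta / norm
        biases[j] *= beta / norm
    return weights, biases
```

Because each neuron's vector is divided by its own norm and multiplied by beta, every neuron ends up with the same combined norm, which spreads the neurons' active regions across the input space.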
Satellite asked Mar 24 '26 12:03
1 Answer

Nguyen and Widrow assume in their paper that the inputs lie between -1 and +1. Nguyen Widrow initialization is valid for any activation function with a finite output range. Their paper only analyzes a 2-layer network, so I'm not sure about a 5-layer one.
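Since the derivation assumes inputs in [-1, +1], data on a different range should be rescaled before training. A quick min-max rescaling sketch (illustrative, not part of the original answer; it assumes the feature is not constant, i.e. max > min):

```python
def rescale(xs, lo=-1.0, hi=1.0):
    """Min-max rescale a list of values into [lo, hi].

    Assumes max(xs) > min(xs); a constant feature would divide by zero.
    """
    x_min, x_max = min(xs), max(xs)
    span = x_max - x_min
    return [lo + (hi - lo) * (x - x_min) / span for x in xs]
```

For example, `rescale([0, 5, 10])` maps the endpoints to -1 and +1 and the midpoint to 0.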


user1977150 answered Mar 26 '26 02:03
