TensorFlow - How to display accuracy rate for a linear regression model

I have a linear regression model that seems to work. I first load the data into X and the target column into Y; after that, I implement the following...

# Assumed imports for this snippet (TensorFlow 1.x API)
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

X_train, X_test, Y_train, Y_test = train_test_split(
    X_data, 
    Y_data, 
    test_size=0.2
)

rng = np.random

n_rows = X_train.shape[0]

X = tf.placeholder("float")
Y = tf.placeholder("float")


W = tf.Variable(rng.randn(), name="weight")
b = tf.Variable(rng.randn(), name="bias")

pred = tf.add(tf.multiply(X, W), b)

cost = tf.reduce_sum(tf.pow(pred-Y, 2)/(2*n_rows))

optimizer = tf.train.GradientDescentOptimizer(FLAGS.learning_rate).minimize(cost)



init = tf.global_variables_initializer()
init_local = tf.local_variables_initializer()

with tf.Session() as sess:

    sess.run([init, init_local])

    for epoch in range(FLAGS.training_epochs):

        avg_cost = 0

        for (x, y) in zip(X_train, Y_train):

            sess.run(optimizer, feed_dict={X:x, Y:y})

        # display logs per epoch step
        if (epoch + 1) % FLAGS.display_step == 0:

            c = sess.run(
                cost, 
                feed_dict={X:X_train, Y:Y_train}
            )

            print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(c))

    print("Optimization Finished!")


I cannot figure out how to print out the model's accuracy. For example, in sklearn it is simple: if you have a model, you just print model.score(X_test, Y_test). But I do not know how to do this in TensorFlow, or whether it is even possible.
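Roughly like this, with a hypothetical LinearRegression model (for regressors, score returns the R^2 of the predictions):

from sklearn.linear_model import LinearRegression

model = LinearRegression()
model.fit(X_train, Y_train)
print(model.score(X_test, Y_test))  # R^2 on the test set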

I think I'd be able to calculate the Mean Squared Error. Does this help in any way?

EDIT

I tried tf.metrics.accuracy as suggested in the comments, but I'm having an issue implementing it. The documentation says it takes two arguments, labels and predictions, so I tried the following...

accuracy, accuracy_op = tf.metrics.accuracy(labels=tf.argmax(Y_test, 0), predictions=tf.argmax(pred, 0))

print(sess.run(accuracy))

But this gives me an error...

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value accuracy/count
    [[Node: accuracy/count/read = Identity[T=DT_FLOAT, _class=["loc:@accuracy/count"], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]]

How exactly does one implement this?


2 Answers

Turns out, since this is a linear regression problem and not a classification problem, tf.metrics.accuracy is not the right approach.

Instead of displaying the accuracy of my model as a percentage, I focused on reducing the Mean Squared Error (MSE).

From looking at other examples, tf.metrics.accuracy is never used for linear regression, only for classification. For regression, tf.metrics.mean_squared_error is the right approach.

I implemented two ways of calculating the total MSE of my predictions against my testing data...

pred = tf.add(tf.matmul(X, W), b)  # matmul rather than multiply: X is a matrix of features
...
...
Y_pred = sess.run(pred, feed_dict={X: X_test})    # predictions for the test set
mse = tf.reduce_mean(tf.square(Y_pred - Y_test))  # mean squared error as a tensor
print("MSE:", sess.run(mse))

OR

mse, mse_op = tf.metrics.mean_squared_error(labels=Y_test, predictions=Y_pred)

They both compute the same value, but the second approach is obviously more concise.
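One caveat with the second approach: tf.metrics.mean_squared_error returns a (value, update_op) pair and keeps its running totals in local variables, so the value can only be read after initializing those variables and running the update op. A self-contained sketch, with small hypothetical arrays standing in for the real Y_test and Y_pred:

import numpy as np
import tensorflow as tf  # TF 1.x API

# Hypothetical stand-ins for the real test data
Y_test = np.array([1.0, 2.0, 3.0], dtype=np.float32)
Y_pred = np.array([1.1, 1.9, 3.2], dtype=np.float32)

mse, mse_op = tf.metrics.mean_squared_error(labels=Y_test, predictions=Y_pred)

with tf.Session() as sess:
    sess.run(tf.local_variables_initializer())  # the metric's counters are local variables
    sess.run(mse_op)                            # accumulate the squared errors
    print("MSE:", sess.run(mse))                # -> ~0.02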

There's a good explanation of how to measure the accuracy of a Linear Regression model here.
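If you want a single score comparable to sklearn's model.score, the usual choice for regression is the coefficient of determination R^2 = 1 - SS_res / SS_tot, which is easy to compute with plain TensorFlow ops. A sketch, again with hypothetical arrays in place of the real test data:

import numpy as np
import tensorflow as tf  # TF 1.x API

# Hypothetical stand-ins for the real test data
Y_test = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
Y_pred = np.array([1.1, 1.9, 3.2, 3.8], dtype=np.float32)

ss_res = tf.reduce_sum(tf.square(Y_test - Y_pred))           # residual sum of squares
ss_tot = tf.reduce_sum(tf.square(Y_test - np.mean(Y_test)))  # total sum of squares
r_squared = 1.0 - ss_res / ss_tot                            # this is what sklearn's score returns

with tf.Session() as sess:
    print("R^2:", sess.run(r_squared))  # -> 0.98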


I didn't think this was clear at all from the TensorFlow documentation, but you have to declare the accuracy operation, and then initialize all global and local variables, before you run the accuracy calculation:

accuracy, accuracy_op = tf.metrics.accuracy(labels=tf.argmax(Y_test, 0), predictions=tf.argmax(pred, 0))
# ...
init_global = tf.global_variables_initializer()  # note the parentheses: these calls return ops
init_local = tf.local_variables_initializer()
sess.run([init_global, init_local])
# ...
sess.run(accuracy_op)      # run the update op to accumulate the counts
print(sess.run(accuracy))  # then read the accuracy value

I read something on Stack Overflow about the accuracy calculation using local variables, which is why the local variable initializer is necessary.
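Putting it all together, a minimal self-contained sketch of that ordering, with hypothetical one-hot labels and predictions rather than the asker's data (note the axis=1 argmax, taking the class index for each row):

import numpy as np
import tensorflow as tf  # TF 1.x API

# Hypothetical one-hot labels and predicted scores, one example per row
Y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]], dtype=np.float32)
Y_hat = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.3, 0.3, 0.4]], dtype=np.float32)

# 1. Declare the metric; this creates the local variables (total, count)
accuracy, accuracy_op = tf.metrics.accuracy(
    labels=tf.argmax(Y_true, axis=1),
    predictions=tf.argmax(Y_hat, axis=1))

# 2. Initialize global AND local variables
init_global = tf.global_variables_initializer()
init_local = tf.local_variables_initializer()

with tf.Session() as sess:
    sess.run([init_global, init_local])
    sess.run(accuracy_op)      # 3. Run the update op first...
    print(sess.run(accuracy))  # ...then read the value -> 1.0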



