Implementing heaviside step function in TensorFlow

I want to create a Heaviside step function in TensorFlow. Since the Heaviside function is not differentiable, I also need to choose a derivative approximation and define a custom gradient, so the full implementation looks like this:

import tensorflow as tf


@tf.RegisterGradient("HeavisideGrad")
def _heaviside_grad(op: tf.Operation, grad: tf.Tensor):
    x = op.inputs[0]
    # During backpropagation heaviside behaves like sigmoid
    return tf.sigmoid(x) * (1 - tf.sigmoid(x)) * grad


def heaviside(x: tf.Tensor, g: tf.Graph = None):
    # Resolve the default graph at call time; a default argument of
    # tf.get_default_graph() would be frozen at function definition time.
    if g is None:
        g = tf.get_default_graph()
    custom_grads = {
        "Sign": "HeavisideGrad"
    }
    with g.gradient_override_map(custom_grads):
        # TODO: heaviside(0) currently returns 0. We need heaviside(0) = 1
        sign = tf.sign(x)
        # tf.stop_gradient is needed to exclude tf.maximum from the derivative
        step_func = sign + tf.stop_gradient(tf.maximum(0.0, sign) - sign)
        return step_func

There is one caveat in my implementation: tf.sign(0) returns zero, so heaviside(0) also returns zero, but I want heaviside(0) to return 1. How can I achieve this behavior?
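
For example, running the code above in TensorFlow 1.x shows the problem at zero:

x = tf.constant([-1.5, 0.0, 1.5])
y = heaviside(x)

with tf.Session() as sess:
    print(sess.run(y))  # [0. 0. 1.] -- I want the middle value to be 1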

asked Feb 01 '26 by NShiny

1 Answer

A very hacky way would be to use

1 - max(0.0, sign(-x)) 

as your step function instead of

max(0.0, sign(x))
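
An untested sketch of how that could slot into your existing heaviside, keeping your Sign override (the -sign term carries the gradient, and since the logistic derivative is symmetric the backward pass matches your original):

def heaviside(x: tf.Tensor, g: tf.Graph = None):
    if g is None:
        g = tf.get_default_graph()
    with g.gradient_override_map({"Sign": "HeavisideGrad"}):
        sign = tf.sign(-x)
        # Forward value: 1 - max(0, sign(-x)), which is 1 at x == 0.
        # Only -sign carries gradient; your Sign override combined with
        # the Neg gradient gives sigmoid(-x) * (1 - sigmoid(-x)) * grad,
        # which equals sigmoid(x) * (1 - sigmoid(x)) * grad by symmetry.
        return -sign + tf.stop_gradient(1.0 - tf.maximum(0.0, sign) + sign)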

Another option would be to use greater_equal and cast the result to your desired type, then override its gradient with the sigmoid override you already have.
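
One caveat with that second option: as far as I know, gradients don't propagate back through a cast from bool, so gradient_override_map has nothing to attach to on greater_equal itself. A sketch that gets the same forward value but reuses the stop_gradient trick from your question, with tf.sigmoid as the surrogate (no override map needed):

def heaviside(x: tf.Tensor) -> tf.Tensor:
    # Exact forward value, with heaviside(0) == 1.
    step = tf.cast(tf.greater_equal(x, 0.0), x.dtype)
    # Backward pass: gradients flow through tf.sigmoid only, giving
    # sigmoid(x) * (1 - sigmoid(x)) * grad, same as your custom gradient.
    return tf.sigmoid(x) + tf.stop_gradient(step - tf.sigmoid(x))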

answered Feb 03 '26 by etarion