How do you export a frozen inference graph in Tensorflow 2.x Object Detection API?

I've been following these tutorials to train a custom object detection model using the TensorFlow 2.x Object Detection API. These are the two main resources I was using:

https://github.com/tensorflow/models/tree/master/research/object_detection
https://tensorflow-object-detection-api-tutorial.readthedocs.io/en/latest/training.html

Everything works up until I try to export the trained inference graph. In TensorFlow 1.x, there is a script, https://github.com/tensorflow/models/blob/master/research/object_detection/export_inference_graph.py, which exports the trained model checkpoints to a single frozen inference graph.

In TensorFlow 2.x, this script no longer works. Instead, we use https://github.com/tensorflow/models/blob/master/research/object_detection/exporter_main_v2.py, which outputs a SavedModel directory and some other files, but not a frozen inference graph, because frozen graphs are deprecated in TF 2.x.
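For reference, the TF2 export step I'm running looks roughly like this (all paths here are placeholders for my own directories):

```shell
python exporter_main_v2.py \
    --input_type image_tensor \
    --pipeline_config_path path/to/pipeline.config \
    --trained_checkpoint_dir path/to/checkpoint_dir \
    --output_directory path/to/exported_model
```

This produces a saved_model/ directory (plus a checkpoint and the pipeline config) under the output directory, but no frozen .pb graph.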

I want to be able to produce the same kind of frozen inference graph that TensorFlow 1 gave, but from TensorFlow 2. I tried following this post, https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/, but I ran into a "_UserObject has no attribute 'inputs'" error.

Does anyone know how I can work around this error, or if there are any other solutions to export an object detection SavedModel into a single frozen inference graph?

Asked Dec 06 '25 by William Jiang
1 Answer

In TF2, tf.saved_model.load returns a _UserObject that no longer exposes an inputs attribute, so we have to go through model.signatures instead. The code from that blog post changes as follows:

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

# Load the SavedModel exported by exporter_main_v2.py
model = tf.saved_model.load("path/to/saved_model")

# Convert the loaded model's serving signature to a ConcreteFunction
full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    tf.TensorSpec(model.signatures['serving_default'].inputs[0].shape,
                  model.signatures['serving_default'].inputs[0].dtype))

# Get frozen ConcreteFunction
frozen_func = convert_variables_to_constants_v2(full_model)
frozen_func.graph.as_graph_def()

# Check if we can access layers in the converted model
layers = [op.name for op in frozen_func.graph.get_operations()]
print("-" * 50)
print("Frozen model layers: ")
for layer in layers:
    print(layer)

print("-" * 50)
print("Frozen model inputs: ")
print(frozen_func.inputs)
print("Frozen model outputs: ")
print(frozen_func.outputs)

# Save frozen graph from frozen ConcreteFunction to hard drive
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
                  logdir="./frozen_models",
                  name="frozen_graph.pb",
                  as_text=False)
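To run inference from the frozen graph written above, you can wrap its GraphDef back into a ConcreteFunction, following the same blog post. Below is a minimal, self-contained sketch that freezes a toy Keras model in memory as a stand-in for the detection model; in a real run, graph_def would instead come from parsing ./frozen_models/frozen_graph.pb with tf.compat.v1.GraphDef().ParseFromString(...):

```python
import numpy as np
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

def wrap_frozen_graph(graph_def, inputs, outputs):
    """Import a frozen GraphDef and prune it into a callable ConcreteFunction."""
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph
    return wrapped_import.prune(
        tf.nest.map_structure(import_graph.as_graph_element, inputs),
        tf.nest.map_structure(import_graph.as_graph_element, outputs))

# Toy stand-in for the exported detection model
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(2)])
full_model = tf.function(lambda x: model(x)).get_concrete_function(
    tf.TensorSpec([1, 3], tf.float32))
frozen_func = convert_variables_to_constants_v2(full_model)
graph_def = frozen_func.graph.as_graph_def()

# Tensor names match the "Frozen model inputs/outputs" printout above
in_name = frozen_func.inputs[0].name    # e.g. "x:0"
out_name = frozen_func.outputs[0].name  # e.g. "Identity:0"

reloaded = wrap_frozen_graph(graph_def, inputs=in_name, outputs=out_name)
print(reloaded(tf.constant(np.ones((1, 3), np.float32))).shape)
```

The pruned function is a drop-in replacement for a TF1 session run: you feed it a tensor and get the graph's output back, with all variables already baked in as constants.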

Answered Dec 11 '25 by Nishant


