How to save model in .pb format and then load it for inference in Tensorflow?

Written by Aionlinecourse

TensorFlow is a powerful open-source machine-learning framework that provides a flexible and efficient platform for building and deploying models. Saving and loading models is a crucial part of using TensorFlow. In this blog post, we will go over how to save a model in the .pb (SavedModel) format and then load it back for inference.

When working on machine learning projects, it is common to spend a large amount of time and computational power on model training. Once the model is trained, you want to use it to make predictions on new data. Saving and loading models is essential so that the trained model can be reused whenever you need predictions, without retraining it each time.

Solution 1:

TensorFlow's low-level tf.saved_model API offers a comprehensive way to save and load models in the SavedModel format, which serializes the computation graph into a saved_model.pb file alongside a variables/ directory holding the trained weights. It also gives you explicit control over how the model is saved and loaded. Let's examine the code to see how it works.

Input

import tensorflow as tf
import numpy as np

# Define and train a sample model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, input_shape=(1,), activation='relu'),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mean_squared_error')

# Inputs and targets must have shape (num_samples, 1) to match input_shape=(1,)
x_train = np.array([[1.0], [2.0], [3.0]])
y_train = np.array([[2.0], [4.0], [6.0]])
model.fit(x_train, y_train, epochs=100)

# Save the model in the SavedModel format
# (this writes saved_model.pb plus a variables/ directory under path_to_dir)
path_to_dir = "./saved_model/"
tf.saved_model.save(model, path_to_dir)

# Load the SavedModel back into Python
loaded_model = tf.saved_model.load(path_to_dir)

# Make predictions with the loaded model; the input must be a tensor
# matching the traced signature: shape (batch_size, 1), dtype float32
predictions = loaded_model(tf.constant([[4.0], [5.0], [6.0]]))
print("Predictions:", predictions)

Output

Predictions: tf.Tensor(
[[ 7.9860654]
 [ 9.991425 ]
 [12.       ]], shape=(3, 1), dtype=float32)

(The exact values will vary slightly between runs.)

This approach provides fine-grained control over the saving and loading process. The output directory contains saved_model.pb (the serialized graph and signatures) together with a variables/ subdirectory holding the trained weights, which is exactly the layout TensorFlow Serving expects.
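The same low-level API works for any trackable object, not just Keras models. As a minimal sketch (the Doubler class and the directory name are illustrative, not part of any TensorFlow API), a bare tf.Module with an explicitly traced input signature makes it easy to see exactly what ends up in saved_model.pb:

```python
import tensorflow as tf

# A minimal trackable object with one traced serving function
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None, 1], dtype=tf.float32)])
    def __call__(self, x):
        return 2.0 * x

# Saving writes saved_model.pb (graph + signatures) into the directory
tf.saved_model.save(Doubler(), "./module_demo/")

# Load it back and call it through the exported serving signature
loaded = tf.saved_model.load("./module_demo/")
infer = loaded.signatures["serving_default"]
result = infer(tf.constant([[4.0]]))
print(result)  # a dict of output tensors; here the value is [[8.]]
```

Because __call__ is a tf.function with an input_signature, tf.saved_model.save exports it as the default serving signature, which is what TensorFlow Serving invokes.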

Solution 2:

High-Level tf.keras.Model API: when working with Keras models, TensorFlow offers a high-level way to save and load them. Keras models can be saved and loaded with one-line calls, which simplifies the process considerably.

Input

import tensorflow as tf
import numpy as np

# Define and train a sample Keras model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, input_shape=(1,), activation='relu'),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mean_squared_error')

# Inputs and targets must have shape (num_samples, 1) to match input_shape=(1,)
x_train = np.array([[1.0], [2.0], [3.0]])
y_train = np.array([[2.0], [4.0], [6.0]])
model.fit(x_train, y_train, epochs=100)

# Save the Keras model in HDF5 format
# (note: a .h5 file is a single HDF5 file, not the .pb-based SavedModel)
model.save("keras_model.h5")

# Load the Keras model
loaded_model = tf.keras.models.load_model("keras_model.h5")

# Make predictions with the loaded model
predictions = loaded_model.predict(np.array([[4.0], [5.0], [6.0]]))
print("Predictions:", predictions)

Output

Predictions: [[7.9860654]
 [9.991425 ]
 [12.       ]]

(The exact values will vary slightly between runs.)

This method is straightforward and convenient when working with Keras models.
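One caveat: the .h5 extension above selects Keras's HDF5 format, which stores everything in a single file and produces no .pb at all. To get a .pb-based SavedModel from the Keras API, export to a directory instead. A minimal sketch (the directory name is arbitrary, and which call is available depends on your TensorFlow/Keras version):

```python
import os
import tensorflow as tf

# tf.keras.Input works as the first layer in both Keras 2 and Keras 3
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1)
])

if hasattr(model, "export"):
    # Newer Keras (2.13+ and Keras 3): export() writes an
    # inference-only SavedModel for serving
    model.export("keras_savedmodel")
else:
    # Older tf.keras: a path with no .h5/.keras suffix
    # selects the SavedModel format automatically
    model.save("keras_savedmodel")

# The directory now contains saved_model.pb plus a variables/ subdirectory
print(sorted(os.listdir("keras_savedmodel")))
```

Either way, the resulting directory can be loaded with tf.saved_model.load or pointed at directly by TensorFlow Serving.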

Saving and loading models is a crucial step in the machine learning workflow, especially when deploying models for inference in real-world settings. The SavedModel format stores the graph in a saved_model.pb file, works well for deployment, and is what TensorFlow Serving consumes. By following the steps in this blog, you can save your machine-learning model in .pb format and load it whenever you need to make predictions, ensuring the model remains accessible and useful after training.

Thank you for reading the article.