Build and Deploy a TensorFlow Model and Integrate It with a Flask Application from Scratch

Shravan C
2 min read · Jul 6, 2020

In this article, we will cover how to deploy a TensorFlow model with the TensorFlow model server, along with the related commands. I have extended one of my previous projects with new topics covering early stopping and model evaluation techniques. We will also build a Flask application with a form in it, to provide new data for making predictions.

Points that are covered:

  • Build a regression model
  • Train the model with an Early Stopping callback
  • Evaluate the model with the test dataset
  • Save the model
  • Save the stats of the dataset for normalization
  • Build a Flask application from scratch
  • Start a production-ready TensorFlow server to serve model predictions
  • Make predictions from the Flask app
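The training steps above can be sketched as follows. This is a minimal illustration, not the project's exact code: the layer sizes and toy data are assumptions, and only the feature count (6) is taken from the sample request later in the article.

```python
import numpy as np
import tensorflow as tf

def build_model(n_features: int) -> tf.keras.Model:
    # A small feed-forward regression model with a single continuous output.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_features,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Toy data standing in for the real dataset.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 6)).astype("float32")
y = (x @ rng.normal(size=(6, 1))).astype("float32")

model = build_model(n_features=6)

# Early Stopping: halt training once validation loss stops improving
# for `patience` epochs, and restore the best weights seen so far.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)

history = model.fit(x, y, validation_split=0.2, epochs=100,
                    callbacks=[early_stop], verbose=0)

# Evaluate on held-out data (here the same toy data for brevity;
# in the project this would be the test dataset).
loss, mae = model.evaluate(x, y, verbose=0)
```

Because of the callback, training may finish well before the 100-epoch limit.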

The complete video for the above steps is below:

Useful commands are listed below:

Deploying TensorFlow model:

# Saving the TensorFlow model
import tensorflow as tf

save_path = '/tmp/regression_model/1'
tf.keras.models.save_model(model, save_path)

# model_name: any name for your model
# model_base_path: location of the saved model; the server serves the
# highest-numbered version subdirectory (here /tmp/regression_model/1)
tensorflow_model_server \
  --rest_api_port="8501" \
  --model_name="regression_model" \
  --model_base_path="/tmp/regression_model/"
API request to the TensorFlow model server:

curl -X POST http://localhost:8501/v1/models/regression_model:predict -d '{"instances": [ [ 0.3, -1.4, 0.8, -0.1, -0.9, -0.3 ] ] }'
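The Flask side can be sketched like this: a single page with a form for the six feature values, which are normalized with the stats saved at training time and then POSTed to the TensorFlow model server's REST endpoint. The endpoint URL and model name come from the commands above; the stats values, form layout, and response shape (`{"predictions": [[value]]}`) are illustrative assumptions, not the project's exact code.

```python
import json

import requests
from flask import Flask, render_template_string, request

app = Flask(__name__)

# Per-feature mean/std saved at training time (illustrative values;
# in the project these come from the saved dataset stats).
STATS = {"mean": [0.0] * 6, "std": [1.0] * 6}

TF_SERVING_URL = "http://localhost:8501/v1/models/regression_model:predict"

# Minimal inline form template (a real app would use a template file).
FORM = """<form method="post">
{% for i in range(6) %}<input name="f{{ i }}" value="0">{% endfor %}
<button type="submit">Predict</button></form>
{% if prediction is not none %}<p>Prediction: {{ prediction }}</p>{% endif %}"""

def normalize(values, stats):
    # Apply the same normalization used during training.
    return [(v - m) / s for v, m, s in zip(values, stats["mean"], stats["std"])]

def predict(values, url=TF_SERVING_URL):
    # POST one normalized instance to the TF Serving REST API.
    payload = {"instances": [normalize(values, STATS)]}
    resp = requests.post(url, data=json.dumps(payload))
    return resp.json()["predictions"][0][0]

@app.route("/", methods=["GET", "POST"])
def index():
    prediction = None
    if request.method == "POST":
        values = [float(request.form[f"f{i}"]) for i in range(6)]
        prediction = predict(values)
    return render_template_string(FORM, prediction=prediction)
```

Run it with `flask run` (or `app.run()`) while the model server from the previous step is up, and submitting the form returns the model's prediction.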

The GitHub link for the project is here. The video is pretty lengthy, but it builds everything from scratch to help you understand the integration better. Hope it is useful. Enjoy coding!!!


Shravan C

Software Engineer | Machine Learning Enthusiast | Super interested in Deep Learning with Tensorflow | GCP