TensorFlow Model Exporting and Serving

Alex Egg,

TensorFlow has some nice patterns for the reproducibility and productionalization of your models. Namely, the ubiquity of the TensorFlow runtime (Python, Java, C, Swift, etc.) and a common serialization format (SavedModel) allow you to write portable code for industry.

In this post I will document the end-to-end (E2E) process of training, exporting, and serving a TensorFlow model. My plan is to train a model in Python and serve it in Java.

Train

This model embeds each input sentence with a pre-trained TF Hub module and trains a linear classifier on top of the embeddings.

import tensorflow as tf
import tensorflow_hub as hub

# Pre-trained sentence-embedding module from TF Hub
module_url = "https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1"

# Wrap the module as a feature column that maps the raw "sentence"
# string feature to its 128-dim embedding
embedded_text_feature_column = hub.text_embedding_column(
    key="sentence", module_spec=module_url)

estimator = tf.estimator.LinearClassifier(
    feature_columns=[embedded_text_feature_column])
estimator.train(...)

hub.text_embedding_column wraps the hub module as a feature column, so the Estimator manages its own graph and session; no explicit tf.Graph() context is needed.
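estimator.train still needs an input_fn. Here is a minimal sketch with a hypothetical in-memory dataset (sentences, labels, and train_input_fn are illustrative names, not part of the original code):

import numpy as np

# Hypothetical two-example dataset, purely illustrative
sentences = np.array(["what a great movie", "terrible acting"])
labels = np.array([1, 0])

# Feeds the "sentence" key expected by the feature column above
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    {"sentence": sentences}, labels,
    batch_size=2, num_epochs=None, shuffle=True)

estimator.train(input_fn=train_input_fn, steps=1000)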

Export

Protocol Buffers

JSON is a popular interchange format nowadays, but another robust option is Protocol Buffers. The exported SavedModel will accept serialized tf.Example protocol buffers at serving time; the serving_input_receiver_fn below tells TensorFlow how to parse them back into the feature columns the model expects (a sketch of building such a proto follows the export code).

# Parse spec derived from the same feature columns used in training
feature_spec = tf.feature_column.make_parse_example_spec([embedded_text_feature_column])
# Receiver that parses serialized tf.Example protos at serving time
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

# Writes a timestamped SavedModel directory under ./exports
estimator.export_saved_model("./exports", serving_input_receiver_fn)
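To make the format concrete, here is what building and serializing one of these tf.Example protos looks like from Python (the "sentence" key matches the feature column above):

# Build and serialize a tf.Example carrying a single sentence feature
example = tf.train.Example(features=tf.train.Features(feature={
    "sentence": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b"what a great movie"]))
}))
serialized = example.SerializeToString()  # the bytes the SavedModel ingests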

Test

You can inspect the exported signature with saved_model_cli:

$ saved_model_cli show --dir 1554747715/ --tag_set serve --signature_def serving_default
The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 2)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify

You can also smoke-test the export from the CLI with saved_model_cli run; a sketch, assuming the "sentence" feature key from the training code:

$ saved_model_cli run --dir 1554747715/ --tag_set serve --signature_def serving_default \
    --input_examples 'inputs=[{"sentence":["what a great movie"]}]'

Serve

Python
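
With the export on disk, a simple way to run it from Python in TF 1.x is tf.contrib.predictor. A minimal sketch, assuming the timestamped export directory from above and the serialized example from the Export section:

from tensorflow.contrib import predictor

# Load the SavedModel's serving_default signature
predict_fn = predictor.from_saved_model("exports/1554747715")

# The "inputs" key takes serialized tf.Example protos
result = predict_fn({"inputs": [serialized]})
print(result["scores"])  # class probabilities, shape (1, 2)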

Java
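
On the Java side, the org.tensorflow bindings can load the same directory with SavedModelBundle (the tf.Example classes ship in the companion proto artifact). A sketch under the assumption that the export directory and the tensor names from the saved_model_cli output above are unchanged:

import com.google.protobuf.ByteString;
import java.util.List;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;
import org.tensorflow.Tensors;
import org.tensorflow.example.BytesList;
import org.tensorflow.example.Example;
import org.tensorflow.example.Feature;
import org.tensorflow.example.Features;

public class Serve {
  public static void main(String[] args) {
    // Tags must match the export ("serve" is the default for Estimator exports)
    try (SavedModelBundle model = SavedModelBundle.load("exports/1554747715", "serve")) {
      // Build the same tf.Example proto we built in Python
      Example example = Example.newBuilder()
          .setFeatures(Features.newBuilder()
              .putFeature("sentence", Feature.newBuilder()
                  .setBytesList(BytesList.newBuilder()
                      .addValue(ByteString.copyFromUtf8("what a great movie")))
                  .build()))
          .build();

      try (Tensor<String> input = Tensors.create(new byte[][] {example.toByteArray()})) {
        // Feed/fetch names come straight from the saved_model_cli output
        List<Tensor<?>> outputs = model.session().runner()
            .feed("input_example_tensor", input)
            .fetch("dnn/head/predictions/probabilities")
            .run();
        try (Tensor<?> scores = outputs.get(0)) {
          float[][] probs = scores.copyTo(new float[1][2]);
          System.out.println(java.util.Arrays.toString(probs[0]));
        }
      }
    }
  }
}

Note that the Java runner feeds and fetches by graph tensor/operation name rather than by signature key, which is why the names surfaced by saved_model_cli matter here.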
