I am trying to serve the model over TensorFlow Serving and have created the signature below, but it doesn't seem to work. Please help me @pskrunner14
```python
# Training data placeholders
encode_seqs = tf.placeholder(dtype=tf.int64, shape=[batch_size, None], name="encode_seqs")
decode_seqs = tf.placeholder(dtype=tf.int64, shape=[batch_size, None], name="decode_seqs")

# Inference data placeholders
encode_seqs2 = tf.placeholder(dtype=tf.int64, shape=[1, None], name="encode_seqs")
decode_seqs2 = tf.placeholder(dtype=tf.int64, shape=[1, None], name="decode_seqs")

export_path_base = './export_base/'
export_path = os.path.join(
    tf.compat.as_bytes(export_path_base),
    tf.compat.as_bytes(str(1)))
print('Exporting trained model to', export_path)

builder = tf.saved_model.builder.SavedModelBuilder(export_path)

classification_inputs = tf.saved_model.utils.build_tensor_info(encode_seqs)
classification_outputs_classes = tf.saved_model.utils.build_tensor_info(decode_seqs)
# classification_outputs_scores = tf.saved_model.utils.build_tensor_info(loss)

classification_signature = (
    tf.saved_model.signature_def_utils.build_signature_def(
        inputs={
            tf.saved_model.signature_constants.CLASSIFY_INPUTS:
                classification_inputs
        },
        outputs={
            tf.saved_model.signature_constants.CLASSIFY_OUTPUT_CLASSES:
                classification_outputs_classes,
            # tf.saved_model.signature_constants.CLASSIFY_OUTPUT_SCORES:
            #     classification_outputs_scores
        },
        method_name=tf.saved_model.signature_constants.CLASSIFY_METHOD_NAME))

tensor_info_x = tf.saved_model.utils.build_tensor_info(encode_seqs2)
tensor_info_y = tf.saved_model.utils.build_tensor_info(decode_seqs2)

prediction_signature = (
    tf.saved_model.signature_def_utils.build_signature_def(
        inputs={'issue': tensor_info_x},
        outputs={'solution': tensor_info_y},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

builder.add_meta_graph_and_variables(
    sess, [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        'predict_solution': prediction_signature,
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            classification_signature,
    },
    main_op=tf.tables_initializer(),
    strip_default_attrs=True)

builder.save()
print('Done exporting!')
```
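Since the `predict_solution` signature takes DT_INT64 token ids rather than raw strings, any input text has to be tokenized to ids before it can be fed to `issue`. A minimal sketch, assuming a hypothetical word-to-id vocabulary (the real mapping would come from the training corpus):

```python
# Hypothetical vocabulary -- the real word-to-id mapping comes from training.
vocab = {"<unk>": 0, "this": 1, "is": 2, "the": 3, "text": 4}

def encode(sentence, vocab, unk_id=0):
    """Map a whitespace-tokenized sentence to a [1, None] batch of int ids."""
    return [[vocab.get(word, unk_id) for word in sentence.lower().split()]]

ids = encode("this is the text", vocab)  # [[1, 2, 3, 4]]
```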
The exported model shows the following signatures:
```
C:\Users\d074437\PycharmProjects\seq2seq>saved_model_cli show --dir ./export_base/1 --all

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict_solution']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['issue'] tensor_info:
        dtype: DT_INT64
        shape: (1, -1)
        name: encode_seqs_1:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['solution'] tensor_info:
        dtype: DT_INT64
        shape: (1, -1)
        name: decode_seqs_1:0
  Method name is: tensorflow/serving/predict

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_INT64
        shape: (32, -1)
        name: encode_seqs:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_INT64
        shape: (32, -1)
        name: decode_seqs:0
  Method name is: tensorflow/serving/classify
```
But when I try to run it, I get the error below:
```
C:\Users\d074437\PycharmProjects\seq2seq>saved_model_cli run --dir ./export_base --tag_set serve --signature_def predict_solution --inputs='this is the text'
usage: saved_model_cli [-h] [-v] {show,run,scan} ...
saved_model_cli: error: unrecognized arguments: is the text'
```