
How do I serve a TensorFlow Hub module, specifically the Universal Sentence Encoder?


I've spent hours trying to set up TensorFlow Serving for the TensorFlow Hub module "universal-sentence-encoder". There is a similar question here:

How to make the TensorFlow Hub embeddings servable using TensorFlow Serving?

I've been doing this on a Windows machine.

Here is the code I used to build the model:

import tensorflow as tf
import tensorflow_hub as hub

MODEL_NAME = 'test'
VERSION = 1
SERVE_PATH = './models/{}/{}'.format(MODEL_NAME, VERSION)

with tf.Graph().as_default():
  # Load the Universal Sentence Encoder module from TF Hub
  module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder/1")
  text = tf.placeholder(tf.string, [None])  # batch of raw sentences
  embedding = module(text)                  # shape (-1, 512)

  init_op = tf.group([tf.global_variables_initializer(),
                      tf.tables_initializer()])
  with tf.Session() as session:
    session.run(init_op)
    # Export a SavedModel that TensorFlow Serving can load
    tf.saved_model.simple_save(
      session,
      SERVE_PATH,
      inputs={"text": text},
      outputs={"embedding": embedding},
      legacy_init_op=tf.tables_initializer(),
    )

I've gotten to the point of running the following line:

saved_model_cli show --dir ${PWD}/models/test/1 --tag_set serve --signature_def serving_default

which gives me the following result:

The given SavedModel SignatureDef contains the following input(s):
inputs['text'] tensor_info:
  dtype: DT_STRING
  shape: (-1)
  name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
 outputs['embedding'] tensor_info:
  dtype: DT_FLOAT
  shape: (-1, 512)
  name: module_apply_default/Encoder_en/hidden_layers/l2_normalize:0

I then tried running:

saved_model_cli run --dir ${PWD}/models/test/1 --tag_set serve --signature_def serving_default --input_exprs 'text=["what this is"]'

which gives the error:

  File "", line 1
[what this is]
         ^
SyntaxError: invalid syntax

I've tried changing the format of the 'text=["what this is"]' part, but nothing I tried has worked.
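Shell quoting is a common source of this SyntaxError, since Windows cmd and bash treat single quotes differently. One way to sidestep quoting entirely is to invoke the CLI from Python with an argument list, so no shell parsing ever happens. A minimal sketch, assuming saved_model_cli is on the PATH (the helper name build_cli_argv is my own):

```python
import subprocess

def build_cli_argv(model_dir, sentences):
    """Build the argv list for saved_model_cli; no shell quoting is involved."""
    input_expr = 'text={}'.format(sentences)  # e.g. text=['what this is']
    return [
        "saved_model_cli", "run",
        "--dir", model_dir,
        "--tag_set", "serve",
        "--signature_def", "serving_default",
        "--input_exprs", input_expr,
    ]

argv = build_cli_argv("./models/test/1", ["what this is"])
# subprocess.run(argv)  # runs the CLI directly, with no shell in between
```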

Whether or not that part works, the main goal is to get the module set up for serving and turn it into a callable API.

I've tried using Docker, with the following line:

docker run -p 8501:8501 --name tf-serve -v ${PWD}/models/:/models -t tensorflow/serving --model_base_path=/models/test

Things seem to be set up correctly:

Building single TensorFlow model file config:  model_name: model model_base_path: /models/test
2018-10-09 07:05:08.692140: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2018-10-09 07:05:08.692301: I tensorflow_serving/model_servers/server_core.cc:517]  (Re-)adding model: model
2018-10-09 07:05:08.798733: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: model version: 1}
2018-10-09 07:05:08.798841: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: model version: 1}
2018-10-09 07:05:08.798870: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: model version: 1}
2018-10-09 07:05:08.798904: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /models/test/1
2018-10-09 07:05:08.798947: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /models/test/1
2018-10-09 07:05:09.055822: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2018-10-09 07:05:09.338142: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-10-09 07:05:09.576751: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:162] Restoring SavedModel bundle.
2018-10-09 07:05:28.975611: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:138] Running MainOp with key saved_model_main_op on SavedModel bundle.
2018-10-09 07:06:30.941577: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:259] SavedModel load for tags { serve }; Status: success. Took 82120946 microseconds.
2018-10-09 07:06:30.990252: I tensorflow_serving/servables/tensorflow/saved_model_warmup.cc:83] No warmup data file found at /models/test/1/assets.extra/tf_serving_warmup_requests
2018-10-09 07:06:31.046262: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: model version: 1}
2018-10-09 07:06:31.184541: I tensorflow_serving/model_servers/server.cc:285] Running gRPC ModelServer at 0.0.0.0:8500 ...
[warn] getaddrinfo: address family for nodename not supported
2018-10-09 07:06:31.221644: I tensorflow_serving/model_servers/server.cc:301] Exporting HTTP/REST API at:localhost:8501 ...
[evhttp_server.cc : 235] RAW: Entering the event loop ...

I tried

curl http://localhost:8501/v1/models/test

which gives

{ "error": "Malformed request: GET /v1/models/test:predict" }

curl -d '{"text": "Hello"}' -X POST http://localhost:8501/v1/models/test:predict

which gives

{ "error": "JSON Parse error: Invalid value. at offset: 0" }

There is a similar question here:

TensorFlow Serving: REST API returns "Malformed request" error

I'm just looking for any way to get this module served. Thanks.



1> Joe Hidakats..:

I finally managed to figure it out. I'll post what I did here in case anyone else is trying to do the same thing.

The problem with my saved_model_cli run command was the quotes (I was using the Windows command prompt). Change 'text=["what this is"]' to "text=['what this is']".

The problem with the POST request was twofold. First, I noticed that the model's name is model (the default the tensorflow/serving Docker image uses when --model_name isn't set, as the "model_name: model" log line shows), so the URL should be http://localhost:8501/v1/models/model:predict

Second, the input format was wrong. I used Postman, and the body of the request looks like this: {"inputs": {"text": ["Hello"]}}
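The working Postman request can also be reproduced from Python with just the standard library, which again avoids any shell quoting issues with curl. A minimal sketch matching the answer's setup (host localhost:8501 and model name model from above; the helper name build_predict_request is my own):

```python
import json
import urllib.request

def build_predict_request(host, model_name, sentences):
    """Build a POST request for TensorFlow Serving's REST predict endpoint."""
    url = "http://{}/v1/models/{}:predict".format(host, model_name)
    body = json.dumps({"inputs": {"text": sentences}}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

req = build_predict_request("localhost:8501", "model", ["Hello"])
# with urllib.request.urlopen(req) as resp:  # requires the server to be running
#     result = json.loads(resp.read())
```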


Oh my. Thank you for answering your own question.