
Comments (13)

pauldb89 commented on May 16, 2024

I replaced
from tensorflow_serving.example import mnist_input_data
with
from tensorflow.contrib.learn.python.learn.datasets import mnist as mnist_input_data
and it worked for me.
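
(For context, the contrib dataset module exposes a read_data_sets() helper much like the example's mnist_input_data, so on a TF 1.x install the swap is close to drop-in. A minimal sketch, assuming TF 1.x with tf.contrib available; the /tmp/mnist_data path and the one_hot flag are illustrative, not prescribed:)

    # Sketch for a TF 1.x install where tf.contrib is still available.
    from tensorflow.contrib.learn.python.learn.datasets import mnist as mnist_input_data

    # read_data_sets() downloads/loads MNIST into the given directory,
    # mirroring the example's mnist_input_data helper.
    data_sets = mnist_input_data.read_data_sets("/tmp/mnist_data", one_hot=True)
    print(data_sets.train.num_examples)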

fangweili commented on May 16, 2024

Short answer:

You should build the target with:
ubuntu@ip-172-31-5-123:~/serving$ bazel build tensorflow_serving/example:mnist_export
And run it with:
ubuntu@ip-172-31-5-123:~/serving$ bazel-bin/tensorflow_serving/example/mnist_export

Long answer:

TensorFlow Serving does not currently have a binary release, so you need to build it from source for development. For Bazel-based projects, the build outputs are laid out under bazel-bin/. For Python binary targets in particular, you have to run the generated shell script -- in this case mnist_export -- rather than invoking mnist_export.py directly. The script sets up the proper package/library search paths (so that the tensorflow_serving.example package can be found) before executing mnist_export.py.
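
To make that concrete, here is a rough Python sketch of what the generated wrapper effectively does before handing control to mnist_export.py. The .runfiles path below is an assumption about the default Bazel output layout; in practice you should just run the wrapper script rather than patching sys.path by hand.

    import os
    import sys

    # Hypothetical runfiles tree produced by `bazel build`; the exact layout
    # depends on your Bazel version and workspace name (assumption).
    RUNFILES = os.path.expanduser(
        "~/serving/bazel-bin/tensorflow_serving/example/mnist_export.runfiles")

    # The wrapper script effectively puts the runfiles tree on the module
    # search path so that `tensorflow_serving.example` resolves as a package.
    sys.path.insert(0, RUNFILES)

    from tensorflow_serving.example import mnist_input_data  # now importable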

ronilp commented on May 16, 2024

@pauldb89 how did you know that tensorflow.contrib.learn.python.learn.datasets should be used in place of tensorflow_serving.example?

fangweili commented on May 16, 2024

Closing... feel free to reopen if there are still questions.

coltsfan commented on May 16, 2024

Hello @fangweili: Thanks for the response. I understand your point at a high level, but I still don't know how to do that tactically. Can you elaborate on the steps I would need to take in order to import the tensorflow_serving module in raw Python? I apologize for asking such a trivial question -- for the novices amongst us, that documentation would be super useful.

fangweili commented on May 16, 2024

Do you want to just deploy the mnist example on AWS? Or are you trying to use Tensorflow Serving Exporter in your python code?

coltsfan commented on May 16, 2024

I want to use the TensorFlow Serving Exporter in my Python code.

vodp commented on May 16, 2024

Does this mean tensorflow_serving cannot be used from an interactive Python command line?

sukritiramesh commented on May 16, 2024

Hi @vodp, based on @fangweili's earlier response, I think that is correct. Given that there is currently no release, running export.py as-is would probably result in a failed import of the tensorflow_serving package.

suraj-vantigodi commented on May 16, 2024

@fangweili
I want to deploy a machine translation example on AWS.
Currently I have created a REST web service, but the problem is that the model and the vocabulary files get loaded on every hit, which is time-consuming. That is when I came across TensorFlow Serving.
Can you please give me some pointers on how to go ahead? The MNIST example is not that clear.

Also, when building the tests (bazel test tensorflow_serving/...) I get the output: Executed 0 out of 50 tests: 1 fails to build and 49 were skipped.

blusa commented on May 16, 2024

You can work around it by writing your export code within the tensorflow_serving source tree, adding it as a Bazel target (see the sketch below), and building again.
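
For example, a py_binary rule for your own export script could look roughly like this; the dep labels are guesses, so copy them from the existing mnist_export target in tensorflow_serving/example/BUILD in your checkout.

    # tensorflow_serving/example/BUILD (sketch; dep labels are assumptions)
    py_binary(
        name = "my_export",
        srcs = ["my_export.py"],
        deps = [
            "@org_tensorflow//tensorflow:tensorflow_py",
            "@org_tensorflow//tensorflow/contrib/session_bundle:exporter",
        ],
    )

Then bazel build tensorflow_serving/example:my_export and run the generated bazel-bin/tensorflow_serving/example/my_export wrapper, as described earlier in the thread.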

kirilg commented on May 16, 2024

@suraj1990 Did you get a chance to go through https://tensorflow.github.io/serving/ and the architecture overview/tutorials? Can you elaborate on what part was unclear?

Did you have any local changes when running bazel test? The build is not broken, so everything should run. If the failure isn't caused by your local changes, please include more details about the failed build, including the error output (probably best to do that in a separate GitHub issue).

kirilg commented on May 16, 2024

Following up on the original question: the exporter (along with the rest of session_bundle) was moved to tensorflow/contrib in the main TensorFlow repo, so the exporter is now part of the regular release. That means you can either link it in directly and compile using Bazel (as mentioned before), or install TensorFlow via the pip package and import it directly in your Python program as you would any other TensorFlow Python code.
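
With the pip-installed package, the direct-import route looks roughly like the sketch below. It is modeled on the tf.contrib.session_bundle API of that era; the tiny softmax model and the /tmp/mnist_export path are placeholders, and exact keyword arguments may differ between versions.

    import tensorflow as tf
    from tensorflow.contrib.session_bundle import exporter

    # Tiny stand-in model; in practice x and y come from your own graph.
    x = tf.placeholder(tf.float32, shape=[None, 784], name="x")
    w = tf.Variable(tf.zeros([784, 10]))
    b = tf.Variable(tf.zeros([10]))
    y = tf.nn.softmax(tf.matmul(x, w) + b, name="y")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        # Exporter wraps a Saver and writes a versioned export directory
        # that the TensorFlow Serving model server can load.
        saver = tf.train.Saver(sharded=True)
        model_exporter = exporter.Exporter(saver)
        signature = exporter.classification_signature(
            input_tensor=x, scores_tensor=y)
        model_exporter.init(sess.graph.as_graph_def(),
                            default_graph_signature=signature)
        model_exporter.export("/tmp/mnist_export", tf.constant(1), sess)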
