
Comments (6)

Johnson-yue commented on May 18, 2024

I'm using your new Docker image, and I'm very happy to see the update.

I am running the image tobegit3hub/simple_tensorflow_serving:latest-gpu.

When I load my model (resnet_50), it appears to load correctly and the serving logs show no errors (screenshots attached).

Then, following your demo, I generated the client file:
curl "http://localhost:8500/v1/models/resnet_50/gen/client?language=python" > client.py

The client.py file was generated.

But when I run python client.py, it takes a long time and produces no output. I checked my GPU memory usage and it does not increase, which suggests the model is not running on the GPU. Could you please check this and provide a GPU model for testing? Thank you.
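
For reference, this is roughly the kind of request the generated client sends (a minimal sketch assuming the JSON layout from the simple_tensorflow_serving README; the "images" input key and the 224x224x3 shape are placeholders that must match the model's actual signature):

# Minimal REST client sketch: POST a JSON payload to the serving endpoint.
# The input key and tensor shape below are placeholders, not the model's
# real signature.
import requests

endpoint = "http://localhost:8500"
payload = {
    "model_name": "resnet_50",
    "data": {
        "images": [[[[0.0] * 3] * 224] * 224],  # one dummy 224x224x3 image
    },
}

response = requests.post(endpoint, json=payload)
print(response.status_code)
print(response.text)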


tobegit3hub commented on May 18, 2024

No, the image itself is not the problem; you can use the GPU with tobegit3hub/simple_tensorflow_serving:latest-gpu.

The problem is most likely your docker command: to use GPUs inside a Docker container, the GPU devices and CUDA libraries have to be mounted. You can run simple_tensorflow_serving without Docker, or run the container with these parameters to mount the GPU devices:

export CUDA_SO=$(\ls /usr/lib/x86_64-linux-gnu/libcuda.* | xargs -I{} echo '-v {}:{}')
export DEVICES=$(\ls /dev/nvidia* | xargs -I{} echo '--device {}:{}')
docker run -it -p 8500:8500 $CUDA_SO $DEVICES tobegit3hub/simple_tensorflow_serving:latest-gpu


Johnson-yue commented on May 18, 2024

@tobegit3hub Maybe I forgot to mount the GPU devices. I will try this, thank you.


tobegit3hub commented on May 18, 2024

I have updated the README and tested it successfully in our environment.

You can put all the CUDA files in /usr/cuda_files/ and test with the commands below. Otherwise, set your own environment variables and make sure the Docker image can access the GPU and CUDA.

export CUDA_SO="-v /usr/cuda_files/:/usr/cuda_files/"
export DEVICES=$(\ls /dev/nvidia* | xargs -I{} echo '--device {}:{}')
export LIBRARY_ENV="-e LD_LIBRARY_PATH=/usr/local/cuda/extras/CUPTI/lib64:/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/cuda_files"

docker run -it -p 8500:8500 $CUDA_SO $DEVICES $LIBRARY_ENV tobegit3hub/simple_tensorflow_serving:latest-gpu
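
To verify that the container actually sees the GPU, you can run a quick check inside it (a sketch using the TF 1.x API that ships in the image):

# Sanity check inside the container: list the devices TensorFlow can see.
# If no "/device:GPU:0" entry shows up, the docker flags above did not
# expose the CUDA libraries or /dev/nvidia* devices correctly.
from tensorflow.python.client import device_lib

devices = device_lib.list_local_devices()
print([d.name for d in devices])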


Johnson-yue commented on May 18, 2024

@tobegit3hub I updated to your Docker image tobegit3hub/simple_tensorflow_serving:latest-gpu. It works, but there are some bugs:

  1. By default, the GPU serving allocates ALL of the GPU memory, which is terrible (see the "look here" link); luckily, someone has already fixed it (see the "fixed code" link). Can you add that config and rebuild the serving? By the way, that fix uses per_process_gpu_memory_fraction, but I think allow_growth=True would be better; I just do not know how to set it.

  2. I tested the half_plus_two model, the demo shipped with the tf-serving Docker image. It works fine in the official tf-serving:latest-gpu image, but it does not work in your latest-gpu image. My steps were:

    1. Copy saved_model_half_plus_two_gpu into /simple_tensorflow_serving/models/.
    2. The resulting layout:
       /simple_tensorflow_serving/models/
           saved_model_half_plus_two_gpu/
               00000123/
                   saved_model.pb
                   variables/
                       variables.data-00000-of-00001
                       variables.index
    3. Run simple_tensorflow_serving --model_base_path="./models/saved_model_half_plus_two_gpu/"
       ERROR: IOError: SavedModel file does not exist at: ./models/saved_model_half_plus_two_gpu/123/{saved_model.pbtxt|saved_model.pb}
       Why?

Do you require a specific model directory format? (screenshots attached)


tobegit3hub commented on May 18, 2024

Thanks for reporting. The actual path should be ./saved_model_half_plus_two_cpu/00000123, but the server seems to look up ./saved_model_half_plus_two_cpu/123 instead.

I will open another issue to track this.
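
For context, a hypothetical illustration of the mismatch (not the project's actual code): converting the version directory name to an integer and back drops the leading zeros, so 00000123 turns into 123.

# Hypothetical sketch of the version-lookup pitfall; the paths come from the
# report above, the logic is illustrative only.
import os

model_base_path = "./models/saved_model_half_plus_two_gpu"
versions = os.listdir(model_base_path)                    # e.g. ["00000123"]
latest = max(int(v) for v in versions)                    # 123
broken = os.path.join(model_base_path, str(latest))       # ".../123" does not exist
correct = os.path.join(model_base_path, max(versions, key=int))  # ".../00000123"
print(broken, correct)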

To limit GPU memory usage, we don't need to rely on or recompile TensorFlow Serving. It is possible to add per_process_gpu_memory_fraction as a parameter of the TensorFlow Python server.
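
A minimal sketch of those session options (assuming a TF 1.x tf.Session is created when the SavedModel is loaded; the 0.5 fraction is only an example value):

# Constrain GPU memory for the session that loads the SavedModel.
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True                    # grow allocation on demand
config.gpu_options.per_process_gpu_memory_fraction = 0.5  # or cap at a fixed fraction
sess = tf.Session(config=config)

tf.saved_model.loader.load(
    sess, [tf.saved_model.tag_constants.SERVING],
    "./models/saved_model_half_plus_two_gpu/00000123")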

