- https://www.pyimagesearch.com/2018/02/05/deep-learning-production-keras-redis-flask-apache/
- https://medium.com/analytics-vidhya/deploy-machine-learning-models-with-keras-fastapi-redis-and-docker-4940df614ece
- https://shivamrana.me/2020/05/flask-prod/
- https://gabimelo.medium.com/developing-a-flask-api-in-a-docker-container-with-uwsgi-and-nginx-e089e43ed90e
Boilerplates with the following configurations for a Flask-based web server, with Redis used as the message broker in the backend and a simple Python-based model server:
1.) Flask development server
2.) uWSGI-based prod server (one of the fastest WSGI servers for Python) with nginx as the web server
3.) FastAPI-based prod server (the mainstream ASGI framework) served with Gunicorn
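For option 3, Gunicorn can drive FastAPI through Uvicorn's ASGI worker class. Below is a minimal sketch of a `gunicorn.conf.py`; the module path `app.main:app`, worker count, and timeout are assumptions, not values from these boilerplates.

```python
# gunicorn.conf.py -- hypothetical config for running FastAPI under Gunicorn.
# Assumes the FastAPI app object is importable as app.main:app.
bind = "0.0.0.0:8000"
workers = 4                                     # tune to available CPU cores
worker_class = "uvicorn.workers.UvicornWorker"  # ASGI worker so FastAPI runs async
timeout = 120                                   # allow slow model warm-up
```

It would then be launched with `gunicorn -c gunicorn.conf.py app.main:app`.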
run_web_server.py contains all our Flask web server code; nginx will load this when starting our deep learning web app. The backend model server will:
a.) Load our Keras model from disk
b.) Continually poll Redis for new images to classify
c.) Classify images (batch processing them for efficiency)
d.) Write the inference results back to Redis so they can be returned to the client via Flask
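Steps b) through d) above can be sketched as a polling loop that drains up to a batch's worth of jobs from the Redis list, classifies them in one model call, and writes each result back under its job id. This is a sketch under assumptions: `IMAGE_QUEUE`, `BATCH_SIZE`, and `model_predict` (standing in for the real Keras model's predict step) are hypothetical names, and the Redis client is passed in so the loop itself stays storage-agnostic.

```python
# Hypothetical sketch of the model server's batch-polling loop (steps b-d).
import base64
import json
import time

IMAGE_QUEUE = "image_queue"  # assumed queue name shared with the web server
BATCH_SIZE = 32              # assumed batch size

def poll_once(db, model_predict):
    """Pull up to BATCH_SIZE jobs off the queue, classify them as one batch,
    and write each result back under its job id. Returns jobs processed."""
    # b.) grab the next batch of pending jobs, then trim them off the list
    raw_jobs = db.lrange(IMAGE_QUEUE, 0, BATCH_SIZE - 1)
    db.ltrim(IMAGE_QUEUE, len(raw_jobs), -1)
    if not raw_jobs:
        return 0

    jobs = [json.loads(r) for r in raw_jobs]
    images = [base64.b64decode(j["image"]) for j in jobs]

    # c.) classify the whole batch in a single model call for efficiency
    predictions = model_predict(images)

    # d.) write each result back to Redis so Flask can return it to the client
    for job, pred in zip(jobs, predictions):
        db.set(job["id"], json.dumps(pred))
    return len(jobs)

def serve_forever(db, model_predict, idle_sleep=0.05):
    # a.) the real server would load the Keras model from disk before this loop
    while True:  # b.) continually poll Redis for new images
        if poll_once(db, model_predict) == 0:
            time.sleep(idle_sleep)
```

Batching here is the efficiency win: one `model_predict` call over 32 images amortizes framework overhead far better than 32 single-image calls.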