Comments (6)
You can use this repository to reproduce it locally:
https://github.com/eddy2by4/django-testing
pip install -r requirements.txt
python manage.py makemigrations
python manage.py migrate
Add one new record to the Books table manually, with pgAdmin or another tool, then start the server:
granian --interface wsgi djangotesting.wsgi:application --workers 10
and run the wrk tests.
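The thread doesn't show the exact wrk script; a typical invocation matching the maintainer's runs later in the thread would look like this (the /books/ path is an assumption about the test app's URL config, not something stated in the issue):

```shell
# Hypothetical load test -- assumes the app serves the Books list at /books/.
# 2 client threads, 100 open connections, 10 seconds, matching the runs below.
wrk -t 2 -d 10s -c 100 http://localhost:8000/books/
```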
from granian.
From the readme:

the default number of blocking threads should work properly with the majority of applications; in synchronous protocols like WSGI this will also impact the number of concurrent requests you can handle, but you should use the backpressure configuration parameter to control it and set a lower number of blocking threads only if your application has a very low (1ms order) average response time;

Thus, you should set --backpressure to the maximum number of database connections (per worker). See also https://polar.sh/emmett-framework/posts/granian-1-4
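As a concrete worked example (the numbers below are chosen for illustration, not taken from the thread): since backpressure is a per-worker limit, you divide your total database connection budget by the worker count. With a budget of 40 connections and 4 workers, that gives --backpressure 10.

```python
# Hypothetical sizing helper: backpressure is per worker, so divide
# the total DB connection budget by the number of workers.
def backpressure_for(max_db_connections: int, workers: int) -> int:
    # Floor division keeps the total at or below the budget;
    # clamp at 1 so the server can still accept requests.
    return max(1, max_db_connections // workers)

print(backpressure_for(40, 4))    # 10
print(backpressure_for(100, 10))  # 10
```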
I am not sure what value I should use here.
I've tried:
--workers 2 --backpressure 2
I would expect this to be very limiting, with a maximum of 4 DB connections, but when I run the wrk script it still hits 100.
What would be a good backpressure value to get both performance and reliability, and to make sure DB connections don't go through the roof?
I've tried:
--workers 2 --backpressure 2
I would expect this to be very limiting and have a max of 4 db connections, but when I run the wrk script it still hits 100.
That's somewhat strange. Backpressure should work as you expected:
❯ granian --interface wsgi --workers 1 benchmarks.app.wsgi:app
[INFO] Websockets are not supported on WSGI
[INFO] Starting granian (main PID: 88229)
[INFO] Listening at: http://127.0.0.1:8000
[INFO] Spawning worker-1 with pid: 88231
[INFO] Started worker-1
[INFO] Started worker-1 runtime-1
❯ wrk -d 10s -c 100 http://localhost:8000/io10
Running 10s test @ http://localhost:8000/io10
2 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 12.70ms 0.92ms 20.30ms 71.78%
Req/Sec 3.95k 105.30 4.16k 76.62%
79020 requests in 10.10s, 10.32MB read
Requests/sec: 7821.30
Transfer/sec: 1.02MB
❯ granian --interface wsgi --workers 1 --backpressure 2 benchmarks.app.wsgi:app
[INFO] Websockets are not supported on WSGI
[INFO] Starting granian (main PID: 88332)
[INFO] Listening at: http://127.0.0.1:8000
[INFO] Spawning worker-1 with pid: 88334
[INFO] Started worker-1
[INFO] Started worker-1 runtime-1
❯ wrk -d 10s -c 100 http://localhost:8000/io10
Running 10s test @ http://localhost:8000/io10
2 threads and 100 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 12.58ms 695.12us 17.45ms 83.02%
Req/Sec 159.29 7.73 181.00 68.00%
1596 requests in 10.10s, 213.53KB read
Requests/sec: 158.04
Transfer/sec: 21.14KB
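The throughput drop between the two runs is what Little's law predicts: with backpressure 2, at most 2 requests are in flight per worker, and at ~12.6 ms average latency that caps throughput at roughly 2 / 0.01258 ≈ 159 req/s, matching the measured 158.04 req/s. A quick check, using the numbers copied from the wrk output above:

```python
# Little's law: throughput = concurrency / latency.
# With --backpressure 2 and one worker, at most 2 requests run concurrently.
concurrency = 2
avg_latency_s = 0.01258  # 12.58 ms, from the second wrk run above

expected_rps = concurrency / avg_latency_s
print(round(expected_rps))  # ~159, close to the measured 158.04 req/s
```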
The only other limit you can set is --blocking-threads (also per worker): that puts a hard cap on the number of active threads running Python code.
I'm not sure whether something is going on with the way the Django ORM uses connections; maybe it's opening new connections instead of reusing the previous ones.
Anyway, a good backpressure value (or blocking-threads, if the former doesn't work for you) for your use case would be max_db_connections_you_want / workers.
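One possible explanation for the 100 connections, offered here as an assumption rather than something confirmed in the thread: Django holds database connections per thread, and with a non-zero CONN_MAX_AGE each thread keeps its own persistent connection, so the connection count can scale with the server's blocking threads rather than with backpressure. With the default CONN_MAX_AGE = 0, Django opens and closes a connection per request instead. A settings sketch (database name and credentials are placeholders):

```python
# Hypothetical Django settings fragment -- names and credentials are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "djangotesting",
        "USER": "postgres",
        "PASSWORD": "change-me",
        "HOST": "127.0.0.1",
        "PORT": "5432",
        # 0 (the default) closes the connection at the end of each request;
        # a positive value keeps one persistent connection per thread,
        # which multiplies with the server's blocking threads.
        "CONN_MAX_AGE": 0,
    }
}
```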
@eddy2by4 do you have any updates on this? May I close it?
Closing this due to inactivity. Feel free to comment again @eddy2by4 and I might re-open this.