Hello.

I've noticed that after leaving the http router running for a few days (I'm 
using touch-reload for deployments now), the http plugin process starts 
accumulating memory:

F1 (16 days - 600MB):
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
www-data 27293  0.8  0.7 1186152 130872 ?      Sl   Jul05 199:41  |   |   \_ discoapi uWSGI master
www-data 27321 14.1  4.0 1950788 663660 ?      S    Jul05 3273:58  |   |       \_ discoapi uWSGI http 1
www-data  6824  0.2  0.4 1272812 71028 ?       Sl   Jul19   6:25  |   |       \_ discoapi uWSGI worker 1
www-data  7074  0.3  0.4 1288748 77804 ?       Sl   Jul19  10:37  |   |       \_ discoapi uWSGI worker 2
www-data  9253  0.4  0.4 1289016 78392 ?       Sl   Jul19  12:27  |   |       \_ discoapi uWSGI worker 3
www-data  9275  0.5  0.4 1278472 78040 ?       Sl   Jul19  16:41  |   |       \_ discoapi uWSGI worker 4
www-data  9283  0.8  0.7 1322772 122348 ?      Sl   Jul19  24:22  |   |       \_ discoapi uWSGI worker 5
www-data 10124  1.0  0.4 1285272 80572 ?       Sl   Jul19  30:01  |   |       \_ discoapi uWSGI worker 6
www-data 11736  1.3  0.4 1287684 81936 ?       Sl   Jul19  38:35  |   |       \_ discoapi uWSGI worker 7
www-data 11744  1.6  0.9 1339052 156132 ?      Sl   Jul19  48:15  |   |       \_ discoapi uWSGI worker 8
www-data 13489  2.0  0.5 1288216 87404 ?       Sl   Jul19  59:16  |   |       \_ discoapi uWSGI worker 9
www-data 13497  2.4  0.5 1287972 85704 ?       Sl   Jul19  71:13  |   |       \_ discoapi uWSGI worker 10
www-data 15103  2.6  0.6 1310832 99232 ?       Sl   Jul19  74:35  |   |       \_ discoapi uWSGI worker 11
www-data 15111  2.9  0.6 1319768 107160 ?      Sl   Jul19  83:44  |   |       \_ discoapi uWSGI worker 12
www-data 16770  3.0  0.5 1302012 88664 ?       Sl   Jul19  86:49  |   |       \_ discoapi uWSGI worker 13
www-data 16778  3.0  1.0 1349968 177212 ?      Sl   Jul19  87:00  |   |       \_ discoapi uWSGI worker 14
www-data 18477  2.9  0.5 1291152 88284 ?       Sl   Jul19  84:52  |   |       \_ discoapi uWSGI worker 15
www-data 18485  2.1  0.5 1293872 91248 ?       Sl   Jul19  62:25  |   |       \_ discoapi uWSGI worker 16
www-data 18493  1.9  0.8 1336144 141880 ?      Sl   Jul19  56:26  |   |       \_ discoapi uWSGI worker 17
www-data 18501  2.1  0.6 1289328 108196 ?      Sl   Jul19  62:18  |   |       \_ discoapi uWSGI worker 18
www-data 20136  3.5  0.6 1328328 109676 ?      Sl   Jul19 102:52  |   |       \_ discoapi uWSGI worker 19
www-data 21833  6.2  0.5 1314308 92056 ?       Sl   Jul19 177:43  |   |       \_ discoapi uWSGI worker 20

F2 (7 days - 340MB):
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
www-data 27272  0.7  0.7 1183716 120616 ?      Sl   Jun25 290:08  |   |   \_ discoapi uWSGI master
www-data 12915 14.6  2.0 1521208 338608 ?      R    Jul14 1465:13  |   |       \_ discoapi uWSGI http 1
www-data 16934  0.2  0.5 1288976 85068 ?       Sl   Jul18   9:25  |   |       \_ discoapi uWSGI worker 1
www-data 19714  0.2  0.4 1277972 78404 ?       Sl   Jul18  10:27  |   |       \_ discoapi uWSGI worker 2
www-data 19722  0.3  0.4 1279812 65972 ?       Sl   Jul18  16:09  |   |       \_ discoapi uWSGI worker 3
www-data 19730  0.5  0.5 1296468 83492 ?       Sl   Jul18  21:28  |   |       \_ discoapi uWSGI worker 4
www-data 19738  0.7  0.4 1288272 75824 ?       Sl   Jul18  29:45  |   |       \_ discoapi uWSGI worker 5
www-data 19746  0.9  0.7 1322764 117496 ?      Sl   Jul18  41:48  |   |       \_ discoapi uWSGI worker 6
www-data 19754  1.3  0.6 1302168 103944 ?      Sl   Jul18  57:49  |   |       \_ discoapi uWSGI worker 7
www-data 19762  1.7  0.5 1300020 88500 ?       Sl   Jul18  72:52  |   |       \_ discoapi uWSGI worker 8
www-data 19771  2.0  0.6 1295148 112832 ?      Sl   Jul18  84:05  |   |       \_ discoapi uWSGI worker 9
www-data 19779  2.6  0.5 1288864 84248 ?       Sl   Jul18 109:38  |   |       \_ discoapi uWSGI worker 10
www-data 19939  4.1  0.7 1390760 120680 ?      Sl   Jul18 172:13  |   |       \_ discoapi uWSGI worker 11
www-data 19947  3.2  1.3 1403500 223476 ?      Sl   Jul18 135:11  |   |       \_ discoapi uWSGI worker 12
www-data 19981  3.4  0.8 1301496 139096 ?      Sl   Jul18 144:25  |   |       \_ discoapi uWSGI worker 13
www-data 19989  3.6  0.5 1291188 90272 ?       Sl   Jul18 151:22  |   |       \_ discoapi uWSGI worker 14
www-data 20821  3.4  0.7 1316872 120308 ?      Sl   Jul18 143:23  |   |       \_ discoapi uWSGI worker 15
www-data 20829  2.9  0.6 1313148 105572 ?      Sl   Jul18 123:23  |   |       \_ discoapi uWSGI worker 16
www-data 21026  2.7  0.7 1318204 126016 ?      Sl   Jul18 115:33  |   |       \_ discoapi uWSGI worker 17
www-data 21093  3.5  0.6 1317000 104824 ?      Sl   Jul18 150:16  |   |       \_ discoapi uWSGI worker 18
www-data 23098  4.9  0.8 1387672 142284 ?      Sl   Jul18 204:54  |   |       \_ discoapi uWSGI worker 19
www-data 23106  8.5  1.1 1448512 183012 ?      Sl   Jul18 356:01  |   |       \_ discoapi uWSGI worker 20

F1 (11 days - 530MB):
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
www-data  9622  0.7  0.6 1186156 104588 ?      Sl   Jun25 298:42  |   |   \_ discoapi uWSGI master
www-data 31127 14.1  3.2 1758276 534912 ?      S    Jul10 2272:07  |   |       \_ discoapi uWSGI http 1
www-data 10655  0.2  0.4 1292632 81496 ?       Sl   Jul19   7:32  |   |       \_ discoapi uWSGI worker 1
www-data 10662  0.3  0.5 1288748 82788 ?       Sl   Jul19   9:47  |   |       \_ discoapi uWSGI worker 2
www-data 10671  0.3  0.7 1305216 115036 ?      Sl   Jul19  12:54  |   |       \_ discoapi uWSGI worker 3
www-data 10701  0.5  0.6 1307812 99656 ?       Sl   Jul19  18:04  |   |       \_ discoapi uWSGI worker 4
www-data 10760  0.8  0.6 1325400 109676 ?      Sl   Jul19  26:17  |   |       \_ discoapi uWSGI worker 5
www-data 10769  1.1  0.7 1289744 115516 ?      Sl   Jul19  37:54  |   |       \_ discoapi uWSGI worker 6
www-data 10777  1.4  0.7 1290828 124592 ?      Sl   Jul19  47:04  |   |       \_ discoapi uWSGI worker 7
www-data 10785  1.7  0.5 1319820 97748 ?       Sl   Jul19  58:26  |   |       \_ discoapi uWSGI worker 8
www-data 12370  2.2  0.8 1335368 139084 ?      Sl   Jul19  74:37  |   |       \_ discoapi uWSGI worker 9
www-data 12378  2.5  0.7 1312612 128296 ?      Sl   Jul19  83:12  |   |       \_ discoapi uWSGI worker 10
www-data 12386  2.8  0.8 1335696 135608 ?      Sl   Jul19  93:43  |   |       \_ discoapi uWSGI worker 11
www-data 12394  3.2  0.6 1293268 104124 ?      Sl   Jul19 106:20  |   |       \_ discoapi uWSGI worker 12
www-data 12402  3.4  0.6 1328784 107932 ?      Sl   Jul19 111:18  |   |       \_ discoapi uWSGI worker 13
www-data 12410  5.2  0.5 1364556 94812 ?       Sl   Jul19 170:06  |   |       \_ discoapi uWSGI worker 14
www-data 12418  3.4  0.5 1315304 98108 ?       Sl   Jul19 112:09  |   |       \_ discoapi uWSGI worker 15
www-data 14013  3.0  0.6 1309764 101700 ?      Sl   Jul19  99:04  |   |       \_ discoapi uWSGI worker 16
www-data 14021  2.9  0.7 1338004 129264 ?      Sl   Jul19  95:57  |   |       \_ discoapi uWSGI worker 17
www-data 15744  3.6  0.6 1306612 103396 ?      Sl   Jul19 119:01  |   |       \_ discoapi uWSGI worker 18
www-data 17334  5.7  0.7 1296188 129748 ?      Sl   Jul19 187:24  |   |       \_ discoapi uWSGI worker 19
www-data 17342  8.9  0.7 1334184 123340 ?      Sl   Jul19 289:46  |   |       \_ discoapi uWSGI worker 20


The amount of memory used seems roughly proportional to how long the process
has been running. I'm using uWSGI 1.9.12, compiled via pip, on Debian Wheezy.

The configuration is:
uwsgi:
    uid: www-data
    gid: www-data
    module: discoapi.wsgi
    pythonpath: xxx
    pythonpath: xxx
    chdir: xxx
    env: DJANGO_SETTINGS_MODULE=discoapi.settings
    processes: 20
    harakiri-verbose: 1
    auto-procname: 1
    no-orphans: 1
    vacuum: 1
    procname-prefix-spaced: discoapi
    master: 1
    listen: 2048
    virtualenv: xxx
    limit-post: 2147483648
    http-timeout: 60
    cache2: name=ssl,items=20000,blocksize=4096
    ssl-sessions-use-cache: ssl
    ssl-sessions-timeout: 300
    https-session-context: discoapi
    http-keepalive: 1
    http-auto-chunked: 1
    error-route-if: startswith:${uwsgi[status]};20 addheader:Connection: Keep-alive
    error-route-if-not: startswith:${uwsgi[status]};20 addheader:Connection: close
    gevent: 2000
    enable-threads: 1
    shared-socket: 0.0.0.0:80
    shared-socket: 0.0.0.0:443
    http: =0
    https: =1,xxx.crt,xxx.key,PREFERRED_CIPHER_SUITE:AES128-SHA:RC4:CAMELLIA128-SHA:!ADH:!aNULL:!eNULL:!NULL:!LOW:!SSLv2:!EXP
    buffer-size: 16384
    socket-timeout: 30
    carbon: xxx:2003
    carbon-id: discoapi
    threaded-logger: 1
    logger: syslog:discoapi
    reload-on-rss: 500
    evil-reload-on-rss: 1000
    touch-chain-reload: xxx
    worker-reload-mercy: 120
    lazy-apps: 1
    memory-report: 1
    shared-import: discoapi.uwsgi_init
    ignore-sigpipe: 1
    ignore-write-errors: 1
    logformat: {RSS: %(rssM)MB} [pid: %(pid)|tid: %(thread_id)|uid: %(userid)] %(addr) [%(ltime)] %(method) %(uri) => generated %(rsize) bytes in %(msecs) msecs (%(proto) %(status)) %(headers) headers in %(hsize) bytes (%(switches) switches on core %(core)) "%(uagent)"


Is the http plugin supposed to accumulate memory this way? The configured cache 
is on the master, right?
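As a sanity check on the cache size (assuming it is simply items × blocksize;
I'm ignoring whatever per-item metadata uWSGI adds on top):

```python
# cache2: name=ssl,items=20000,blocksize=4096
items = 20000
blocksize = 4096

total_bytes = items * blocksize
print(f"{total_bytes / 1024 ** 2:.1f} MiB")  # 78.1 MiB
```

Even if the cache lived in the http process, it would only account for about
80 MB, well below the hundreds of MB I'm seeing.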

What would be the best way to help debug this? Normally I would use valgrind,
but since the http plugin is forked by the master process I don't think that
will work…
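For now I'm just sampling the router's RSS from /proc to watch the growth
(rough sketch; the real PID would be the http process's, it defaults to the
current shell here only so the command runs as-is):

```shell
#!/bin/sh
# Sum the Rss of every mapping in /proc/<pid>/smaps; diffing two samples
# taken a few hours apart shows which regions are growing.
PID=${1:-$$}
awk '/^Rss:/ { total += $2 } END { printf "total RSS: %d kB\n", total }' "/proc/$PID/smaps"
```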

Best regards,
André

_______________________________________________
uWSGI mailing list
[email protected]
http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi
