Hello.
We have run into a very strange memory leak while using the libvirt client library
in a long-running server application for cluster orchestration.
The leak does not seem to be directly related to libvirt's own code: it shows up
only under specific build options and/or a specific system environment (we are not
yet sure which).
Here are the key points:
1. The libvirt client leaks memory while making RPC calls to a server: the RSS
memory usage reported by ps, top, etc. grows indefinitely (please see the attached
ps.log). A test application is attached as well (leak.c).
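For context, ps.log was produced by periodically sampling the RSS of the test
application. The exact invocation below is an approximation, not the literal
command used; it monitors the current shell only so the snippet is self-contained
(substitute the test application's PID):

```shell
# Sample the resident set size (RSS, in kilobytes) of a process once
# per second. $$ (this shell's own PID) is used here purely for
# illustration; replace it with the PID of the running test app.
pid=$$
for i in 1 2 3; do
    ps -o rss= -p "$pid"   # bare RSS value in kB, one sample per line
    sleep 1
done
```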
2. The leak shows up on Debian and Ubuntu but is absent on Mac OS and Gentoo, so it
is almost certainly an environment or build problem.
3. Valgrind does not see the leak. From valgrind's point of view the application
consumes a constant ~110 KB of memory from start to finish (while ps reports
multiple megabytes) and contains no leaks.
4. Logging the activity of virMalloc, virRealloc, etc. does not show anything
suspicious: as expected, all allocated memory is correctly freed, so it is
definitely not a bug in the code (we verified that before realizing the problem is
distribution/platform specific).
Some useful logs and the test code are attached. I think digging into the build
options and swapping system libraries in and out will eventually pin the problem
down, but it would be nice if someone already has a working solution. Thank you for
your help.
Attachments:
- valgrind.log
(application/octet-stream — 75.1 KB)
- ps.log
(application/octet-stream — 2.8 KB)
- leak.c
(application/octet-stream — 348 bytes)