I am working on a project that uses GCD (Grand Central Dispatch) to compute a simulation. Everything is fine and fast, but there seems to be an unexplainable memory leak that gets larger the more queues I use. With 100 queues, for example, it takes only about 10 seconds until 1 GB is consumed, and it keeps growing without bound, eventually crashing the app, especially in 32-bit mode.
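For reference, my setup looks roughly like this (a simplified sketch, not my actual code: the queue labels, the `Counter` helper, and the per-step work are placeholders for the real simulation):

```swift
import Dispatch
import Foundation

// Thread-safe accumulator so the fan-out has a checkable result.
// This helper is only for the sketch; the real simulation aggregates
// its own state.
final class Accumulator {
    private let lock = NSLock()
    private(set) var total = 0.0
    func add(_ x: Double) { lock.lock(); total += x; lock.unlock() }
}

// One serial queue per partition of the work, as in the setup
// described above (100 queues here).
let queueCount = 100
let queues = (0..<queueCount).map { DispatchQueue(label: "sim.queue.\($0)") }

let group = DispatchGroup()
let acc = Accumulator()

for (i, q) in queues.enumerated() {
    q.async(group: group) {
        // Placeholder for one simulation step.
        acc.add(Double(i) * 2.0)
    }
}

// Block until every queue has finished its step.
group.wait()
print(acc.total)
```

Each iteration of the simulation dispatches one block per queue and waits on the group before starting the next round.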
Running the same code on Snow Leopard, there are no memory leaks. I also ported it to Windows using WinAPI threads: no leaks at all. I used Mavericks for a long time but recently switched to El Capitan, so it must be something in El Capitan.
When analyzing the application with the Instruments "Leaks" template, I see no leaks, and indeed the memory consumption does not rise higher than it should. Only when I run the application from Xcode, or launch it directly, do I see the memory grow. I have checked the same situation on a different computer, also running El Capitan: same thing, memory growth with no end.
Has anybody else experienced something like this lately? I have not found any reports on the web so far. Do you have any idea what could cause this?
More information: I tried running it in the default run loop of NSApplication, I tried creating my own run loop, and I tried manually draining all autorelease pools; nothing helped. I even implemented a barrier that halts and deletes all queues every five seconds or so, waits for a certain time, then recreates all the queues and continues the simulation. Nothing helps.
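The pool-draining variant I tried looks roughly like this (again a sketch with placeholder names and work; the point is that every dispatched block gets its own `autoreleasepool` so temporary objects cannot accumulate between steps):

```swift
import Dispatch
import Foundation

// Thread-safe counter, only so this sketch has a verifiable result.
final class Counter {
    private let lock = NSLock()
    private(set) var value = 0
    func increment() { lock.lock(); value += 1; lock.unlock() }
}

let queue = DispatchQueue(label: "sim.work")   // placeholder label
let group = DispatchGroup()
let steps = Counter()

for i in 0..<1000 {
    queue.async(group: group) {
        // Drain a fresh pool at the end of each block so any
        // autoreleased temporaries die with the block instead of
        // piling up across the whole run.
        autoreleasepool {
            _ = "step \(i)".uppercased()   // placeholder allocation
            steps.increment()
        }
    }
}

group.wait()
print(steps.value)
```

Even with this per-block draining in place, the resident memory keeps growing.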
Again, no such behaviour on other systems. Any advice is appreciated; I have been stuck on this particular problem for three days.