When to Force Garbage Collection

I've recently been working on a C#.NET batch image resampling application. The app takes a large list of files as input and attempts to resample all of the images to given dimensions as fast as possible. This was done using only managed WPF code (no GDI), so per the usual recommendations, all memory allocation and garbage collection should be left to the CLR.

However, the application very quickly climbed past 1.2 GB of memory usage and started throwing OutOfMemoryException. I double-checked the code, and every file being resampled appeared to be self-contained (i.e. no dangling references left around). The garbage collector was simply not freeing the memory fast enough between resampling operations.

So I tried putting a GC.Collect() at the end of each resample iteration. Memory usage dropped dramatically, fluctuating between roughly 150 MB and 300 MB for the lifetime of the application during any resample job, and the OutOfMemoryException errors disappeared.
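A minimal sketch of the pattern, with large byte arrays standing in for the per-image buffers the resampler allocates (the buffer size and loop count here are illustrative, not from the original app):

```csharp
using System;

class ResampleBatch
{
    static void Main()
    {
        // Simulate a batch job that allocates a large short-lived buffer
        // per item, as the image resampler does per file.
        for (int i = 0; i < 100; i++)
        {
            byte[] imageBuffer = new byte[16 * 1024 * 1024]; // ~16 MB "image"
            imageBuffer[0] = 1; // touch the buffer so memory is committed

            // ... resample work would happen here ...

            // Force a full collection at the end of each iteration so the
            // buffer is reclaimed before the next large allocation,
            // keeping the peak working set low.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        Console.WriteLine("done");
    }
}
```

Without the GC.Collect() call, each 16 MB buffer lands in the large object heap and may survive until the collector decides to run, which is what lets the working set balloon.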

I also tested whether this had a performance impact, but the before-and-after times were almost identical. And given that some of the resample jobs in the first implementation were throwing exceptions and aborting early, the forced-GC version was arguably faster in practice.
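A before-and-after timing comparison like this can be sketched with System.Diagnostics.Stopwatch (again with stand-in allocations rather than the real resampling work):

```csharp
using System;
using System.Diagnostics;

class GcTiming
{
    // Run a batch of large short-lived allocations, optionally forcing
    // a collection after each one.
    static void RunBatch(bool forceGc)
    {
        for (int i = 0; i < 50; i++)
        {
            byte[] buffer = new byte[16 * 1024 * 1024];
            buffer[0] = 1; // touch the buffer so memory is committed
            if (forceGc)
                GC.Collect();
        }
    }

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        RunBatch(forceGc: false);
        Console.WriteLine($"no forced GC: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        RunBatch(forceGc: true);
        Console.WriteLine($"forced GC:    {sw.ElapsedMilliseconds} ms");
    }
}
```

The absolute numbers will vary by machine and workload, but in allocation-heavy batch loops like this the two figures tend to be close, since the collector has little live data to trace each time it runs.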

So even though the recommendation is "never mess with the GC unless you're absolutely sure you know what you're doing", there are times where you do need to give it some thought. Especially for batch applications, or any application that runs a very grunty parallelised task.
