
Incorrect Use Of .Net Garbage Collector

Photo: dustbin on a misty street (Unsplash)

I received a Pull Request a few days ago which included the following two lines, repeated in two methods of a class.

GC.Collect();
GC.WaitForPendingFinalizers();

Now if ever a couple of lines would make me sit up and take notice it would be these two.

Some context. The code I was reviewing was a console application which was to run as a web job on Azure. The aim of the job was to join two sets of data and pick out the largest value for each month. The results were then persisted back to the database.
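
In LINQ terms, the requirement boils down to something like the sketch below. This is purely illustrative; the readings and meters collections and their properties are assumptions, not the real domain model.

// Illustrative only: join readings to meters and take the largest value
// per meter per month. Names and shapes are assumed, not the real code.
var maxValuePerMonth = readings
    .Join(meters,
          reading => reading.MeterId,
          meter => meter.Id,
          (reading, meter) => new { meter.Id, reading.Date, reading.Value })
    .GroupBy(x => new { x.Id, x.Date.Year, x.Date.Month })
    .Select(g => new
    {
        MeterId = g.Key.Id,
        g.Key.Year,
        g.Key.Month,
        MaxValue = g.Max(x => x.Value)
    })
    .ToList();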

I started to read through the code. Most of it was boilerplate for how we build web jobs. The crux of the whole application was in a service processing class. When I got to the bulk of the service class I found something like:

var dataList = GetDataToBeProcessed();
while (dataList.Count > 0)
{
  var subDataList = dataList.Take(500);
  RunCalculations(db, subDataList);

  // some other code .....

  GC.Collect();
  GC.WaitForPendingFinalizers();
}

I peeked into RunCalculations() and found:

var entity = _unitOfWork.GetRepository<Entity>().GetByKey(model.EntityId);
var fromDate = new DateTime(model.Year, model.Month, 1);
var toDate = new DateTime(model.Year, model.Month, DateTime.DaysInMonth(model.Year, model.Month));
model.ValueToBeSet = entity.CalculateValue(fromDate, toDate, _unitOfWork).Value;
GC.Collect();
GC.WaitForPendingFinalizers();

(As an aside, I hate the use of nullable fields in all but a very select set of circumstances. I will write that set of circumstances down one day.)

I decided to run the application and see what it would do. The logic was not far off the spec, but I wanted to see how fast it would process our data. Locally it would be touching about 300,000 records. So I set it off. I waited, read some emails and realized it had still not finished. TL;DR: I left it for an hour. One of the things that I like to add in Web Jobs is a lot of console writing/logging. It really helps debug issues once on Azure. There was none in this initial version of the application, so I was wondering if it was even still running.
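
For reference, the kind of progress logging I mean is nothing clever, just something along these lines (a hypothetical sketch; the Process call and batch size are made up):

var processed = 0;
foreach (var item in dataList)
{
  Process(item);
  processed++;

  // A line every 500 records is enough to show the job is alive in the Azure log.
  if (processed % 500 == 0)
    Console.WriteLine($"Processed {processed} of {dataList.Count} records.");
}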

I killed off the app, re-read the code and instantly saw one of the issues. Inside the loop in the RunCalculations() method, the developer was using a UnitOfWork class to pull back an Entity record on each iteration. That meant a database call for every single record processed.
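
The usual fix for this sort of thing is to pull the entities back in one query and look them up in memory inside the loop, roughly like the sketch below. The names here are assumptions, and GetAll() on the repository is assumed for illustration; the point is a single round trip instead of one per record.

// Sketch: one query up front, then an in-memory lookup per iteration.
var entityIds = dataList.Select(m => m.EntityId).Distinct().ToList();
var entitiesById = _unitOfWork.GetRepository<Entity>()
    .GetAll()
    .Where(e => entityIds.Contains(e.Id))
    .ToDictionary(e => e.Id);

foreach (var model in dataList)
{
  var entity = entitiesById[model.EntityId];
  // ... run the calculation for this model using the cached entity ...
}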

It was at this point I got the developer to look at the code with me. I started by asking them to walk me through the code. I wanted to get an idea of how they saw the code and how it executed. When they were finished I asked how long it took to run on their machine.

“Oh, the first time it will take a long time as there is a lot of data to process. I had problems running it on my machine so I decided to add code to make it process faster. The GC calls free up memory and the Take command batches the processing into smaller pieces.”

Now I had an insight into why the code was written in such a way. After discussing the approach they had taken, I asked the developer if they knew how many database calls their code made. They did not. Out came SQL Profiler and we ran the job again on my machine. They saw the issue straight away.

The number of database calls was only one of many issues. The query used to return the bulk of the data was not filtering on any criteria. They were bringing back all of the data in the database. They had added logic to entity classes (which by choice we keep anaemic).
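
The missing filtering, in particular, belonged in the query itself. Conceptually, what was needed was something along these lines; the ProcessingData type, the Processed flag and the repository's GetAll() method are assumptions for illustration, not the actual schema:

// Sketch only: filter in the database rather than pulling every row back.
var pendingData = _unitOfWork.GetRepository<ProcessingData>()
    .GetAll()
    .Where(d => !d.Processed)
    .ToList();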

We spent a significant amount of time pair programming to refactor the service class. When doing exercises like this with a developer I lead with questions the whole way. The 5 Whys method comes in handy in this situation.

“Why do you have GC calls in your code?” - “Because the memory gets eaten up by the objects required for the calculations.” “Why do the calculations have so many objects?” - “Because the calculation needs data and logic.” etc …

After refactoring chunks of code as we progressed through the questions, our initial method ended up looking like:

var processingDataList = processingDataFactory.GetPendingDataToBeProcessed();
RunCalculationsOn(processingDataList);
var dataToImport = processingDataList.Except(_failedItems.Keys.ToList());
tableGateway.ImportData(dataToImport.ToList());

and the RunCalculations() Method ended up looking like:

Parallel.ForEach(processingDataList, (dataItemToProcess) =>
{
  try
  {
    var fromDate = new DateTime(dataItemToProcess.Year, dataItemToProcess.Month, 1);
    var toDate = new DateTime(dataItemToProcess.Year, dataItemToProcess.Month, DateTime.DaysInMonth(dataItemToProcess.Year, dataItemToProcess.Month));

    using (var calculation = new Calculation())
    {
      var result = calculation.RunFor(dataItemToProcess.MeterId, fromDate, toDate);

      if (result.HasValue)
      {
        dataItemToProcess.MaxDemand = result.Value;
      }
      else
      {
        // The failure dictionary is shared across the parallel loop, so guard it.
        lock (_failedItems)
          _failedItems.Add(dataItemToProcess, new Exception("There was no data to be processed"));
      }
    }
  }
  catch (Exception ex)
  {
    lock (_failedItems)
      _failedItems.Add(dataItemToProcess, ex);
  }
});

The orchestration method in the processing service is now nice and clean and has all of its behaviour injected into it (injection omitted for brevity). It gets, processes and imports the data. Simple, clean logic. The RunCalculations() method is also much improved. The calculations have been removed from the entity and are wrapped in their own, disposable, class. With the parallel ForEach loop, the data needed for processing is accessed before the processing rather than during it, and there are no GC calls.
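
To give a flavour of that, the disposable calculation class has roughly the shape sketched below. This is an illustrative outline rather than the actual implementation; the DataContext and Readings names are placeholders.

// Rough shape of the Calculation wrapper: it owns its own data access and
// cleans up after itself via IDisposable. Names are placeholders.
public class Calculation : IDisposable
{
  private readonly DataContext _context = new DataContext();

  public decimal? RunFor(int meterId, DateTime fromDate, DateTime toDate)
  {
    var readings = _context.Readings
        .Where(r => r.MeterId == meterId && r.Date >= fromDate && r.Date <= toDate)
        .ToList();

    return readings.Any() ? readings.Max(r => r.Value) : (decimal?)null;
  }

  public void Dispose()
  {
    _context.Dispose();
  }
}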

With the improvements made a rerun of the application resulted in all records being processed and imported into the database in under 1 minute. The developer’s machine never once looked like it was struggling and there was no need for manual GC collection.

If you’d like to discuss any of my thoughts here then as always please contact me via twitter or email.