Fix race condition in MemoryCache.Compact() #12434
Conversation
Fixes dotnet#12430. Adds a ToArray() snapshot before OrderBy to prevent concurrent-modification exceptions while enumerating the ConcurrentDictionary.
```diff
 protected virtual void Compact()
 {
-    var kvps = _dict.OrderBy(x => x.Value.LastAccess).ToArray();
+    var kvps = _dict.ToArray().OrderBy(x => x.Value.LastAccess).ToArray();
```
Nice fix! It's a shame, though, that we now create multiple arrays; this chain of calls adds a lot of unnecessary allocations. The two ToArray() calls are also a little subtle, since the first is a direct call to the thread-safe ConcurrentDictionary<TKey, TValue>.ToArray() and the second is just LINQ overhead. I'll take a look at cleaning this up a bit.
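As a rough sketch of one lower-allocation cleanup (assuming `_dict` is declared as `ConcurrentDictionary<TKey, TValue>`, so `ToArray()` below is the thread-safe instance method):

```csharp
// One snapshot array from ConcurrentDictionary<TKey, TValue>.ToArray(),
// then sort it in place instead of chaining OrderBy(...).ToArray(),
// which allocates an extra buffer and a second result array.
var kvps = _dict.ToArray();
Array.Sort(kvps, static (x, y) => x.Value.LastAccess.CompareTo(y.Value.LastAccess));
```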
Taking a closer look, it seems this doesn't actually fix the bug -- it just makes it happen less often. The problem is that the _dict field is declared as IDictionary<TKey, TValue>. So, the first ToArray() call is actually a call to Enumerable.ToArray() rather than ConcurrentDictionary<TKey, TValue>.ToArray(). That means it still enumerates the dictionary while it might be modified -- it just does it sooner. Will fix!
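Because `ToArray()` is resolved at compile time from the receiver's declared type, the same call expression can mean two very different things. A small illustration (field and type names here are hypothetical, not from the PR):

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

class Example
{
    // Declared as the interface: ToArray() binds to the LINQ extension
    // method Enumerable.ToArray(), which enumerates the live dictionary
    // with no locking.
    private readonly IDictionary<int, string> _asInterface = new ConcurrentDictionary<int, string>();

    // Declared as the concrete type: ToArray() binds to the instance
    // method ConcurrentDictionary<TKey, TValue>.ToArray(), which takes
    // the internal locks and returns a consistent snapshot.
    private readonly ConcurrentDictionary<int, string> _asConcrete = new();

    void Demo()
    {
        var unsafeCopy = _asInterface.ToArray(); // Enumerable.ToArray()
        var safeCopy = _asConcrete.ToArray();    // thread-safe snapshot
    }
}
```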
While reviewing #12434, I noticed that the fix applied doesn't actually fix the race condition in `MemoryCache<TKey, TValue>.Compact()`. The reason is subtle! The fix was to avoid enumerating the `ConcurrentDictionary` while items could be added and removed by first calling `ConcurrentDictionary<TKey, TValue>.ToArray()`, which is thread-safe. However, the receiver of the call isn't a `ConcurrentDictionary<TKey, TValue>`; it's an `IDictionary<TKey, TValue>`, which doesn't offer a `ToArray()` method. So, the new call actually goes to `Enumerable.ToArray(...)`, and it's still enumerating the dictionary while items can be added and removed -- it's just doing it earlier than before.

To address this, I went ahead and implemented a comprehensive fix for the `MemoryCache<TKey, TValue>` thread-safety issues. I also updated the tests to use a test accessor instead of subclassing, which better encapsulates the `MemoryCache` implementation. Many thanks to copilot for helping with comments, documentation, and tests!

Here is a summary of the fixes and improvements made to `MemoryCache<TKey, TValue>`:

- Atomic remove-then-check pattern to eliminate race conditions in compaction
- Thread-safe last access time updates using volatile operations
- Improved test structure using test accessors instead of subclassing
- Enhanced documentation and comments
- Added test coverage for concurrent and edge scenarios
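To make the first bullet concrete, here is a sketch of what a remove-then-check compaction loop might look like -- my approximation of the pattern, not the actual PR code (`_cache` and `_sizeLimit` are hypothetical names):

```csharp
private void Compact()
{
    // Snapshot via the concrete type's thread-safe ToArray(), then order
    // entries from least to most recently accessed.
    var entries = _cache.ToArray().OrderBy(static kvp => kvp.Value.LastAccess);

    foreach (var (key, _) in entries)
    {
        // Consult the live count before each removal and stop once the
        // cache is back under its limit.
        if (_cache.Count <= _sizeLimit)
        {
            break;
        }

        // TryRemove is atomic: either this thread removes the entry or a
        // concurrent writer already did; there is no check-then-act
        // window on the entry itself.
        _cache.TryRemove(key, out _);
    }
}
```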
Summary
Fixes #12430
This PR resolves a race condition in `MemoryCache<TKey, TValue>.Compact()` that was causing intermittent test failures.

The Problem

The `Compact()` method was enumerating a `ConcurrentDictionary` directly while other threads could simultaneously modify it: `_dict.OrderBy(x => x.Value.LastAccess).ToArray()`. This violated thread-safety expectations and caused `ArgumentException` during concurrent operations.

The Solution

Add an initial `.ToArray()` call to create a snapshot before sorting: `_dict.ToArray().OrderBy(x => x.Value.LastAccess).ToArray()`. This ensures we operate on a stable snapshot rather than the live dictionary, preventing concurrent modifications from disrupting the LINQ enumeration.
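For context, the original race can be reproduced with a small stress loop (a hypothetical repro, not part of the PR; assumes a modern .NET console app with implicit usings). LINQ sizes its buffer from the dictionary's `Count` and then copies, so a concurrent writer can invalidate the buffer in between, surfacing as `ArgumentException`:

```csharp
using System.Collections.Concurrent;
using System.Linq;

var dict = new ConcurrentDictionary<int, int>();
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(2));

// Writer task: continuously grows and clears the dictionary.
var writer = Task.Run(() =>
{
    var i = 0;
    while (!cts.IsCancellationRequested)
    {
        dict[i++] = i;
        if (i % 1000 == 0)
        {
            dict.Clear();
        }
    }
});

while (!cts.IsCancellationRequested)
{
    try
    {
        // Enumerating the live dictionary through LINQ while it is being
        // mutated can throw ArgumentException.
        _ = dict.OrderBy(kvp => kvp.Value).ToArray();
    }
    catch (ArgumentException)
    {
        // Race observed.
        break;
    }
}

await writer;
```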
Impact
`Microsoft.CodeAnalysis.Razor.Utilities.MemoryCacheTest.ConcurrentSets_DoesNotThrow`