Dump file: %TEMP%\Microsoft.VisualStudio.Code.ServiceHost.DMP
Date: 2026-03-14
Workspace: C:\git\runtime (dotnet/runtime repository)
CLR version: .NET 11.0.26.10518 (CoreCLR)
GC mode: Server GC
The 19.3 GB dump is the result of unbounded accumulation of MSBuild project evaluation results in the C# Dev Kit project system service. The service retains ~1,371 copies of the full project evaluation state for each of the ~7,067 projects in the dotnet/runtime solution, totaling ~16 GB of managed heap consumed almost entirely in Gen2 (long-lived objects). The leak is in the Microsoft.VisualStudio.Server.Contracts / Microsoft.Build.Evaluation pipeline.
| Metric | Value |
|---|---|
| Total managed heap | 16.32 GB |
| Object count | 248,737,850 |
| GC segments | 4,176 |
| Gen2 (long-lived) | 15.36 GB (93.7%) |
| Gen1 | 0.68 GB |
| Gen0 | 0.18 GB |
| Large Object Heap | 0.16 GB |
| Free space | 146 MB |
Almost everything (93.7%) resides in Gen2, meaning the GC has promoted these objects as long-lived and is not collecting them. This is consistent with a reference leak, not a transient spike.
Three types appear at exactly 9,693,225 instances each, proving they are allocated as 1:1:1 tuples:

| Type | Count | Total Size |
|---|---|---|
| ProjectServiceRequestParameters | 9,693,225 | 698 MB |
| ProjectId | 9,693,225 | 465 MB |
| ProjectData | 9,693,225 | 310 MB |
Structure confirmed by field inspection:
- `ProjectData` has two fields: `ProjectId` + `ProjectRequestParameters`
- Every `ProjectServiceRequestParameters` has `ServiceName = "ProjectSystem.FAE04EC0_301F_11D3_BF4B_00C04F79EFBC"` (the C# project type GUID)
How many projects vs. evaluations?
- ~7,067 unique project structures exist (seen via distinct counts of per-project arrays like `ProjectProperty[]` and `ResolvedImport[]`)
- 9,693,225 / 7,067 = ~1,371 evaluation snapshots retained per project
The ProjectData[] arrays (1,676 arrays, avg size 46 KB = ~5,800 refs each) serve as the containers holding these accumulated evaluation results.
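A quick sanity check of these ratios (all constants below are figures quoted in this report; a 64-bit reference size of 8 bytes is assumed):

```python
# Sanity-check the report's arithmetic; every constant is a figure quoted above.
TUPLE_COUNT = 9_693_225      # ProjectData / ProjectId / ProjectServiceRequestParameters
PROJECT_COUNT = 7_067        # distinct project structures observed
ARRAY_COUNT = 1_676          # ProjectData[] container arrays
AVG_ARRAY_BYTES = 46 * 1024  # average ProjectData[] size (46 KB)
REF_BYTES = 8                # one object reference on 64-bit

snapshots_per_project = TUPLE_COUNT // PROJECT_COUNT
refs_per_array = AVG_ARRAY_BYTES // REF_BYTES
total_array_slots = ARRAY_COUNT * refs_per_array

print(snapshots_per_project)  # 1371 evaluation snapshots retained per project
print(refs_per_array)         # 5888 references per array (the ~5,800 above)
print(total_array_slots)      # 9,868,288 slots, consistent with 9,693,225 tuples
```

The array-slot total slightly exceeds the tuple count, as expected for arrays with spare capacity.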
| # | Count | Total Size | Avg Size | Type |
|---|---|---|---|---|
| 1 | 15,885,744 | 2,799 MB | 176 B | System.String |
| 2 | 9,693,225 | 698 MB | 72 B | ProjectServiceRequestParameters |
| 3 | 13,134,469 | 630 MB | 48 B | ProjectMetadata |
| 4 | 9,432,555 | 604 MB | 64 B | SortedInt32KeyNode<ImmutableDictionary<string,string>+HashBucket> |
| 5 | 8,079,366 | 582 MB | 72 B | ProjectItem |
| 6 | 7,162,188 | 573 MB | 80 B | ProjectItemInstance+TaskItem |
| 7 | 9,611,782 | 538 MB | 56 B | ProjectProperty+ProjectPropertyXmlBacked |
| 8 | 9,788,408 | 470 MB | 48 B | ProjectPropertyInstance+ProjectPropertyInstanceImmutable |
| 9 | 9,693,225 | 465 MB | 48 B | ProjectId |
| 10 | 5,490,446 | 462 MB | 84 B | System.Int32[] |
| 11 | 4,159,986 | 399 MB | 96 B | ReaderWriterLockSlim |
| 12 | 4,110,507 | 390 MB | 94 B | RetrievableEntryHashSet<ProjectMetadata>+Slot[] |
| 13 | 1,871,236 | 346 MB | 185 B | ProjectItem[] |
| 14 | 4,110,507 | 329 MB | 80 B | RetrievableValuedEntryHashSet<ProjectMetadata> |
| 15 | 9,693,225 | 310 MB | 32 B | ProjectData |
Positions 2-15 are all MSBuild/Project System types. The only generic .NET type in the top 15 is System.String, which consists largely of project paths and MSBuild property values backing the same leak.
The MSBuild evaluation pipeline creates a rich object graph per evaluation. With ~1,371 retained copies per project, the total is enormous:
| Category | Key Types | Estimated Total |
|---|---|---|
| Properties | ProjectProperty+XmlBacked, ProjectPropertyInstance+Immutable, PropertyDictionary, property slot arrays | ~1.8 GB |
| Items | ProjectItem, ProjectItemInstance, TaskItem, item arrays, MultiDictionary slots | ~2.1 GB |
| Metadata | ProjectMetadata, metadata hash sets, metadata slot arrays | ~1.5 GB |
| Request tuples | ProjectData, ProjectId, ProjectServiceRequestParameters | ~1.5 GB |
| Collections/infra | ImmutableDictionary, SortedInt32KeyNode, ReaderWriterLockSlim, List<> | ~2.0 GB |
| Strings (project paths, values) | System.String, System.String[] | ~3.1 GB |
| MEF/Composition | RuntimePartLifecycleTracker, ComposedLazy, display classes | ~0.5 GB |
| Other MSBuild | TaskRegistry, TargetSpecification, ProjectItemDefinition, imports | ~0.8 GB |
Total MSBuild/Project System leak: ~13.3 GB out of 16.3 GB managed heap (81.6%)
- The VS Code C# Dev Kit service host (`Microsoft.VisualStudio.Code.ServiceHost`) hosts the project system for the workspace.
- The workspace is dotnet/runtime -- a massive repository with ~7,067 MSBuild projects.
- On each project system event (file change, solution reload, configuration change, etc.), the service evaluates projects via MSBuild and creates a `ProjectData` tuple containing `ProjectId` + `ProjectServiceRequestParameters`.
- These evaluation snapshots accumulate in `ProjectData[]` arrays (1,676 arrays averaging ~5,800 entries each) and are never pruned. The 1,371 evaluations-per-project ratio suggests the service ran for an extended period with frequent re-evaluations.
- Each evaluation retains the full MSBuild state -- properties, items, metadata, task registrations, imports -- none of which is shared or deduplicated across evaluations.
- Gen2 dominance (93.7%) confirms this is not a transient burst. The GC promoted these objects through Gen0 and Gen1 because they survived multiple collections, but it can never collect them because they remain reachable from the accumulating arrays.
Large strings (>10 KB) total only ~7.2 MB -- not a significant factor. The top strings are MSBuild glob patterns like:
`C:\git\runtime\src\coreclr\nativeaot\System.Private.CoreLib\src\**\*.cs`
`C:\git\runtime\src\libraries\System.Security.Cryptography\src\**\*.cs`
The 2.8 GB of total string memory comes from millions of small-to-medium strings (project paths, property names/values, item metadata) that are part of the leaked evaluation state.
The leak is in the C# Dev Kit project system service (`Microsoft.VisualStudio.Server.Contracts` namespace), specifically in whatever component accumulates `ProjectData` instances. The pattern of:

- Exactly matching counts (9,693,225) across `ProjectData`, `ProjectId`, and `ProjectServiceRequestParameters`
- Growing `ProjectData[]` arrays
- All with the same C# project type GUID
...points to a list/cache of project evaluation results that grows without bound. The fix would be one of:

- Replace old evaluation results when a project is re-evaluated (keep only the latest per project)
- Implement an eviction policy on the cache
- Properly dispose/release MSBuild `Project` and `ProjectInstance` objects after extracting needed data
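As a minimal sketch of the first fix option (all class and method names here are hypothetical stand-ins, not the actual Dev Kit API), a keep-latest-per-project store bounds memory by project count instead of evaluation count:

```python
from dataclasses import dataclass

@dataclass
class EvaluationSnapshot:        # stand-in for ProjectData + its full MSBuild state
    project_id: str
    generation: int

class LeakyStore:
    """Appends every snapshot -- mirrors the observed growing ProjectData[] arrays."""
    def __init__(self):
        self.snapshots = []
    def on_evaluated(self, snap):
        self.snapshots.append(snap)           # never pruned: grows without bound

class LatestOnlyStore:
    """Keeps one snapshot per project -- the proposed fix."""
    def __init__(self):
        self.snapshots = {}
    def on_evaluated(self, snap):
        self.snapshots[snap.project_id] = snap  # replaces the stale evaluation

leaky, fixed = LeakyStore(), LatestOnlyStore()
for gen in range(1371):                       # ~1,371 re-evaluations observed
    for pid in ("ProjA", "ProjB"):            # 2 projects stand in for ~7,067
        snap = EvaluationSnapshot(pid, gen)
        leaky.on_evaluated(snap)
        fixed.on_evaluated(snap)

print(len(leaky.snapshots))   # 2742 -- scales with evaluations
print(len(fixed.snapshots))   # 2 -- bounded by project count
```

With the real project counts, the same change would shrink retention from ~9.7M tuples to ~7,067.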
- File a bug against C# Dev Kit (`ms-dotnettools.csdevkit`) with this analysis. The extension version in the dump is `2.13.9`.
- Workaround: Periodically restart the C# Dev Kit service host (`>C# Dev Kit: Restart Language Server` in the VS Code command palette) to reclaim memory.
- Workaround: For very large solutions like dotnet/runtime, consider using a `.slnf` (solution filter) to reduce the number of loaded projects.
- Monitor: The C# Dev Kit logs may show repeated "project evaluation" events that correlate with leak growth.
When running `dotnet-dump analyze <dump> -c "dumpheap -stat"` interactively via a PTY/piped session, the tool appeared completely unresponsive, with the user reporting "zero disk or CPU". However, non-interactive execution completed successfully in 65.6 seconds, producing 1.46 MB / 8,649 lines of output.
- Started `dotnet-dump analyze` in an interactive async terminal session
- Sent the `dumpheap -stat` command
- Captured the PID (49156) and monitored CPU/memory over time
- Used `dotnet-stack report -p 49156` to capture managed call stacks at two points in time
- Ran the same command non-interactively with file redirection
Sample 1 (during `dumpheap -stat` execution):

```
Interop+Kernel32.UnmapViewOfFile(int)                      ← page eviction
SafeHandle.InternalRelease(bool)
SafeHandle.Dispose()
MemoryMappedViewAccessor.Dispose(bool)
ArrayPoolBasedCacheEntry.GetPageDataAtOffset(uint64)       ← cache miss → evict + remap
CacheEntryBase`1.ReadPageDataFromOffset(...)
CachedMemoryReader.TryReadMemory(...)
ImageMappingMemoryService.ReadMemory(...)                  ← extra indirection layer
ClrHeap+<EnumerateObjects>d__83.MoveNext()                 ← heap walk iterator
HeapWithFilters+<EnumerateFilteredObjects>d__46.MoveNext()
DumpHeapService.PrintHeap(...)
DumpHeapCommand.Invoke()
```
Sample 2 (captured moments later):

```
Interop+Kernel32.ReadConsoleInput(...)                     ← back at the prompt, command finished!
ConsoleService.Start(...)
```
The non-interactive run (`dotnet-dump analyze <dump> -c "dumpheap -stat" > output.txt`) completed in 65.6 seconds, comparable to our custom ClrMD script (83 s). The command is not broken.
The `dumpheap -stat` command produces 1.46 MB of output (8,649 lines). In an interactive piped/PTY session:

- `DumpHeapService.PrintHeap()` writes results to stdout via `Console.Write`/`Console.Out`
- The pipe buffer between the terminal emulator and the process has limited capacity (typically 64 KB on Windows)
- Once the buffer fills, `Console.Write` blocks waiting for the consumer to drain the pipe
- If the terminal consumer reads slowly (or our `read_powershell` polling reads in batches), the process alternates between computing and blocking on I/O
- This makes the process appear idle (low CPU, no disk) during the I/O-blocked intervals
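The capacity limit behind this backpressure is easy to measure with a raw OS pipe. This Python sketch (POSIX semantics; the exact capacity is platform-dependent) fills a pipe until a writer would stall:

```python
import os

# Fill a pipe until the writer would block -- the same backpressure that made
# dotnet-dump appear hung while streaming 1.46 MB through a slow consumer.
r, w = os.pipe()
os.set_blocking(w, False)          # make a full pipe raise instead of blocking
written = 0
try:
    while True:
        written += os.write(w, b"x" * 4096)
except BlockingIOError:
    pass                           # pipe full: a blocking writer stalls here
finally:
    os.close(r)
    os.close(w)

print(f"pipe capacity before blocking: {written} bytes")
```

On a typical Linux system this prints 65536 bytes; a process producing 1.46 MB into such a pipe must wait for the reader to drain it more than 20 times.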
The stack trace reveals dotnet-dump's internal architecture for reading large dumps:

- `CachedMemoryReader` uses `ArrayPoolBasedCacheEntry` to cache dump pages in managed memory
- Each cache miss triggers `MemoryMappedViewAccessor` creation and `MapViewOfFile`/`UnmapViewOfFile` kernel calls
- For a 19.3 GB dump, the cache cannot hold everything; it was observed at a 4.4 GB working set
- On a cold file cache (first run after dump creation), every cache miss becomes a disk seek
- The `ImageMappingMemoryService` layer (from Microsoft.Diagnostics.DebugServices) adds an indirection that the raw ClrMD API doesn't have
- First run (user's experience): The 19.3 GB dump was cold on disk. Random-access reads through memory-mapped pages caused heavy disk I/O that may have appeared as "zero" in Task Manager if the I/O was spread across many small reads rather than sustained throughput
- Our later runs: After our ClrMD script sequentially read the entire dump, the OS page cache was warm, making subsequent runs dramatically faster
Our custom ClrMD script using `DataTarget.LoadDump()` + `heap.EnumerateObjects()` completed in 83 seconds on the first (relatively warm) run because:

- It uses ClrMD's `DataTarget.LoadDump()` directly, without the `ImageMappingMemoryService`/DebugServices middleware
- It writes output to the console only periodically (progress every 5M objects, summary at the end) rather than per-object
- It aggregates in memory and prints once, rather than streaming 8,649 lines through a pipe
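The aggregate-then-print pattern is simple to illustrate. This is a hedged Python sketch of the shape only (the actual script is C#/ClrMD; `heap_stats` and its input format are our invention): tally per-type statistics during the heap walk, then emit one sorted summary at the end instead of a line per result.

```python
from collections import Counter

def heap_stats(objects):
    """objects: iterable of (type_name, size_in_bytes), as a heap walk yields them.
    Returns (type_name, count, total_size) rows sorted by total size, descending --
    the dumpheap -stat style summary, computed entirely in memory."""
    counts, sizes = Counter(), Counter()
    for type_name, size in objects:           # single pass over the heap walk
        counts[type_name] += 1
        sizes[type_name] += size
    # one print-ready summary at the end, instead of streaming per-object output
    return [(name, counts[name], total) for name, total in sizes.most_common()]

# tiny stand-in for the 248M-object walk
sample = [("System.String", 176), ("ProjectData", 32), ("System.String", 200)]
for row in heap_stats(sample):
    print(row)
```

Because all intermediate state lives in two counters, the only console I/O is the final summary, so pipe backpressure never interleaves with the heap walk.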
dotnet-dump analyze is not fundamentally broken on this dump. The apparent hang was caused by:

- Pipe buffer backpressure: 1.46 MB of `dumpheap -stat` output blocking on a slow-draining console pipe
- Cold file cache: first access to a 19.3 GB dump with a random-access pattern
- Always use non-interactive mode with file redirection: `dotnet-dump analyze <dump> -c "dumpheap -stat" > output.txt`
- Warm the OS file cache first if possible (e.g., `Get-Content <dump> -ReadCount 0 > $null`)
- For programmatic analysis, use ClrMD (`Microsoft.Diagnostics.Runtime`) directly -- it avoids the DebugServices middleware overhead and gives full control over output
- Consider a dotnet-dump improvement: a progress indicator during `dumpheap` on large heaps would prevent the appearance of a hang (the command currently walks 248M objects silently)
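The cache-warming recommendation amounts to one sequential pass over the dump file. A hedged Python sketch of the same idea (function name ours; the PowerShell one-liner above does the equivalent), demonstrated on a small temp file standing in for the 19.3 GB dump:

```python
import tempfile

def warm_file_cache(path, chunk_size=4 * 1024 * 1024):
    """Read the whole file sequentially once so that later random-access
    reads (e.g. dotnet-dump's memory-mapped page faults) hit the OS page
    cache instead of the disk. Returns the number of bytes read."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):   # sequential reads populate the cache
            total += len(chunk)
    return total

# demo on a small temp file standing in for the dump
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"\0" * 123_456)
print(warm_file_cache(tmp.name))   # 123456
```

Sequential reads let the OS read-ahead machinery run at full disk throughput, which is why the warm second run was dramatically faster than the cold, seek-heavy first run.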