Clearing memory in PowerShell after completion

If the job runs too long, consider dividing the task into several shorter runs. Alternatively, you can try to trigger garbage collection, although there is no guarantee it will help. Keep in mind that you have little control over the collector; it may help to clear variables or remove references to the data beforehand.

Solution 1:

Presumably, the file you mentioned, campaigns.txt, is the one that is multi-gigabyte in size. If you are referring to other files, this assumption may not hold.

Your memory usage is likely high because you invoke Import-Csv inside parentheses and iterate over the result with a foreach statement; that forces every record into memory before the loop even starts. Instead, use a PowerShell pipeline to stream records from the file without keeping all of them in memory at once. To do this, replace the foreach statement with a ForEach-Object call:

  Import-Csv 'campaigns.txt' -Delimiter '|' | ForEach-Object {
      foreach ($id in ($_.CampaignID -split ' ')) {
          $campaignByID[$id] = $_.CampaignName
      }
  }
The performance of the .NET garbage collector is expected to improve significantly with this modification, especially in scenarios where most objects have a short lifespan. Consequently, there should also be a reduction in the amount of time spent winding down at the end.
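For contrast, the memory-heavy pattern being replaced looks roughly like this (a sketch; the file name, delimiter, and variable names are assumed from the snippet above):

  # The parenthesized Import-Csv call materializes the entire
  # multi-gigabyte file as objects in memory before the loop starts
  $campaignByID = @{}
  foreach ($row in (Import-Csv 'campaigns.txt' -Delimiter '|')) {
      foreach ($id in ($row.CampaignID -split ' ')) {
          $campaignByID[$id] = $row.CampaignName
      }
  }

With the pipeline version, each record becomes garbage as soon as ForEach-Object finishes with it, so objects die young and the collector can reclaim them cheaply.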

I would discourage calling [System.GC]::Collect() to force garbage collection, as the garbage collector is better equipped to decide when to run. The rationale is intricate; if you want to understand it thoroughly, Maoni's blog provides a wealth of information on garbage collection in the .NET ecosystem.

Solution 2:

You can attempt to prompt the garbage collector to run, although there is no guarantee it will free memory:

  [System.GC]::Collect()
Although you cannot exercise precise control over it, it may help to run Remove-Variable or assign $null to variables ahead of certain tasks. This removes any lingering references to the data so the collector is free to reclaim it.
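Putting those steps together, a minimal sketch (the variable name $campaignByID is assumed from the snippet in Solution 1):

  # Drop the reference first, so the collector has something to reclaim
  $campaignByID = $null
  # or, equivalently: Remove-Variable campaignByID

  # Then request a collection; the runtime decides the actual timing
  [System.GC]::Collect()
  [System.GC]::WaitForPendingFinalizers()

Note that without removing the references first, a forced collection cannot release the data, because the hashtable still keeps every record reachable.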
