The problem is the following: there is a program that should iterate through a list of sites. Requests are implemented as follows:
```csharp
static string Get(string adr)
{
    string content = "";
    try
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://" + adr);
        //WebProxy prx = new WebProxy("127.0.0.1", 8888);
        req.UserAgent = "Mozilla/5.0 (Windows NT 6.3; rv:36.0) Gecko/20100101 Firefox/36.0";
        req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        req.Proxy = null;
        req.KeepAlive = false;
        req.Headers.Add("Accept-Language", "ru-RU,ru;q=0.9,en;q=0.8");
        req.AllowAutoRedirect = true;
        req.Timeout = 10000;
        req.ReadWriteTimeout = 10000;
        req.MaximumAutomaticRedirections = 1;
        req.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        req.Method = WebRequestMethods.Http.Get;

        using (HttpWebResponse response = (HttpWebResponse)req.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(stream, Encoding.Default))
        {
            content = System.Net.WebUtility.HtmlDecode(reader.ReadToEnd());
        }
    }
    catch
    {
        // The using blocks above already dispose everything. The original code
        // also called Close() here on response/stream/reader, which are null
        // whenever the request itself failed, so those calls could throw a
        // NullReferenceException of their own.
    }
    return content;
}
```

This function is simply called from worker threads, and the returned content is processed. Here are some screenshots from the dotMemory utility:

As you can see, the Heap Generation 2 area is full, in particular with instances of the Char[] and StringBuilder types. The problem is that I don't use StringBuilder or Char[] anywhere in my code; I only use String and List(). I suspect the code inside the HttpWebRequest class, but I don't know what else to think. The used RAM just keeps growing until the application hangs. I would be grateful for any help!
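One plausible explanation for those types showing up: `StreamReader.ReadToEnd()` accumulates the entire response in an internal `StringBuilder` backed by `char[]` buffers, so large pages produce large long-lived buffers even though the calling code never names those types. A minimal sketch of a bounded read that caps how much is buffered per response (the `ReadCapped` helper name and the 1 MB cap are my assumptions, not part of the original code):

```csharp
// Sketch: read at most maxChars characters instead of ReadToEnd(),
// so one huge page cannot allocate an unbounded StringBuilder/char[].
static string ReadCapped(StreamReader reader, int maxChars = 1 << 20)
{
    var sb = new StringBuilder();
    char[] buf = new char[8192];
    int read;
    while (sb.Length < maxChars && (read = reader.Read(buf, 0, buf.Length)) > 0)
    {
        // Append only up to the remaining budget.
        sb.Append(buf, 0, Math.Min(read, maxChars - sb.Length));
    }
    return sb.ToString();
}
```

Inside `Get`, the call would become `content = System.Net.WebUtility.HtmlDecode(ReadCapped(reader));`. This does not remove the allocations, but it bounds them, which is usually enough to keep them out of trouble if a few sites return very large bodies.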
reader.ReadToEnd(). Hmm, and what size is the stream? Maybe it really is very big? - VladD