C# "HttpWebRequest": The underlying connection was closed: A connection that was expected to be kept alive was closed by the server

Tags: c#, .net, http, web-crawler

I am trying to build a web crawler that needs to make two requests. The first request is a GET (to create the session) and the second is a POST (to submit a form). When I try to submit the form, I get the following error:

The underlying connection was closed: A connection that was expected to be kept alive was closed by the server.
I have already tried forcing TLS 1.2, setting keep-alive to false, and changing the timeout properties, but it still doesn't work. Stack trace:

   at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count)
   at System.Net.Security._SslStream.StartFrameHeader(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security._SslStream.StartReading(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security._SslStream.ProcessRead(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.TlsStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
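For reference, the workarounds mentioned above (forcing TLS 1.2 and disabling keep-alive) would look roughly like this; a minimal sketch of the attempts, not the fix that ultimately worked:

```csharp
using System.Net;

// Force TLS 1.2 for all outgoing HttpWebRequest calls (process-wide setting).
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

// Disabling the persistent connection makes each request open a fresh socket;
// set this on the request instance before sending it.
// request.KeepAlive = false;

// Neither of these resolved the error in this case.
```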

My request function:

public static string ObterHtmlPostTest(string url, string post, ref CookieContainer cookieContainer)
{
    Encoding encoding = Encoding.UTF8;
    byte[] byteArray = encoding.GetBytes(post);
    var request = (HttpWebRequest)WebRequest.Create(url);

    // Custom header collection (defined elsewhere) that skips header validation.
    NonValidatedWebHeader header = new NonValidatedWebHeader();
    header.Add("Accept-Language", "pt-BR,pt;q=0.9,en-US;q=0.8,en;q=0.7");
    request.Headers = header;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
    request.CookieContainer = cookieContainer;
    request.Method = "POST";
    request.Accept = "*/*";
    request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.103 Safari/537.36";
    request.Referer = url;
    request.AllowAutoRedirect = true;
    request.KeepAlive = true;
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = byteArray.Length;

    // Accept any server certificate (process-wide; avoid in production code).
    ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };

    // Write the form body; the using block disposes the request stream when done.
    using (Stream dataStream = request.GetRequestStream())
    {
        dataStream.Write(byteArray, 0, byteArray.Length);
    }

    using (var response = (HttpWebResponse)request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (var reader = new StreamReader(responseStream, encoding))
    {
        cookieContainer.Add(response.Cookies);
        return reader.ReadToEnd();
    }
}
Solution found:

The website I am trying to crawl uses HTTP 1.0 instead of 1.1. I had to add this to the code:

request.ProtocolVersion = HttpVersion.Version10;
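In context, the fix goes right after the request is created; a minimal sketch, with the URL being a placeholder for the site being crawled:

```csharp
using System.Net;

var request = (HttpWebRequest)WebRequest.Create("https://example.com/form");

// The target server only speaks HTTP/1.0, so downgrade the request.
// HTTP/1.0 has no persistent connections by default, which avoids the
// "connection that was expected to be kept alive was closed" error.
request.ProtocolVersion = HttpVersion.Version10;
request.Method = "POST";
```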

Hmmm… the dataStream stream is allocated with a using block and then reused after it has been disposed. Not sure if that causes a problem? To clarify: the using block disposes dataStream, which could close the connection. – Nikki9696

@Nikki9696 I thought so too, so I tested the function against https://httpbin.org/post and it works; I got a response.

@Nikki9696 I tried moving it out of the using block, but still had no success. I have already used this function in other crawlers and it works fine, but on this specific website I am trying to get information from, even my GET method doesn't work.
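A quick way to confirm that a server only answers with HTTP/1.0 (and therefore closes the connection) is to inspect the response's ProtocolVersion; a minimal sketch, with the URL being a hypothetical example:

```csharp
using System;
using System.Net;

var request = (HttpWebRequest)WebRequest.Create("https://example.com/");
using (var response = (HttpWebResponse)request.GetResponse())
{
    // An HTTP/1.0 response usually implies "Connection: close" semantics,
    // which conflicts with HttpWebRequest's default keep-alive behavior.
    Console.WriteLine(response.ProtocolVersion); // e.g. 1.0 or 1.1
}
```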