
C# TCPClient reading in chunks as small as 4 bytes


I fully understand that if the data hasn't arrived yet, I can't read the full amount I allocated. But is it really reasonable for it to read only 4 bytes at a time? I mean, if I'm receiving a 100,000-byte image, that takes far too long.

Is there an obvious problem with how I receive or send the image?

To clarify: the length below changes each run. Sometimes I receive only 4 bytes, and then only 4 bytes on every loop iteration; other times I receive 3000, and then 3000 on every iteration. I'm fairly sure more data is available, but it seems stuck reading the same fixed amount per loop.

This is the receiving code on the server side:

int totalBuffer, totalRecieved = 0;
byte[] totalBufferByte = new byte[4];
byte[] buffer = new byte[0];
byte[] tbuffer;
int rLength, prevLength;
stream.Read(totalBufferByte, 0, totalBufferByte.Length);
totalBuffer = BitConverter.ToInt32(totalBufferByte, 0);
byte[] buf = new byte[8192];
while (totalBuffer > totalRecieved)
{

    rLength = stream.Read(buf, 0, buf.Length);
    totalRecieved = rLength + totalRecieved;
    Console.WriteLine("totalRecieved len: " + totalRecieved + " " + totalBuffer + " " + rLength + " " + buf.Length);
    if (rLength < buf.Length)
    {
        byte[] temp = new byte[rLength];
        Array.Copy(buf, temp, rLength);
        buf = temp;
    }
    prevLength = buffer.Length;
    tbuffer = buffer;
    buffer = new byte[buffer.Length + rLength];
    Array.Copy(tbuffer, buffer, tbuffer.Length);
    buf.CopyTo(buffer, prevLength);
}

This is at least part of the problem:

rLength = stream.Read(buf, 0, buf.Length);
...
if (rLength < buf.Length)
{
    byte[] temp = new byte[rLength];
    Array.Copy(buf, temp, rLength);
    buf = temp;
}
Basically, you're limiting each read to be at most the size of the previous read: once a short read shrinks `buf`, every later `stream.Read(buf, 0, buf.Length)` can return no more than that shrunken size.
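To see the effect, note that `Stream.Read` never returns more than the `count` you pass it. A minimal sketch of the situation (using a `MemoryStream` to stand in for the network stream, which is my assumption, not the original code): once `buf` has been shrunk to 4 bytes, every subsequent read is capped at 4 bytes no matter how much data is waiting.

```csharp
using System;
using System.IO;

class ShrinkingBufferDemo
{
    // Simulates the state after the shrinking branch has run:
    // 10 bytes are still pending, but buf is now only 4 bytes long.
    public static int[] TwoReads()
    {
        var stream = new MemoryStream(new byte[10]); // 10 bytes pending
        byte[] buf = new byte[4];                    // the shrunk buffer
        int first = stream.Read(buf, 0, buf.Length);  // at most 4
        int second = stream.Read(buf, 0, buf.Length); // at most 4
        return new[] { first, second };
    }

    static void Main()
    {
        int[] reads = TwoReads();
        Console.WriteLine(reads[0] + " " + reads[1]); // prints "4 4"
    }
}
```

Even though 10 bytes are available, each read returns only 4, which matches the constant per-loop amounts the question describes.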

It's not entirely clear what you're trying to do, but it sounds like you want something like this:

byte[] lengthBuffer = new byte[4];
int headerRead = stream.Read(lengthBuffer, 0, lengthBuffer.Length);
// TODO: Throw or loop if you haven't read 4 bytes...
int length = BitConverter.ToInt32(lengthBuffer, 0);
byte[] buffer = new byte[length];
int lengthRead = 0;
while (lengthRead < length)
{
    int chunkRead = stream.Read(buffer, lengthRead,
                                length - lengthRead);
    if (chunkRead == 0)
    {
        throw new IOException(string.Format(
            "Stream ended after reading {0} out of {1} bytes",
            lengthRead, length));
    }
    lengthRead += chunkRead;
}

In other words, you should always ask to read "however much data is left" -- and read it straight into the final buffer at the right offset, without copying the data through new intermediate byte arrays.
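The same idea can be packaged as a small helper. This is only a sketch under my own naming (`ReadExactly` and `ReadMessage` are not from the answer; .NET 7+ ships a built-in `Stream.ReadExactly` that does the same job), demonstrated against a `MemoryStream` rather than a real `NetworkStream`:

```csharp
using System;
using System.IO;

static class LengthPrefixedReader
{
    // Hypothetical helper: keeps calling Read until exactly 'count'
    // bytes have arrived, or throws if the stream ends early.
    public static void ReadExactly(Stream stream, byte[] buffer,
                                   int offset, int count)
    {
        int totalRead = 0;
        while (totalRead < count)
        {
            int chunkRead = stream.Read(buffer, offset + totalRead,
                                        count - totalRead);
            if (chunkRead == 0)
            {
                throw new IOException(string.Format(
                    "Stream ended after reading {0} out of {1} bytes",
                    totalRead, count));
            }
            totalRead += chunkRead;
        }
    }

    // Reads one length-prefixed message: a 4-byte length header,
    // then that many payload bytes, straight into a single buffer.
    public static byte[] ReadMessage(Stream stream)
    {
        byte[] lengthBuffer = new byte[4];
        ReadExactly(stream, lengthBuffer, 0, lengthBuffer.Length);
        int length = BitConverter.ToInt32(lengthBuffer, 0);
        byte[] payload = new byte[length];
        ReadExactly(stream, payload, 0, length);
        return payload;
    }

    static void Main()
    {
        // Build a fake wire message: length prefix + 3 payload bytes.
        var ms = new MemoryStream();
        byte[] payload = { 10, 20, 30 };
        ms.Write(BitConverter.GetBytes(payload.Length), 0, 4);
        ms.Write(payload, 0, payload.Length);
        ms.Position = 0;

        byte[] message = ReadMessage(ms);
        Console.WriteLine(message.Length); // prints "3"
    }
}
```

Note that `BitConverter` uses the machine's endianness on both sides, so this only works if sender and receiver agree (both sides of the original code already use `BitConverter`, so that assumption holds here).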

"4 bytes" isn't a speed; it's just a number of bytes. For the length you only ever read 4 bytes at a time, but on the reading side you should make sure you've actually read all 4 of them. It's not clear exactly what you're asking.

@JonSkeet Sometimes the rLength the server sees loops at a constant 3000, and other times at a constant 4... When it loops at a constant value in the thousands, I receive the image in no time. I don't know whether that's what causes the slowdown.

Your reading logic seems over-complicated. You should avoid allocating buffers and copying data between them inside the loop; instead, read into one overall buffer at the required position.

@500-InternalServerError OK, I can do that while I wait for a possible reason why the while loop reads only 4 of 4 bytes each time.

@user3059575: I believe the problem is the overly complicated buffer copying you're doing.