C#: Uploading large files to Azure Blob Storage with BlobContainerClient

Tags: c#, azure, .net-core, azure-storage-blobs, topshelf

I want to upload large files (500-2000 MB) to Azure Blob Storage and tried to do it with the following code:

private BlobContainerClient containerClient;

public async Task<UploadResultDto> Upload(FileInfo fileInfo, string remotePath)
{
    try
    {
        var blobClient = containerClient.GetBlobClient(remotePath + "/" + fileInfo.Name);

        var transferOptions = new StorageTransferOptions
        {
            MaximumConcurrency = 1,
            MaximumTransferSize = 10485760,
            InitialTransferSize = 10485760
        };

        // "await using" disposes (and closes) the stream when it goes out of scope.
        await using var uploadFileStream = File.OpenRead(fileInfo.FullName);

        await blobClient.UploadAsync(uploadFileStream, transferOptions: transferOptions);

        return new UploadResultDto()
        {
            UploadSuccessfull = true
        };
    }
    catch (Exception ex)
    {
        Log.Error(ex, $"Error while uploading File {fileInfo.FullName}");
    }

    return new UploadResultDto()
    {
        UploadSuccessfull = false
    };
}
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionstring);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("containerName");
var blobClient = containerClient.GetBlobClient(remotePath + "/" +fileInfo.Name);
If I remove InitialTransferSize from the StorageTransferOptions, I get the following error after a while:

retry failed after 6 tries. (The operation was canceled.)

As far as I understand the new SDK, uploads are chunked automatically, so the whole handling of block IDs and so on should be done by the SDK. Or am I wrong?

Does anyone know why this doesn't work? I haven't found anything about BlobContainerClient that differs here from the old CloudBlobContainer.

Update: some additional information:

- It is a .NET Core 3.1 application running as a Windows service via the Topshelf library.
- I created a new console application and tested with the same code, and there it runs fine.

1. Make sure you don't have an earlier version of Azure.Storage.Blobs lying around: remove the old version and update it.

Why is your containerClient private? You can set it up inside the Upload method with the following code:

BlobServiceClient blobServiceClient = new BlobServiceClient(connectionstring);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("containerName");
var blobClient = containerClient.GetBlobClient(remotePath + "/" +fileInfo.Name);

I couldn't get it to work with version 12.6.0.

I downgraded to Microsoft.Azure.Storage.Blob v11 and implemented the upload based on this thread.

This now works fine for me.
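For reference, a minimal sketch of what a v11-based upload might look like (the container name, blob path, and class name are placeholders, not from the original answer; this assumes the Microsoft.Azure.Storage.Blob v11 package):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

public static class LegacyUploader
{
    public static async Task UploadAsync(string connectionString, string localPath)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("my-container");        // placeholder name
        var blob = container.GetBlockBlobReference("remote/path/file.bin");  // placeholder path

        // Write in 10 MB blocks; the v11 SDK manages the block IDs and the
        // final block list commit internally.
        blob.StreamWriteSizeInBytes = 10 * 1024 * 1024;

        await blob.UploadFromFileAsync(localPath);
    }
}
```

The v11 and v12 packages can be installed side by side, which makes this kind of temporary downgrade workable.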

The second part of the question, the error after removing InitialTransferSize from StorageTransferOptions, is similar to this issue.

You can work around it by setting the blob client's timeouts like this:

var blobClientOptions = new BlobClientOptions
{
    Transport = new HttpClientTransport(new HttpClient { Timeout = Timeout.InfiniteTimeSpan }),
    Retry = { NetworkTimeout = Timeout.InfiniteTimeSpan }
};

InfiniteTimeSpan is probably overkill, but it at least proves that this is where the problem lies.
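To connect those options to the upload code from the question, a sketch of passing them into the client chain (the connection string and container name are placeholders; options set on the service client flow down to every container and blob client created from it):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using Azure.Core.Pipeline;
using Azure.Storage.Blobs;

// Placeholder connection string (local emulator); substitute your own.
var connectionString = "UseDevelopmentStorage=true";

// Relax both the HTTP transport timeout and the per-operation network timeout.
var blobClientOptions = new BlobClientOptions
{
    Transport = new HttpClientTransport(new HttpClient { Timeout = Timeout.InfiniteTimeSpan }),
    Retry = { NetworkTimeout = Timeout.InfiniteTimeSpan }
};

var blobServiceClient = new BlobServiceClient(connectionString, blobClientOptions);
var containerClient = blobServiceClient.GetBlobContainerClient("containerName");
```

In practice you would pick a generous finite timeout rather than InfiniteTimeSpan once the diagnosis is confirmed.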


These settings eliminated the "retry failed after 6 tries" error for me and got the upload working once I moved to the Azure.Storage.Blobs v12.8.0 package.

Is await using var uploadFileStream = File.OpenRead(fileInfo.FullName) some new C# feature I didn't know about? @Andy Wow, nice. Thanks for the info.

I ran into the same problem ("retry failed after 6 tries. A task was canceled.") while trying to export a 3 GB Azure SQL database to Azure Storage. It worked for years without issue until this month. I have the latest version of Azure.Storage.Blobs (12.6.0) installed, and I also cleaned and reinstalled everything.

Same error here. containerClient is initialized in the constructor, so I don't need to initialize it in every method that needs it.
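The pattern from that last comment, initializing containerClient once in the constructor, could look roughly like this (the class and parameter names are illustrative, not from the thread):

```csharp
using Azure.Storage.Blobs;

public class BlobUploadService
{
    private readonly BlobContainerClient containerClient;

    // Create the container client once; Upload and any other methods reuse it.
    public BlobUploadService(string connectionString, string containerName)
    {
        var blobServiceClient = new BlobServiceClient(connectionString);
        containerClient = blobServiceClient.GetBlobContainerClient(containerName);
    }
}
```

The client objects are lightweight and thread-safe, so holding one for the service's lifetime is the intended usage.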