C# "Stream was too long" when uploading a large file to Azure Blob Storage

Tags: c#, .net, azure, azure-storage-blobs, blobstorage


I'm trying to upload a large file (4 GB) to Azure Blob Storage, but it fails. Following this article (), here is my code:

CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container-name");
blobContainer.CreateIfNotExistsAsync().Wait();
CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference("blob-name");
await blockBlob.UploadFromFileAsync(@"C:\test.avi");
But I get this error:

Message: Stream was too long.
Source: System.Private.CoreLib
StackTrace:
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at Microsoft.WindowsAzure.Storage.Blob.BlobWriteStream.<…>d__5.MoveNext() in C:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\WindowsRuntime\Blob\BlobWriteStream.cs:line 144
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.WindowsAzure.Storage.Core.Util.StreamExtensions.<…>d__1`1.MoveNext() in C:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\Common\Core\Util\StreamExtensions.cs:line 308
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.<>c__DisplayClass20_0.<…>d.MoveNext() in C:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\WindowsRuntime\Blob\CloudBlockBlob.cs:line 301
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.<>c__DisplayClass23_0.<…>d.MoveNext() in C:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\WindowsRuntime\Blob\CloudBlockBlob.cs:line 397
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
   at MyCompany.AzureServices.Blob.BlobService.<…>d__7.MoveNext() in C:\MyProjectSource\MyCompany.AzureServices\Blob\BlobService.cs:line 79
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
   at MyCompany.AzureServices.Blob.MyProject.RecordBlobService.<>c__DisplayClass1_0.<…>d.MoveNext() in C:\MyProjectSource\MyCompany.AzureServices\Blob\MyProject\RecordBlobService.cs:line 25

Following this article (), I tried adding more options for large files. Here is my new code:

CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container-name");
blobContainer.CreateIfNotExistsAsync().Wait();

TimeSpan backOffPeriod = TimeSpan.FromSeconds(2);
int retryCount = 1;
BlobRequestOptions bro = new BlobRequestOptions()
{
    // If the file to upload is larger than 67 MB, send it in multiple blocks
    SingleBlobUploadThresholdInBytes = 67108864, // 67 MB (the maximum)
    // Number of threads used to send data
    ParallelOperationThreadCount = 1,
    // If a block fails, retry once (retryCount) after 2 seconds (backOffPeriod)
    RetryPolicy = new ExponentialRetry(backOffPeriod, retryCount),
};
blobClient.DefaultRequestOptions = bro;
CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference("blob-name");
// If the file is sent in multiple blocks, each block is 4 MB
blockBlob.StreamWriteSizeInBytes = 4194304; // 4 MB (the maximum)
await blockBlob.UploadFromFileAsync(@"C:\test.avi");
But I get the same error again ("Stream was too long").

I found that in the Microsoft.WindowsAzure.Storage library, UploadFromFileAsync uses UploadFromStreamAsync, which uses a MemoryStream. I think my error comes from that MemoryStream, but the blob storage article says the maximum size of a blob is 195 GB. So how am I supposed to use it?
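The 195 GB figure checks out against the service limits (50,000 blocks per block blob, 4 MB per block with this client's StreamWriteSizeInBytes maximum), so the blob size limit is not what is being hit here. A quick sanity check:

```csharp
// Sanity check: maximum block blob size at the 4 MB block size used above.
const long blockSize = 4 * 1024 * 1024; // 4,194,304 bytes (StreamWriteSizeInBytes maximum)
const long maxBlocks = 50000;           // block blob limit on block count
long maxBlobSize = blockSize * maxBlocks;               // 209,715,200,000 bytes
Console.WriteLine(maxBlobSize / (1024L * 1024 * 1024)); // prints 195 (GB)
```

The "Stream was too long" message instead points at the MemoryStream, which is backed by a byte[] and is therefore capped at 2 GB (Int32.MaxValue), well below a 4 GB file.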

I'm using Microsoft.WindowsAzure.Storage version 7.2.1.
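If the library buffers the whole file into a MemoryStream on this code path, one workaround is to drive the block upload manually with PutBlockAsync and PutBlockListAsync, reading the file 4 MB at a time so nothing larger than a single block is ever held in memory. This is only a sketch against the 7.2.1 API; blobContainer and the file path are the ones from the code above:

```csharp
// Sketch: manual block upload, avoiding any whole-file buffering.
CloudBlockBlob blob = blobContainer.GetBlockBlobReference("blob-name");
var blockIds = new List<string>();
byte[] buffer = new byte[4 * 1024 * 1024]; // one 4 MB block at a time

using (FileStream file = File.OpenRead(@"C:\test.avi"))
{
    int blockNumber = 0;
    int bytesRead;
    while ((bytesRead = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block ids must be Base64 strings of equal length within one blob.
        string blockId = Convert.ToBase64String(
            Encoding.UTF8.GetBytes(blockNumber++.ToString("d6")));
        using (var block = new MemoryStream(buffer, 0, bytesRead))
        {
            await blob.PutBlockAsync(blockId, block, null); // null = no MD5 check
        }
        blockIds.Add(blockId);
    }
}
// Commit the uploaded blocks, in order, to form the final blob.
await blob.PutBlockListAsync(blockIds);
```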

Thanks!

Update 1: Thanks to Tom Sun and Zhaoxing Lu, I tried using Microsoft.Azure.Storage.DataMovement.
Unfortunately, I get an error on the TransferManager.UploadAsync call. I tried googling it, but found nothing...
Any ideas?

Here is my code:

string storageConnectionString = "myStorageConnectionString";
string filePath = @"C:\LargeFile.avi";
string blobName = "large_file.avi";

CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExists();

CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference(blobName);
// Setup the number of the concurrent operations
TransferManager.Configurations.ParallelOperations = 64;

// Setup the transfer context and track the upload progress
var context = new SingleTransferContext();
UploadOptions uploadOptions = new UploadOptions
{
    DestinationAccessCondition = AccessCondition.GenerateIfExistsCondition()
};
context.ProgressHandler = new Progress<TransferStatus>(progress =>
{
    Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
});

// Upload a local blob
TransferManager.UploadAsync(filePath, destBlob, uploadOptions, context, CancellationToken.None).Wait();
My project.json:

{
  "version": "1.0.0-*",

  "dependencies": {
    "Microsoft.Azure.DocumentDB.Core": "0.1.0-preview",
    "Microsoft.Azure.Storage.DataMovement": "0.4.1",
    "Microsoft.IdentityModel.Protocols": "2.0.0",
    "NETStandard.Library": "1.6.1",
    "MyProject.Data.Entities": "1.0.0-*",
    "MyProject.Settings": "1.0.0-*",
    "WindowsAzure.Storage": "7.2.1"
  },

  "frameworks": {   
    "netcoreapp1.0": {
      "imports": [
        "dnxcore50",
        "portable-net451+win8"
      ],
      "dependencies": {
        "Microsoft.NETCore.App": {
          "type": "platform",
          "version": "1.0.0-*"
        }
      }
    }
  }
}
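A likely cause of the UploadAsync error here (a guess, but consistent with the code that eventually worked in update 2): DestinationAccessCondition = AccessCondition.GenerateIfExistsCondition() adds a precondition that can only succeed when the destination blob already exists, so the very first upload of a blob fails it. Passing null options sidesteps the condition entirely:

```csharp
// Same call, without the UploadOptions carrying the access condition.
TransferManager.UploadAsync(filePath, destBlob, null, context, CancellationToken.None).Wait();
```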
Thanks for your help!

Update 2 (working)

Thanks to Tom Sun, here is the working code:

string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExistsAsync().Wait();
string sourcePath = @"C:\Tom\TestLargeFile.zip";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("LargeFile.zip");
// Setup the number of the concurrent operations
TransferManager.Configurations.ParallelOperations = 64;
// Setup the transfer context and track the upload progress
var context = new SingleTransferContext
{
    ProgressHandler =
        new Progress<TransferStatus>(
            progress => { Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred); })
};
// Upload a local blob
TransferManager.UploadAsync(sourcePath, destBlob, null, context, CancellationToken.None).Wait();
Console.WriteLine("Upload finished!");
Console.ReadKey();

With SingleTransferContext, if the blob already exists, it is overwritten.
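If overwriting is not what you want, the Data Movement library also lets you decide per transfer. This is only a sketch; the ShouldOverwriteCallback property and its Func<object, object, bool> signature are assumptions that may vary between library versions:

```csharp
var context = new SingleTransferContext();
// Return true to overwrite an existing destination blob, false to skip it.
context.ShouldOverwriteCallback = (source, destination) => true;
```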

We are actively investigating that issue in the Azure Storage client library.

Please note that since UploadFromFileAsync() is not a resilient and efficient operation for large blobs, I'd suggest you consider the following alternatives:

If you can accept a command-line tool, you can try AzCopy, which is able to transfer Azure Storage data at high performance and whose transfers can be paused and resumed.

If you'd like to control the transfer jobs programmatically, please use the Azure Storage Data Movement Library, which is the core of AzCopy.


We can use the Azure Storage Data Movement Library to upload large files to Azure Blob Storage. It works correctly for me; please try the code below. For more information about the Azure Storage Data Movement Library, please refer to this article ():

packages.config:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Azure.KeyVault.Core" version="1.0.0" targetFramework="net452" />
  <package id="Microsoft.Azure.Storage.DataMovement" version="0.4.1" targetFramework="net452" />
  <package id="Microsoft.Data.Edm" version="5.6.4" targetFramework="net452" />
  <package id="Microsoft.Data.OData" version="5.6.4" targetFramework="net452" />
  <package id="Microsoft.Data.Services.Client" version="5.6.4" targetFramework="net452" />
  <package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.8.0.0" targetFramework="net452" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="net452" />
  <package id="System.Spatial" version="5.6.4" targetFramework="net452" />
  <package id="WindowsAzure.Storage" version="7.2.1" targetFramework="net452" />
</packages>
string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExistsAsync().Wait();
string sourcePath = @"C:\Tom\TestLargeFile.zip";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("LargeFile.zip");
// Setup the number of the concurrent operations
TransferManager.Configurations.ParallelOperations = 64;
// Setup the transfer context and track the upload progress
var context = new SingleTransferContext
{
    ProgressHandler =
        new Progress<TransferStatus>(
            progress => { Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred); })
};
// Upload a local blob
TransferManager.UploadAsync(sourcePath, destBlob, null, context, CancellationToken.None).Wait();
Console.WriteLine("Upload finished!");
Console.ReadKey();