Google Cloud Platform: uploading files to Google Storage sometimes fails with Google.Apis.Auth.OAuth2.Responses.TokenResponseException

Tags: google-cloud-platform, google-bigquery

I have a .NET console application that uploads some files to Google Storage and then populates a BigQuery table.

I am using Microsoft Visual Studio Enterprise 2019, version 16.7.5.

I developed a client library containing all the required functionality and use it from either a Windows Forms application or a console application. I installed the console application on a Windows 2016 server and launch it with Task Scheduler, but sometimes - not always - I get the following error:

Google.Apis.Auth.OAuth2.Responses.TokenResponseException: Error:"Server response does not contain a JSON object. Status code is: ProxyAuthenticationRequired", Description:"", Uri:""
On the other hand, when I run the interactive Windows Forms application on my development machine, everything works fine.

I have defined the environment variable on both my machine and the server:

GOOGLE_APPLICATION_CREDENTIALS=D:\HR_Analytics\MySecret.json
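For what it's worth, this is the kind of sanity check I can run on the server to confirm which credentials Application Default Credentials actually resolves to under the Task Scheduler account (a minimal sketch using the Google.Apis.Auth package; the class name is just for illustration):

```csharp
using System;
using Google.Apis.Auth.OAuth2;

class CredentialCheck
{
    static void Main()
    {
        // Shows what the scheduled-task environment actually sees.
        Console.WriteLine(
            Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"));

        // Resolves Application Default Credentials the same way the
        // Storage and BigQuery clients do.
        GoogleCredential credential = GoogleCredential.GetApplicationDefault();
        Console.WriteLine(credential.UnderlyingCredential?.GetType().Name);
    }
}
```

If the environment variable prints empty here, the task is running under an account that does not have it defined.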

I also checked the server's Event Viewer, but nothing there seems related to my problem.

Any suggestion would be appreciated.

Here is the code:

    public bool GCP_MoveToStorage(string targetFile,
                                  string objectName = null,
                                  string bucketName = null) {

        MyLogger.Trace($"Start Moving to Cloud Storage - SYNCRONOUS");

        #region From https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-csharp
        // Instantiates a client.
        try {
            MyLogger.Trace($"Getting Storage Client");
            using StorageClient storageClient = StorageClient.Create();

            MyLogger.Trace($"Opening Target File {targetFile}");
            using var f = File.OpenRead(targetFile);
            objectName = objectName ?? Path.GetFileName(targetFile);

            MyLogger.Trace($"Starting upload to Cloud Storage in Bucket {bucketName}, file {objectName}");
            storageClient.UploadObject(
                bucketName,
                objectName,
                "text/plain",
                f);

            MyLogger.Debug($"Uploaded {objectName}.");
        }
        catch (Exception ex) {
            MyLogger.Debug(ex);
            return false;
        }
        #endregion
        MyLogger.Trace($"End of Moving to Cloud Storage");
        return true;
    }
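Since the ProxyAuthenticationRequired error is intermittent, one workaround I could wrap around the call above is a simple retry with backoff (a sketch only - the helper name, attempt count, and delays are illustrative, not part of my original code):

```csharp
// Retries the upload a few times, since the token-endpoint failure
// behind the proxy is transient. GCP_MoveToStorage already returns
// false and logs the exception on failure.
bool UploadWithRetry(string targetFile, int maxAttempts = 3)
{
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        if (GCP_MoveToStorage(targetFile))
            return true;

        MyLogger.Trace($"Upload attempt {attempt} of {maxAttempts} failed, retrying");
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5 * attempt)); // linear backoff
    }
    return false;
}
```

This would mask the failure rather than explain it, though, which is why I am asking about the root cause.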
I encounter a similar problem when I try to load the BigQuery table:

    internal void GCP_LoadIntoTable(string projectID, string localFileName, string datasetId, string separator) {
        MyLogger.Trace($"Starting loading to GCP table");
        string fileFieldSpec = Path.GetDirectoryName(localFileName)
            + "\\"
            + Path.GetFileNameWithoutExtension(localFileName)
            + "_Fields" + Path.GetExtension(localFileName);

        string tableId = Path.GetFileNameWithoutExtension(localFileName);
        string sourceName = Path.GetFileName(localFileName);

        MyLogger.Trace($"Getting BigQueryClient for project {PROJECT_ID}");
        BigQueryClient client = BigQueryClient.Create(PROJECT_ID);
        var gcsURI = "gs://dev-staging/" + sourceName;

        MyLogger.Trace($"Getting Dataset {datasetId}");
        var dataset = client.GetDataset(datasetId);

        MyLogger.Trace($"Retrieving schema from file: {fileFieldSpec}");
        var schema = GetFieldDefs(fileFieldSpec);

        MyLogger.Trace($"Getting table reference for table: {tableId}");
        var destinationTableRef = dataset.GetTableReference(
            tableId: tableId);

        CreateIfMissing(tableId, client, dataset, schema, destinationTableRef);

        // Create job configuration
        MyLogger.Trace($"Creating job options with SourceFormat: FileFormat.Csv, " +
            $"SkipLeadingRows: 1," +
            $"FieldDelimiter: {separator}," +
            $"WriteDisposition: {m_WriteDisposition}");
        var jobOptions = new CreateLoadJobOptions() {
            SourceFormat = FileFormat.Csv,
            SkipLeadingRows = 1,
            FieldDelimiter = separator,
            WriteDisposition = m_WriteDisposition
        };
        // prepare target table, optional deletion of some rows
        if (!PrepareTargetTable(client))
            return;

        // Create and run job
        MyLogger.Trace($"Creating load job options with sourceUri: {gcsURI}," +
            $"destination: {destinationTableRef}, " +
            $"schema: schema, " +
            $"options: jobOptions");
        var loadJob = client.CreateLoadJob(
            sourceUri: gcsURI, destination: destinationTableRef,
            schema: schema, options: jobOptions);

        MyLogger.Trace($"Creating Cancellation Token");
        var cts = new CancellationTokenSource();

        MyLogger.Trace($"Launching job");
        var job = loadJob.PollUntilCompletedAsync(null, null, cts.Token);  // Waits for the job to complete.
        MyLogger.Trace($"Waiting for job completed");
        job.Wait();

        if (job.Exception != null)
            // log some errors
            MyLogger.Error(job.Exception);

        // Display the number of rows uploaded
        BigQueryTable table = client.GetTable(destinationTableRef);
        MyLogger.Debug(
            $"Loaded {table.Resource.NumRows} rows to {table.FullyQualifiedId}");

        MyLogger.Trace($"End of loading to GCP table");
    }
Here are the logs:

Success:

2020-11-11 15:24:19.3223|INFO|HR_Analytics_Core.BuilderUtil|Processing TE_HST_RELATIONS
2020-11-11 15:24:19.3576|TRACE|HR_Analytics_Core.TableBridge|TableBridge Constructor with connection String Data Source=MySrv\MyInstance;Initial Catalog=CLONE;Persist Security Info=True;User ID=usr;Password="pwd".
2020-11-11 15:24:19.3576|TRACE|HR_Analytics_Core.TableBridge|Start of getting Data wiht SQL: 
                SELECT *
                FROM Relazioni_HRP1001
2020-11-11 15:24:33.1487|TRACE|HR_Analytics_Core.TableBridge|End of getting Data.
2020-11-11 15:24:51.0125|TRACE|HR_Analytics_Core.TableBridge|Start of saving file to target dir: D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-11 15:24:51.0462|TRACE|HR_Analytics_Core.TableBridge|Writing data file 
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Writing Field Definition file 
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Start of saving file to target dir: D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Start of Moving to Cloud Storage - SYNCRONOUS
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Getting Storage Client
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Opening Target File D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-11 15:24:52.2459|TRACE|HR_Analytics_Core.TableBridge|Starting upload to Cloud Storage in Bucket dev-staging, file TE_HST_RELATIONS.txt
2020-11-11 15:24:56.5998|DEBUG|HR_Analytics_Core.TableBridge|Uploaded TE_HST_RELATIONS.txt.
2020-11-11 15:24:56.5998|TRACE|HR_Analytics_Core.TableBridge|End of Moving to Cloud Storage
2020-11-11 15:24:56.5998|TRACE|HR_Analytics_Core.TableBridge|Starting loading to GCP table
2020-11-11 15:24:56.5998|TRACE|HR_Analytics_Core.TableBridge|Getting BigQueryClient for project w3-hra-prod-analytics
2020-11-11 15:24:56.5998|TRACE|HR_Analytics_Core.TableBridge|Getting Dataset DEV_STAGING
2020-11-11 15:24:57.1628|TRACE|HR_Analytics_Core.TableBridge|Retrieving schema from file: D:\HR_Analytics\Files\TE_HST_RELATIONS_Fields.txt
2020-11-11 15:24:57.1628|TRACE|HR_Analytics_Core.TableBridge|Start of schema specification from FieldDef file
2020-11-11 15:24:57.1628|TRACE|HR_Analytics_Core.TableBridge|End of schema specification from FieldDef file
2020-11-11 15:24:57.1628|TRACE|HR_Analytics_Core.TableBridge|Getting table reference for table: TE_HST_RELATIONS
2020-11-11 15:24:57.2992|TRACE|HR_Analytics_Core.TableBridge|Creating job options with SourceFormat: FileFormat.Csv, SkipLeadingRows: 1,FieldDelimiter: |,WriteDisposition: WriteDisposition.WriteTruncate
2020-11-11 15:24:57.2992|TRACE|HR_Analytics_Core.TableBridge|Creating load job options with sourceUri: gs://dev-staging/TE_HST_RELATIONS.txt,destination: Google.Apis.Bigquery.v2.Data.TableReference, schema: schema, options: jobOptions
2020-11-11 15:24:57.6633|TRACE|HR_Analytics_Core.TableBridge|Waiting for job completed
2020-11-11 15:25:08.0859|DEBUG|HR_Analytics_Core.TableBridge|Loaded 374924 rows to w3-hra-prod-analytics.DEV_STAGING.TE_HST_RELATIONS
2020-11-11 15:25:08.0859|TRACE|HR_Analytics_Core.TableBridge|End of loading to GCP table
Failure:

2020-11-18 09:54:55.2939|INFO|HR_Analytics_Core.BuilderUtil|Processing TE_HST_RELATIONS
2020-11-11 15:24:19.3576|TRACE|HR_Analytics_Core.TableBridge|TableBridge Constructor with connection String Data Source=MySrv\MyInstance;Initial Catalog=CLONE;Persist Security Info=True;User ID=usr;Password="pwd".
2020-11-18 09:54:55.2939|TRACE|HR_Analytics_Core.TableBridge|Start of getting Data wiht SQL: 
                SELECT *
                FROM Relazioni_HRP1001
2020-11-18 09:55:08.4750|TRACE|HR_Analytics_Core.TableBridge|End of getting Data.
2020-11-18 09:55:26.4362|TRACE|HR_Analytics_Core.TableBridge|Start of saving file to target dir: D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-18 09:55:26.4362|TRACE|HR_Analytics_Core.TableBridge|Writing data file 
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Writing Field Definition file 
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Start of saving file to target dir: D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Start of Moving to Cloud Storage - SYNCRONOUS
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Getting Storage Client
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Opening Target File D:\HR_Analytics\Files\TE_HST_RELATIONS.txt
2020-11-18 09:55:28.5987|TRACE|HR_Analytics_Core.TableBridge|Starting upload to Cloud Storage in Bucket dev-staging, file TE_HST_RELATIONS.txt
2020-11-18 09:55:28.6409|DEBUG|HR_Analytics_Core.TableBridge|Google.Apis.Auth.OAuth2.Responses.TokenResponseException: Error:"Server response does not contain a JSON object. Status code is: ProxyAuthenticationRequired", Description:"", Uri:""
   at Google.Cloud.Storage.V1.StorageClientImpl.UploadHelper.CheckFinalProgress()
   at Google.Cloud.Storage.V1.StorageClientImpl.UploadHelper.Execute()
   at Google.Cloud.Storage.V1.StorageClientImpl.UploadObject(Object destination, Stream source, UploadObjectOptions options, IProgress`1 progress)
   at Google.Cloud.Storage.V1.StorageClientImpl.UploadObject(String bucket, String objectName, String contentType, Stream source, UploadObjectOptions options, IProgress`1 progress)
   at HR_Analytics_Core.TableBridge.GCP_MoveToStorage(String targetFile, String objectName, String bucketName) in C:\Users\ursini\source\repos\HR_Analytics\HR_Analytics_Core\TableBridge.cs:line 96