
Sqoop job submitted via Dataproc fails


I submitted a Sqoop job through a GCP Dataproc cluster with the --as-avrodatafile configuration parameter, but the job fails with the following error:

19/08/12 22:34:34 INFO impl.YarnClientImpl: Submitted application application_1565634426340_0021
19/08/12 22:34:34 INFO mapreduce.Job: The url to track the job: http://sqoop-gcp-ingest-mzp-m:8088/proxy/application_1565634426340_0021/
19/08/12 22:34:34 INFO mapreduce.Job: Running job: job_1565634426340_0021
19/08/12 22:34:40 INFO mapreduce.Job: Job job_1565634426340_0021 running in uber mode : false
19/08/12 22:34:40 INFO mapreduce.Job:  map 0% reduce 0%
19/08/12 22:34:45 INFO mapreduce.Job: Task Id : attempt_1565634426340_0021_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
19/08/12 22:34:50 INFO mapreduce.Job: Task Id : attempt_1565634426340_0021_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
19/08/12 22:34:55 INFO mapreduce.Job: Task Id : attempt_1565634426340_0021_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
19/08/12 22:35:00 INFO mapreduce.Job:  map 100% reduce 0%
19/08/12 22:35:01 INFO mapreduce.Job: Job job_1565634426340_0021 failed with state FAILED due to: Task failed task_1565634426340_0021_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

19/08/12 22:35:01 INFO mapreduce.Job: Counters: 11
    Job Counters 
        Failed map tasks=4
        Launched map tasks=4
        Other local map tasks=4
        Total time spent by all maps in occupied slots (ms)=41976
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=13992
        Total vcore-milliseconds taken by all map tasks=13992
        Total megabyte-milliseconds taken by all map tasks=42983424
    Map-Reduce Framework
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
19/08/12 22:35:01 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
19/08/12 22:35:01 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 30.5317 seconds (0 bytes/sec)
19/08/12 22:35:01 INFO mapreduce.ImportJobBase: Retrieved 0 records.
19/08/12 22:35:01 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@61baa894
19/08/12 22:35:01 ERROR tool.ImportTool: Import failed: Import job failed!
19/08/12 22:35:01 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.25.42.52:1521/uataca.aaamidatlantic.com/GCPREADER
Job output is complete

If I do not specify the
--as-avrodatafile
parameter, it works fine.
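For reference, a Sqoop import submitted through Dataproc generally takes the following shape. The cluster name, bucket, jar names, JDBC connect string, and table below are placeholders, not values taken from the question; Sqoop is not preinstalled on Dataproc, so its jar and the JDBC driver must be staged and passed with --jars:

```shell
# Sketch of a Sqoop import run as a Hadoop job on Dataproc.
# All names and paths are placeholders.
gcloud dataproc jobs submit hadoop \
    --cluster="my-cluster" \
    --class="org.apache.sqoop.Sqoop" \
    --jars="gs://my-bucket/sqoop-1.4.7-hadoop260.jar,gs://my-bucket/ojdbc8.jar" \
    -- \
    import \
    --connect "jdbc:oracle:thin:@//db-host:1521/SERVICE" \
    --username "READER" \
    --table "MY_TABLE" \
    --target-dir "gs://my-bucket/output" \
    --as-avrodatafile
```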

To resolve this issue, you need to set the
mapreduce.job.classloader
property to
true
when submitting the job:

gcloud dataproc jobs submit hadoop --cluster="${CLUSTER_NAME}" \
    --class="org.apache.sqoop.Sqoop" \
    --properties="mapreduce.job.classloader=true" \
    . . .
    -- \
    --as-avrodatafile \
    . . .
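Why this works: the failing call, ReflectData.addLogicalTypeConversion, was only introduced in Avro 1.8, while the Hadoop distribution on the cluster ships an older Avro that wins on the default task classpath, producing the NoSuchMethodError seen in the log. Setting mapreduce.job.classloader=true runs each task with an isolated job classloader, so the newer Avro bundled with the job is picked up instead. One quick way to spot the mismatch is to check the version encoded in the cluster's Avro jar name; "avro-1.7.7.jar" below is a stand-in for the jar you would find under a path such as /usr/lib/hadoop/lib on a worker (both the path and the file name are assumptions):

```shell
# Extract the Avro version from a jar file name and warn when it predates
# the 1.8 release that introduced addLogicalTypeConversion.
# "avro-1.7.7.jar" is a stand-in for the jar actually found on the node.
jar_name="avro-1.7.7.jar"
version="${jar_name#avro-}"   # strip the "avro-" prefix
version="${version%.jar}"     # strip the ".jar" suffix, leaving e.g. 1.7.7
major="${version%%.*}"        # first version component
rest="${version#*.}"
minor="${rest%%.*}"           # second version component
if [ "$major" -eq 1 ] && [ "$minor" -lt 8 ]; then
    echo "cluster Avro $version predates 1.8: set mapreduce.job.classloader=true"
fi
```

This only inspects the file name; it does not prove which jar the task classpath resolves first, but a sub-1.8 Avro on the cluster plus this exact NoSuchMethodError is a strong hint.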

What Sqoop and Dataproc versions are you using? Also, could you provide the full command used to submit the job? This looks similar to .