How to read data from Hive using Apache Beam?
How can I read from Hive with Apache Beam? In other words, how can I use Hive as a source in an Apache Beam pipeline?
HadoopInputFormatIO can be used to read from Hive, as follows:
// Requires the beam-sdks-java-io-hadoop-input-format module plus the
// Hadoop and HCatalog client libraries on the classpath.
Configuration conf = new Configuration();
conf.setClass("mapreduce.job.inputformat.class", HCatInputFormat.class,
    InputFormat.class);
conf.setClass("key.class", LongWritable.class, WritableComparable.class);
conf.setClass("value.class", DefaultHCatRecord.class, Writable.class);
conf.set("hive.metastore.uris", "...");

// Point HCatInputFormat at the database/table (and optional partition filter) to read.
HCatInputFormat.setInput(conf, "myDatabase", "myTable", "myFilter");

// p is an existing Pipeline instance.
PCollection<KV<LongWritable, DefaultHCatRecord>> data =
    p.apply(HadoopInputFormatIO.<LongWritable, DefaultHCatRecord>read()
        .withConfiguration(conf));
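Downstream, each element is a KV whose value is a DefaultHCatRecord. A minimal sketch of pulling one column out of each record might look like the following; the assumption that column index 0 holds the value of interest is hypothetical, not part of the original answer:

```java
// Hypothetical follow-up step: extract the first column of each HCatRecord as a String.
PCollection<String> firstColumn = data.apply(
    ParDo.of(new DoFn<KV<LongWritable, DefaultHCatRecord>, String>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        DefaultHCatRecord record = c.element().getValue();
        // HCatRecord.get(int) returns a column value by position;
        // index 0 is an assumption about the table's schema.
        c.output(String.valueOf(record.get(0)));
      }
    }));
```

This runs on the same pipeline `p` as the read above and needs no extra dependencies beyond those already required by the source transform.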
A pull request merged in July 2017 enables Beam 2.1.0 to support Hive through HCatalog.
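That HCatalog support is exposed as HCatalogIO in the beam-sdks-java-io-hcatalog module. A minimal read sketch might look like the following; the metastore URI, database, table, and filter values are placeholders, not taken from the original answer:

```java
// Assumes the beam-sdks-java-io-hcatalog module is on the classpath,
// and p is an existing Pipeline instance.
Map<String, String> configProperties = new HashMap<>();
configProperties.put("hive.metastore.uris", "thrift://metastore-host:9083"); // hypothetical URI

PCollection<HCatRecord> records = p.apply(
    HCatalogIO.read()
        .withConfigProperties(configProperties)
        .withDatabase("myDatabase")   // optional; defaults to "default"
        .withTable("myTable")
        .withFilter("myFilter"));     // optional partition filter
```

Compared with the HadoopInputFormatIO approach above, this yields HCatRecord elements directly, with no key/value wrapping or manual InputFormat configuration.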
Comments: Please post the answer separately rather than including it in the question. Can you share the pom, as well as the complete code for the key and value classes?