Connecting to a remote Hadoop cluster (CDH4) from Java


I have a remote Hadoop cluster running Cloudera CDH4, and I am trying to run a Pig script on it from my local machine. Here is my Java code:

import java.io.IOException;
import java.util.Iterator;
import java.util.Properties;

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.data.Tuple;

public class TestPig {

    public static void main(String args[]) {

        PigServer pigServer;
        try {

            /** Set the connection properties for the remote cluster */
            Properties props = new Properties();

            props.setProperty("fs.default.name", "hdfs://master.node.ip.adress:8020");
            props.setProperty("mapred.job.tracker", "master.node.ip.adress:8021");

            /** Explicitly select the JDK's built-in JAXP parser implementation */
            System.setProperty("javax.xml.parsers.DocumentBuilderFactory",
                    "com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl");

            /** MapReduce mode */
            pigServer = new PigServer(ExecType.MAPREDUCE, props);

            /** Path of the Pig script */
            pigServer.registerScript("/user/admin/data/script.pig");

            /** Print the Pig script's output */
            Iterator<Tuple> results = pigServer.openIterator("A");
            while (results.hasNext())
                System.out.println(results.next().toDelimitedString("\t"));

        }
        catch (ExecException e) { e.printStackTrace(); }
        catch (IOException e) { e.printStackTrace(); }
    }
}
I have searched the internet for a solution but have not found anything that works.

Do you have any idea how to fix this?

Thanks.
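
A quick way to rule out basic connectivity problems before involving Pig is to open the remote HDFS filesystem directly from the same JVM. Below is a minimal sketch under the same assumptions as the code above: the NameNode address master.node.ip.adress:8020 and the listed path are placeholders, and fs.defaultFS is simply the newer name of fs.default.name on CDH4.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHdfsConnection {

    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        /** Same NameNode address as in the Pig code above (placeholder) */
        conf.set("fs.defaultFS", "hdfs://master.node.ip.adress:8020");

        FileSystem fs = FileSystem.get(URI.create("hdfs://master.node.ip.adress:8020"), conf);

        /** List a directory to confirm the client can actually reach the remote HDFS */
        for (FileStatus status : fs.listStatus(new Path("/user/admin/data"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}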

Did you ever solve this? For reference, here is the core-site.xml from my cluster, followed by the output of jps on the node:
<?xml version="1.0" encoding="UTF-8"?>
<!--Autogenerated by Cloudera CM on 2013-07-12T10:43:15.666Z-->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://Master.cs236cloud.internal:8020</value>
  </property>
  <property>
    <name>fs.trash.interval</name>
    <value>1</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>65536</value>
  </property>
  <property>
    <name>io.compression.codecs</name>
       <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.DeflateCodec,org.apache.hadoop.io.compress.SnappyCodec,org.apache.hadoop.io.compress.Lz4Codec</value>
  </property>
  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>authentication</value>
  </property>
  <property>
    <name>hadoop.security.auth_to_local</name>
    <value>DEFAULT</value>
  </property>
</configuration>
3250 DataNode
4468 Main
2776 HeadlampServer
2541 RunJar
5496 Jps
4467 EventCatcherService
2502 QuorumPeerMain
2650 JobTracker
3082 RunJar
2597 HRegionServer
2594 TaskTracker
2629 HMaster
2520
3003 SecondaryNameNode
4553 Main
3414 Bootstrap
2549 AlertPublisher
3172 NameNode
2127 Main
4583 Main
3350 Bootstrap
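
Given the fs.defaultFS value in the core-site.xml above, one way to avoid hard-coding addresses in the client is to load the cluster's configuration files into the Properties passed to PigServer. Below is a minimal sketch, assuming local copies of the cluster's core-site.xml and mapred-site.xml; the /etc/hadoop/conf paths and the class name are only examples, not part of the original setup.

import java.util.Map;
import java.util.Properties;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class TestPigWithClusterConf {

    public static void main(String[] args) throws Exception {

        /** Load the *-site.xml files copied from the CDH4 node (paths are an example) */
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/mapred-site.xml"));

        /** Copy every configuration entry into the Properties consumed by PigServer */
        Properties props = new Properties();
        for (Map.Entry<String, String> entry : conf) {
            props.setProperty(entry.getKey(), entry.getValue());
        }

        PigServer pigServer = new PigServer(ExecType.MAPREDUCE, props);
        pigServer.registerScript("/user/admin/data/script.pig");
    }
}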