Java: Karaf OSGi camel-hdfs2

Tags: java, apache-camel, osgi, hdfs, karaf

I'm having trouble getting the camel-hdfs2 component to behave as expected in a Karaf 4.0 OSGi container. It's a very simple Camel route that reads files from HDFS and then simply writes the file names into a new file under /tmp.

Running just the main method (included below) works outside the Karaf OSGi container, but when I try to start it inside Karaf I get:

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
 at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
 at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2298)
 at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311)
 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)
 at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)
 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)
 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)
 at cas.example.camel_hdfs.LocalRouteBuilder.start(LocalRouteBuilder.java:83)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_51]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_51]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_51]
 at java.lang.reflect.Method.invoke(Method.java:497)[:1.8.0_51]
 at org.apache.felix.scr.impl.helper.BaseMethod.invokeMethod(BaseMethod.java:231)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod.access$500(BaseMethod.java:39)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod$Resolved.invoke(BaseMethod.java:624)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod.invoke(BaseMethod.java:508)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.ActivateMethod.invoke(ActivateMethod.java:149)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.createImplementationObject(SingleComponentManager.java:315)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.createComponent(SingleComponentManager.java:127)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.getService(SingleComponentManager.java:871)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.getServiceInternal(SingleComponentManager.java:838)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.AbstractComponentManager.activateInternal(AbstractComponentManager.java:850)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.AbstractComponentManager.enable(AbstractComponentManager.java:419)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.config.ConfigurableComponentHolder.enableComponents(ConfigurableComponentHolder.java:376)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.BundleComponentActivator.initialize(BundleComponentActivator.java:172)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.BundleComponentActivator.<init>(BundleComponentActivator.java:120)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator.loadComponents(Activator.java:258)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator.access$000(Activator.java:45)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator$ScrExtension.start(Activator.java:185)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.utils.extender.AbstractExtender.createExtension(AbstractExtender.java:259)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.utils.extender.AbstractExtender.modifiedBundle(AbstractExtender.java:232)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:479)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:414)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:232)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.bundleChanged(BundleTracker.java:443)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.framework.util.EventDispatcher.invokeBundleListenerCallback(EventDispatcher.java:913)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.util.EventDispatcher.fireEventImmediately(EventDispatcher.java:834)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:516)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4544)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.Felix.startBundle(Felix.java:2166)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:977)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1245)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1217)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startAllBundles(DirectoryWatcher.java:1207)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.doProcess(DirectoryWatcher.java:504)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:358)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:310)[4:org.apache.felix.fileinstall:3.5.0]
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
 at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1788)
 at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
... 46 more
Here is my RouteBuilder/Activator:

package cas.example.camel_hdfs;

import java.net.URI;

import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.osgi.framework.BundleContext;

import aQute.bnd.annotation.component.Activate;
import aQute.bnd.annotation.component.Component;
import aQute.bnd.annotation.component.Deactivate;

@Component
public class LocalRouteBuilder extends RouteBuilder {

    private final String hdfsHost;
    private final String path;
    private static final String MARKED_SUFFIX = "ingested";

    /**
     * If running in OSGI...
     */
    private CamelContext cContext = null;

    public LocalRouteBuilder() {
        this("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user");
    }

    /**
     * If you use this constructor, make sure the HADOOP_USER_NAME is set via a
     * jvm property.
     * 
     * @param hdfsHost
     * @param path
     */
    public LocalRouteBuilder(final String hdfsHost, final String path) {
        this(hdfsHost, path, null);
    }

    /**
     * 
     * @param hdfsHost
     * @param path
     * @param userName
     */
    public LocalRouteBuilder(final String hdfsHost, final String path, final String userName) {
        this.cContext = this.getContext();
        this.hdfsHost = hdfsHost;
        this.path = path;
        if (userName != null) {
            System.setProperty("HADOOP_USER_NAME", userName);
        }
    }

    /**
     * {@inheritDoc}
     */
    @Override
    public void configure() throws Exception {

        from("hdfs2://" + hdfsHost + path + "?delay=5000&chunkSize=4096&connectOnStartup=true&readSuffix=" + MARKED_SUFFIX)

        .setBody(simple(path + "/${header[CamelFileName]}." + MARKED_SUFFIX))

        .to("log:cas.example.camel_hdfs.BasicRouteBuilder")

        .to("file:/tmp/RECEIVED")

        .stop().end();

    }

    @Activate
    public void start(BundleContext context) throws Exception {
        Configuration conf = new Configuration();
        conf.setClass("fs.file.impl", LocalFileSystem.class, FileSystem.class);
        conf.setClass("fs.hdfs.impl", DistributedFileSystem.class, FileSystem.class);
        FileSystem.get(URI.create("file:///"), conf);
        FileSystem.get(URI.create("hdfs://10.10.1.20:9000/"), conf);

        if (cContext == null) {
            cContext = this.getContext();
        }
        // cContext = new OsgiDefaultCamelContext(context);
        cContext.addRoutes(this);
        cContext.start();
        cContext.startAllRoutes();
    }

    @Deactivate
    public void stop(BundleContext context) throws Exception {
        System.out.println("Stopping hdfs camel bundle");
        if (cContext != null) {
            cContext.stop();
            cContext = null;
        }
    }

    public static void main(String[] args) {
        try {
            Main m = new Main();
            m.addRouteBuilder(new LocalRouteBuilder("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user"));
            m.enableHangupSupport();
            m.enableTrace();
            m.run();
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(-1);
        }
    }

}
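For context on the ClassNotFoundException itself: Hadoop's `FileSystem.get(...)` resolves implementations such as `LocalFileSystem` via `META-INF/services` entries and the `Configuration`'s classloader, and in OSGi the thread-context classloader doing that lookup usually cannot see the hadoop-client bundle. A common workaround (a sketch, not from the question; the class and method names here are illustrative) is to run the lookup under a classloader that can see those classes, restoring the previous one afterwards:

```java
import java.util.concurrent.Callable;

// Minimal sketch of the thread-context-classloader (TCCL) swap often used to
// make Hadoop's ServiceLoader-based FileSystem lookup work inside OSGi.
public class TcclSwap {

    // Runs a task with the given thread-context classloader, restoring the
    // previous one even if the task throws.
    static <T> T withTccl(ClassLoader cl, Callable<T> task) throws Exception {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(cl);
        try {
            return task.call();
        } finally {
            current.setContextClassLoader(previous);
        }
    }

    public static void main(String[] args) throws Exception {
        ClassLoader before = Thread.currentThread().getContextClassLoader();
        withTccl(TcclSwap.class.getClassLoader(), () -> {
            // In the real activator, the FileSystem.get(...) calls would run
            // here, with Configuration.class.getClassLoader() as the loader.
            return null;
        });
        System.out.println("restored=" + (Thread.currentThread().getContextClassLoader() == before));
    }
}
```

In the activator above this would wrap the two `FileSystem.get(...)` calls. Hadoop's `Configuration` also exposes `setClassLoader(ClassLoader)`, which can achieve the same effect without touching the thread.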
Just in case, here's the bundle list:

karaf@root()> list
START LEVEL 100 , List Threshold: 50
 ID | State    | Lvl | Version            | Name
----------------------------------------------------------------------------------------------
 58 | Active   |  80 | 0.0.1.SNAPSHOT     | karaf-feature-export
 59 | Active   |  80 | 0.0.1.SNAPSHOT     | cas-camel-hdfs
 60 | Active   |  80 | 2.4.0.201411031534 | bndlib
 61 | Active   |  80 | 2.15.2             | camel-blueprint
 62 | Active   |  80 | 2.15.2             | camel-catalog
 63 | Active   |  80 | 2.15.2             | camel-commands-core
 64 | Active   |  80 | 2.15.2             | camel-core
 65 | Active   |  80 | 2.15.2             | camel-spring
 66 | Active   |  80 | 2.15.2             | camel-karaf-commands
 67 | Active   |  80 | 1.1.1              | geronimo-jta_1.1_spec
 72 | Active   |  80 | 2.2.6.1            | Apache ServiceMix :: Bundles :: jaxb-impl
 84 | Active   |  80 | 3.1.4              | Stax2 API
 85 | Active   |  80 | 4.4.1              | Woodstox XML-processor
 86 | Active   |  80 | 2.15.2             | camel-core-osgi
 87 | Active   |  80 | 18.0.0             | Guava: Google Core Libraries for Java
 88 | Active   |  80 | 2.6.1              | Protocol Buffer Java API
 89 | Active   |  80 | 1.9.12             | Jackson JSON processor
 90 | Active   |  80 | 1.9.12             | Data mapper for Jackson JSON processor
 91 | Active   |  80 | 2.15.2             | camel-hdfs2
 92 | Active   |  80 | 1.2                | Commons CLI
 93 | Active   |  80 | 1.10.0             | Apache Commons Codec
 94 | Active   |  80 | 3.2.1              | Commons Collections
 95 | Active   |  80 | 1.5.0              | Commons Compress
 96 | Active   |  80 | 1.9.0              | Commons Configuration
 97 | Active   |  80 | 2.4.0              | Commons IO
 98 | Active   |  80 | 2.6                | Commons Lang
 99 | Active   |  80 | 3.3.0              | Apache Commons Math
100 | Active   |  80 | 3.3.0              | Commons Net
101 | Active   |  80 | 3.4.6              | ZooKeeper Bundle
102 | Active   |  80 | 1.7.7.1            | Apache ServiceMix :: Bundles :: avro
103 | Active   |  80 | 3.1.0.7            | Apache ServiceMix :: Bundles :: commons-httpclient
104 | Active   |  80 | 3.0.0.1            | Apache ServiceMix :: Bundles :: guice
105 | Active   |  80 | 2.3.0.2            | Apache ServiceMix :: Bundles :: hadoop-client
106 | Active   |  80 | 0.1.51.1           | Apache ServiceMix :: Bundles :: jsch
107 | Active   |  80 | 2.6.0.1            | Apache ServiceMix :: Bundles :: paranamer
108 | Active   |  80 | 0.52.0.1           | Apache ServiceMix :: Bundles :: xmlenc
109 | Active   |  80 | 1.2.0.5            | Apache ServiceMix :: Bundles :: xmlresolver
110 | Active   |  80 | 3.9.6.Final        | Netty
111 | Resolved |  80 | 1.1.0.1            | Snappy for Java
karaf@root()>
Thanks for the help.

-Ben

Edit:

So I added the package headers for my custom bundle (I did a karaf clean, so the bundle id changed from 39 to 109).

I'm still not sure why it can't find the LocalFileSystem class, because it is definitely exported from:

102 | Active    |  80 | 2.3.0.2            | Apache ServiceMix :: Bundles :: hadoop-client
That's the hadoop bundle installed as part of the camel-hdfs2 feature.

Edit 2:

Hmm, I actually don't know why bundle:classes showed me all those classes. I just opened up my jar, and here's what I see:

" zip.vim version v27
" Browsing zipfile /opt/apache-karaf-4.0.1/deploy/cas-camel-hdfs-0.0.1-SNAPSHOT.jar
" Select a file with cursor and press ENTER

META-INF/MANIFEST.MF
META-INF/
META-INF/maven/
META-INF/maven/com.inovexcorp.cas/
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.properties
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.xml
META-INF/services/
META-INF/services/org.apache.hadoop.fs.FileSystem
OSGI-INF/
OSGI-INF/cas.example.camel_hdfs.Hdfs2RouteBuilder.xml
OSGI-INF/cas.example.camel_hdfs.SimpleRouteBuilder.xml
cas/
cas/example/
cas/example/camel_hdfs/
cas/example/camel_hdfs/Hdfs2RouteBuilder.class
cas/example/camel_hdfs/SimpleRouteBuilder.class
core-default.xml
hdfs-default.xml
log4j.xml
The classes listed in Karaf don't seem to match the actual classes in the JAR (but it must be seeing what classes my bundle references?). Here's my POM, just in case:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.inovexcorp.cas</groupId>
    <artifactId>cas-camel-hdfs</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>Example Camel-HDFS Integration</name>
    <packaging>bundle</packaging>

    <properties>
        <project.build.sourceEncoding>UTF8</project.build.sourceEncoding>
        <camel.version>2.15.2</camel.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-core</artifactId>
            <version>${camel.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-hdfs2</artifactId>
            <version>${camel.version}</version>
            <scope>provided</scope>
        </dependency><!-- <dependency> <groupId>org.apache.camel</groupId> <artifactId>camel-core-osgi</artifactId> 
            <version>${camel.version}</version> </dependency> -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.ops4j.pax.logging</groupId>
            <artifactId>pax-logging-api</artifactId>
            <version>1.7.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>biz.aQute.bnd</groupId>
            <artifactId>bndlib</artifactId>
            <version>2.3.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <!-- http://maven.apache.org/plugins/maven-compiler-plugin/ -->
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <extensions>true</extensions>
                <configuration>
                    <instructions>
                        <Bundle-SymbolicName>${project.groupId}_${project.artifactId}</Bundle-SymbolicName>
                        <Bundle-Name>${project.artifactId}</Bundle-Name>
                        <Bundle-Version>${project.version}</Bundle-Version>
                        <Import-Package>org.apache.camel.component.hdfs2,*;resolution:=required</Import-Package>
                        <Service-Component>*</Service-Component>
                    </instructions>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>
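For reference, maven-bundle-plugin derives Import-Package from bytecode references; when the hadoop packages must be wired from the container rather than embedded, spelling them out in the instruction makes that intent explicit. A sketch (the hadoop package list mirrors the imports the class above uses and is otherwise an assumption):

```xml
<Import-Package>
    org.apache.hadoop.conf,
    org.apache.hadoop.fs,
    org.apache.hadoop.hdfs,
    org.apache.camel.component.hdfs2,
    *;resolution:=required
</Import-Package>
```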

Looks like a typical Import-Package/Export-Package problem. Does bundle 59, which apparently contains the class in question, actually export it?

Export-Package: org.apache.hadoop.fs
Update: Apparently you are importing the right package, but according to your manifest:

59 | Active |  80 | 0.0.1.SNAPSHOT | cas-camel-hdfs
karaf@root()> bundle:classes 59 | grep LocalFileSystem
 org/apache/hadoop/fs/LocalFileSystem.class
 org/apache/hadoop/fs/LocalFileSystemConfigKeys.class
 org/apache/hadoop/fs/RawLocalFileSystem$1.class
 org/apache/hadoop/fs/RawLocalFileSystem$DeprecatedRawLocalFileStatus.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileInputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileOutputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem.class
Your own bundle contains these classes as well, so make sure you are not packaging them along.
Your dependency on org.apache.servicemix.bundles.hadoop-client is most likely marked as a compile dependency in the pom.
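If the hadoop classes are riding in via a compile-scoped (possibly transitive) dependency, one option is to declare the ServiceMix bundle explicitly with provided scope so it is resolved from the container at runtime instead of being embedded. A sketch; the Maven version string is guessed from the 2.3.0.2 bundle version in the list above:

```xml
<dependency>
    <groupId>org.apache.servicemix.bundles</groupId>
    <artifactId>org.apache.servicemix.bundles.hadoop-client</artifactId>
    <version>2.3.0_2</version>
    <scope>provided</scope>
</dependency>
```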

So, 59 is actually the bundle doing the importing; I'll update my question to make that clearer. My bundle leverages the camel-hdfs2 feature's bundles and imports what I believe is necessary. For bundle 102:

karaf@root()> ... org.apache.servicemix.bundles.hadoop-client [102] provides: ... osgi.wiring.package; org.apache.hadoop.fs 2.3.0, required by: com.inovexcorp.cas_cas-camel-hdfs [109], org.apache.camel.camel-hdfs2 [88] ...

I actually don't know why karaf's bundle:classes shows those classes (unless it is actually looking at every class my bundle references). I've updated the question with the POM and a view (via vim) of what's in the JAR. Thanks -Ben