Undefined symbols for architecture x86_64: "_fcloseall"


I am compiling the hadoop-yarn-server-nodemanager module.

Build environment:
macOS 10.14
Java 1.7.0_80
CMake 3.13.0-rc3 with clang-1000.10.44.4
Maven 3.6.0
protobuf 2.5.0

I am trying to install Hadoop-2.2.0 on macOS, but as its documentation states:

The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.

So I have to recompile Hadoop from source. In the downloaded
hadoop-2.2.0-src
folder, run
mvn package -Pdist,native -DskipTests -Dtar
; after a few minutes of compilation, the new native libraries should be located in
hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native
. However, I kept getting error messages. Some I have already fixed by modifying the source code, but now I am stuck compiling hadoop-yarn-server-nodemanager.

Here is the error message:

 [exec] [ 57%] Linking C executable target/usr/local/bin/test-container-executor
 [exec] /Applications/CMake.app/Contents/bin/cmake -E cmake_link_script CMakeFiles/test-container-executor.dir/link.txt --verbose=1
 [exec] /Library/Developer/CommandLineTools/usr/bin/cc  -g -Wall -O2 -D_GNU_SOURCE -D_REENTRANT -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk -Wl,-search_paths_first -Wl,-headerpad_max_install_names  CMakeFiles/test-container-executor.dir/main/native/container-executor/test/test-container-executor.c.o  -o target/usr/local/bin/test-container-executor libcontainer.a 
 [exec] Undefined symbols for architecture x86_64:
 [exec]   "_fcloseall", referenced from:
 [exec]       _launch_container_as_user in libcontainer.a(container-executor.c.o)
 [exec] ld: symbol(s) not found for architecture x86_64
 [exec] clang: error: linker command failed with exit code 1 (use -v to see invocation)
 [exec] make[2]: *** [target/usr/local/bin/test-container-executor] Error 1
 [exec] make[1]: *** [CMakeFiles/test-container-executor.dir/all] Error 2
 [exec] make: *** [all] Error 2
I tried switching CMake's compiler from
clang
to
gcc
, but it did not help.
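For reference, switching the compiler in CMake only takes effect on a clean build tree, otherwise the cached compiler is silently reused. A minimal sketch of how that switch is usually done (the gcc path is an assumption about a Homebrew-style install, not something from this build):

```shell
# Sketch: point CMake at a different C compiler. The cache must be wiped
# first or CMAKE_C_COMPILER is ignored; the gcc-8 path is an assumption.
rm -rf CMakeFiles CMakeCache.txt
cmake -DCMAKE_C_COMPILER=/usr/local/bin/gcc-8 .
```

Note that in this particular case no compiler switch can help: as explained in the answer below is irrelevant here, the symbol _fcloseall is simply absent from the macOS C library, so the link step fails no matter which compiler driver invokes the linker.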

Regarding the error message, I found the following relevant pieces of code:

hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/CMakeLists.txt

add_executable(test-container-executor
    main/native/container-executor/test/test-container-executor.c
)
target_link_libraries(test-container-executor
    container
)
output_directory(test-container-executor target/usr/local/bin)
hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/configuration.c

int launch_container_as_user(const char *user, const char *app_id, 
               const char *container_id, const char *work_dir,
               const char *script_name, const char *cred_file,
               const char* pid_file, char* const* local_dirs,
               char* const* log_dirs, const char *resources_key,
               char* const* resources_values) {...}
hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native/CMakeFiles/test-container-executor.dir/link.txt

/Library/Developer/CommandLineTools/usr/bin/cc  -g -Wall -O2 -D_GNU_SOURCE -D_REENTRANT -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk -Wl,-search_paths_first -Wl,-headerpad_max_install_names  CMakeFiles/test-container-executor.dir/main/native/container-executor/test/test-container-executor.c.o  -o target/usr/local/bin/test-container-executor libcontainer.a 
As for the archive
hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/native/libcontainer.a
, after unpacking it I found
container-executor.c.o
, but could not open it because of encoding problems.

Also, an earlier compile of this project failed with:

 [exec] /Users/markdana/Downloads/hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c:1252:48: error: too many arguments to function call, expected 4, have 5
 [exec]     if (mount("none", mount_path, "cgroup", 0, controller) == 0) {
 [exec]         ~~~~~                                  ^~~~~~~~~~
 [exec] /Library/Developer/CommandLineTools/SDKs/MacOSX10.14.sdk/usr/include/sys/mount.h:399:1: note: 'mount' declared here
 [exec] int     mount(const char *, const char *, int, void *);
To fix it, I temporarily changed the declaration of the function
mount()
in
mount.h
to:

int mount(const char *, const char *, const char *, int, const char *);
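Instead of editing the SDK header, the call site can be guarded so that the Linux five-argument mount(2) prototype is never seen on macOS. A hedged sketch of that approach; the wrapper name and the choice of ENOTSUP are assumptions for illustration, not Hadoop's code:

```c
/* Sketch: isolate the cgroup mount behind a wrapper so only the Linux
 * build ever calls the five-argument mount(2). Wrapper name and the
 * macOS error value are hypothetical. */
#include <errno.h>

#ifdef __APPLE__
/* cgroups do not exist on macOS, so report the operation as unsupported
 * instead of fighting the four-argument mount() in <sys/mount.h>. */
static int cgroup_mount(const char *mount_path, const char *controller) {
  (void)mount_path;
  (void)controller;
  errno = ENOTSUP;
  return -1;
}
#else
#include <sys/mount.h>
/* Linux: the real call, matching the snippet from the error log above. */
static int cgroup_mount(const char *mount_path, const char *controller) {
  return mount("none", mount_path, "cgroup", 0, controller);
}
#endif
```

This keeps the system headers untouched, so the fix survives SDK updates and does not affect other code that includes mount.h.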
A bit silly, I know, but at least it worked. Then I ran into the new problem described above. I wonder whether the two are related, or whether this is some bug in the linked library.


I have been debugging for a whole day and have no idea what to try next. I would really appreciate it if you could point out the key issue, or share similar experience with CMake linking problems.

It seems the function
fcloseall
does not exist on OS X. From the documentation:

fcloseall

This function is an extension of
fclose
. Although OS X supports
fclose
, it does not support
fcloseall
. You can implement
fcloseall
in terms of
fclose
by storing the file pointers in an array and iterating over it.


You need to rework the application to keep track of every file that was supposed to be closed by
fcloseall
. Then you can call a plain
fclose
on each such file, as described in the quote above.
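The store-the-pointers-and-iterate idea can be sketched as follows; all names here (track_file, close_all_tracked, MAX_TRACKED) are hypothetical helpers for illustration, not anything from Hadoop or libc:

```c
/* Minimal sketch of a portable fcloseall() replacement: every stream
 * that fcloseall() would have closed is registered in a small table,
 * then closed in one pass. */
#include <assert.h>
#include <stddef.h>
#include <stdio.h>

#define MAX_TRACKED 64

static FILE *tracked[MAX_TRACKED];
static size_t tracked_count = 0;

/* Remember a stream so it can be closed later; returns fp unchanged,
 * so it can wrap an fopen() call in place. */
static FILE *track_file(FILE *fp) {
  if (fp != NULL && tracked_count < MAX_TRACKED) {
    tracked[tracked_count++] = fp;
  }
  return fp;
}

/* Stand-in for fcloseall(): closes every tracked stream and returns
 * how many were closed successfully. */
static int close_all_tracked(void) {
  int closed = 0;
  for (size_t i = 0; i < tracked_count; i++) {
    if (fclose(tracked[i]) == 0) {
      closed++;
    }
  }
  tracked_count = 0;
  return closed;
}
```

Usage would be `FILE *f = track_file(fopen(path, "r"));` at every open site, with one `close_all_tracked();` where `fcloseall()` used to be.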

Thank you so much, I have built it successfully. In the file
container-executor.c
I searched for the
open
and
close
calls, looked each one up in the Linux C documentation, and checked that they paired up. Since there is no
fopen
before the
fcloseall()
, only
fclose(stdin)
,
fclose(stdout)
and
fclose(stderr)
, I simply removed the
fcloseall()
and it worked! Thanks again!