C++ — Compiling the Hadoop native libraries on macOS Catalina. Error: unknown type name 'constexpr'. Alias declarations are a C++11 extension


I'm trying to compile the Hadoop native libraries on Unix. I'm using:

  • Hadoop: branch-3.2 ()
  • macOS: Catalina, version 10.15.5
  • Protobuf: libprotoc 2.5.0
  • OpenSSL 1.0.2t, 10 Sep 2019
  • gcc: Apple clang version 11.0.3 (clang-1103.0.32.62); Target: x86_64-apple-darwin19.5.0; Thread model: posix; InstalledDir: /Library/Developer/CommandLineTools/usr/bin
  • java version "1.8.0_251" (build 1.8.0_251-b08)

I followed all the steps here: . After fixing a few issues with openssl and protobuf, the build now fails at Apache Hadoop MapReduce NativeTask, apparently because it depends on C++11.

Do I need to specify the CXXFLAGS parameter somehow so that it is set to -std=c++11? How would I do that? Or is my gcc missing some configuration?
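To clarify what I mean by setting CXXFLAGS, I imagine something along these lines (a sketch only; I don't know whether the Maven native profile actually forwards the variable, and the mvn invocation is illustrative):

```shell
# Sketch: CMake reads CXXFLAGS from the environment at configure time, so
# exporting the flag before the native build should reach the compiler.
export CXXFLAGS="-std=c++11"
echo "CXXFLAGS=$CXXFLAGS"
# mvn package -Pdist,native -DskipTests   # illustrative: run from the Hadoop source root
```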

I tried setting up an alias for gcc so that it is always invoked with that flag, as described here: , but it still doesn't work.

So far, this is one of the many similar errors I'm getting:

    [WARNING] /usr/local/include/snappy-stubs-public.h:61:16: warning: alias declarations are a C++11 extension [-Wc++11-extensions]
    
    [WARNING] using uint64 = std::uint64_t;
    
    [WARNING] In file included from /Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestCompressions.cc:29:
    
    [WARNING] /usr/local/include/snappy.h:197:10: error: unknown type name 'constexpr'
    
    [WARNING]   static constexpr int kBlockLog = 16;
    
    [INFO] Apache Hadoop MapReduce NativeTask ................. FAILURE [  1.995 s]
    
After reading the output more carefully, I think this is the command that ultimately throws the error:

    [WARNING] /Library/Developer/CommandLineTools/usr/bin/make  -f CMakeFiles/nttest.dir/build.make CMakeFiles/nttest.dir/build
    [WARNING] [ 75%] Building CXX object CMakeFiles/nttest.dir/main/native/test/TestCompressions.cc.o
    [WARNING] /Library/Developer/CommandLineTools/usr/bin/c++   -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/target/native/javah -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/util -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src -I/Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/target/native -I/Library/Java/JavaVirtualMachines/jdk1.8.0_251.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.8.0_251.jdk/Contents/Home/include/darwin -I/usr/local/include -isystem /Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/../../../../hadoop-common-project/hadoop-common/src/main/native/gtest/include  -g -O2 -Wall -pthread -D_FILE_OFFSET_BITS=64 -DNDEBUG -DSIMPLE_MEMCPY -fno-strict-aliasing -fsigned-char -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX10.15.sdk   -o CMakeFiles/nttest.dir/main/native/test/TestCompressions.cc.o -c /Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestCompressions.cc
    [WARNING] In file included from /Users/josh/Dev/hadoop/repo/hadoop/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestCompressions.cc:29:
    [WARNING] In file included from /usr/local/include/snappy.h:45:
    [WARNING] /usr/local/include/snappy-stubs-public.h:54:14: warning: alias declarations are a C++11 extension [-Wc++11-extensions]
    [WARNING] using int8 = std::int8_t;
    

Thanks, everyone!

I also had a hard time building the hadoop native libraries on OS X with clang. I had more success with gcc 10.

First, install it with Homebrew:

    brew install gcc
    
This should put the gcc binaries on your PATH, but they will be named gcc-10, g++-10, and so on. So a little environment magic should nudge CMake into using these instead of the built-in binaries:

    export CC=$(which gcc-10)
    export CXX=$(which g++-10)
    export CPP=$(which cpp-10)
    export LD=$(which gcc-10)
    


Hey @growse! Thanks a lot for your answer! I added those variables, and they look like this: echo $CC → /usr/local/bin/gcc-10, echo $CXX → /usr/local/bin/g++-10, echo $CPP → /usr/local/bin/cpp-10, echo $LD → /usr/local/bin/gcc-10. So those binaries are probably being used, but should I install version 11 somehow? The error is still the same; it says these are C++11 extensions :( I also see this right before the error, which makes me think the CXX env variable is being ignored: [WARNING] [ 75%] Building CXX object CMakeFiles/nttest.dir/main/native/test/TestCompressions.cc.o [WARNING] /Library/Developer/CommandLineTools/usr/bin/c++. Is there a way to add the -std=c++11 flag when compiling hadoop?

Did you run mvn clean and git clean before rebuilding with the new environment variables? I worry that CMake has "baked" your previous environment into the generated makefiles.

Sadly, after cleaning I ran into even more problems. I found this in the official hadoop documentation (recently updated): "The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform." I guess we're stuck until it is supported.
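The concern about CMake caching the previous environment can be sketched like this (a throwaway directory stands in for the Hadoop checkout, and the paths are illustrative): the compiler is recorded in CMakeCache.txt at configure time, so stale caches have to go before new CC/CXX values take effect.

```shell
# Sketch: CMake records the compiler in CMakeCache.txt, so changing
# CC/CXX does nothing until cached configurations are removed.
demo="$(mktemp -d)"                         # stand-in for the Hadoop checkout
mkdir -p "$demo/target/native"
touch "$demo/target/native/CMakeCache.txt"  # stale cache from a previous configure
find "$demo" -name CMakeCache.txt -delete   # drop it so CC/CXX are re-read
# mvn clean && git clean -xdf               # illustrative: the real cleanup, from the source root
ls "$demo/target/native"                    # the cache file is gone
```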