Apache Spark: IPython setup with Spark 2.2 on CDH 5.12


I have a cluster running Spark 2.2 on CDH 5.12 with RHEL, and I am trying to set up IPython for use with pyspark2. I have installed IPython 5.x LTS (long-term support), but I cannot get it to work.

What I have done so far:

yum -y update
yum install epel-release
yum -y install python-pip
yum groupinstall 'Development Tools'
yum install python-devel

pip install IPython==5.0 --user
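
Since pip install --user drops the ipython entry point into ~/.local/bin, which is often not on PATH, it is worth checking that the shell can actually resolve it (a quick sanity check, assuming the default --user install location):

# Verify the interpreter is reachable before wiring it into pyspark.
which ipython || echo "ipython not on PATH"
ls ~/.local/bin/ipython    # default pip --user script location (assumed)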

But I can't get it to work. Does anyone know what I am missing?

The pyspark launch script looks for:

# Determine the Python executable to use for the driver:
if [[ -n "$IPYTHON_OPTS" || "$IPYTHON" == "1" ]]; then
  # If IPython options are specified, assume user wants to run IPython
  # (for backwards-compatibility)
  PYSPARK_DRIVER_PYTHON_OPTS="$PYSPARK_DRIVER_PYTHON_OPTS $IPYTHON_OPTS"
  PYSPARK_DRIVER_PYTHON="ipython"
elif [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
  PYSPARK_DRIVER_PYTHON="${PYSPARK_PYTHON:-"$DEFAULT_PYTHON"}"
fi
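
Given that fallback chain, a quick way to test the wiring before touching ~/.bashrc is to set the driver variable inline for a single launch (a minimal sketch; pyspark2 is the launcher name from the question):

# One-off test: select IPython as the driver for this launch only.
PYSPARK_DRIVER_PYTHON=ipython pyspark2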
Set the following variables in ~/.bashrc:

echo "export PATH=$PATH:/path_to_downloaded_spark/spark-1.6.0/bin" >> ~/.bashrc
echo "export PYSPARK_DRIVER_PYTHON=ipython" >> ~/.bashrc
echo "export PYSPARK_DRIVER_PYTHON_OPTS='notebook'" >> ~/.bashrc

It might be really close, but I still get this error: env: ipython: No such file or directory
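
The env: ipython: No such file or directory message suggests the launcher cannot find ipython on its PATH. pip install --user places scripts in ~/.local/bin, which login shells often do not search by default, so a likely fix (a sketch, assuming that default --user location) is to add that directory to PATH, or to point PYSPARK_DRIVER_PYTHON at the absolute path:

# Make the --user install location visible to the launcher ...
echo 'export PATH=$PATH:$HOME/.local/bin' >> ~/.bashrc

# ... or bypass the PATH lookup entirely with an absolute path:
echo 'export PYSPARK_DRIVER_PYTHON=$HOME/.local/bin/ipython' >> ~/.bashrc

source ~/.bashrc

After re-sourcing, which ipython should resolve and the launcher's env lookup should succeed.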