Apache Spark: PySpark exception when using IPython


I have installed PySpark and the IPython notebook on Ubuntu 12.04.

After the installation, running "ipython --profile=pyspark" throws the following exception:

ubuntu_user@ubuntu_user-VirtualBox:~$ ipython --profile=pyspark  
Python 2.7.3 (default, Jun 22 2015, 19:33:41) 
Type "copyright", "credits" or "license" for more information.

IPython 0.12.1 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

IPython profile: pyspark
Error: Must specify a primary resource (JAR or Python or R file)
Run with --help for usage help or --verbose for debug output
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
/usr/lib/python2.7/dist-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    173             else:
    174                 filename = fname
--> 175             __builtin__.execfile(filename, *where)

/home/ubuntu_user/.config/ipython/profile_pyspark/startup/00-pyspark-setup.py in <module>()
      6 sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))
      7 
----> 8 execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
      9 

/home/ubuntu_user/spark/python/pyspark/shell.py in <module>()
     41     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
     42 
---> 43 sc = SparkContext(pyFiles=add_files)
     44 atexit.register(lambda: sc.stop())
     45 

/home/ubuntu_user/spark/python/pyspark/context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    108         """
    109         self._callsite = first_spark_call() or CallSite(None, None, None)
--> 110         SparkContext._ensure_initialized(self, gateway=gateway)
    111         try:
    112             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

/home/ubuntu_user/spark/python/pyspark/context.pyc in _ensure_initialized(cls, instance, gateway)
    232         with SparkContext._lock:
    233             if not SparkContext._gateway:
--> 234                 SparkContext._gateway = gateway or launch_gateway()
    235                 SparkContext._jvm = SparkContext._gateway.jvm
    236 

/home/ubuntu_user/spark/python/pyspark/java_gateway.pyc in launch_gateway()
     92                 callback_socket.close()
     93         if gateway_port is None:
---> 94             raise Exception("Java gateway process exited before sending the driver its port number")
     95 
     96         # In Windows, ensure the Java child processes do not linger after Python has exited.


Exception: Java gateway process exited before sending the driver its port number
Below is the IPython setup:

ubuntu_user@ubuntu_user-VirtualBox:~$ ls .config/ipython/profile_pyspark/
db              ipython_config.py           log  security
history.sqlite  ipython_notebook_config.py  pid  startup
IPython and Spark (PySpark) configuration:

ubuntu_user@ubuntu_user-VirtualBox:~$ vi .config/ipython/profile_pyspark/ipython_notebook_config.py

# Configuration file for ipython-notebook.

c = get_config()

# IPython PySpark
c.NotebookApp.ip = 'localhost'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 7770


ubuntu_user@ubuntu_user-VirtualBox:~$ vi .config/ipython/profile_pyspark/startup/00-pyspark-setup.py
import os
import sys

spark_home = os.environ.get('SPARK_HOME', None)
sys.path.insert(0, spark_home + "/python")
sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))

execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
The following environment variables are set in .bashrc or .bash_profile:

ubuntu_user@ubuntu_user-VirtualBox:~$ vi .bashrc 
export SPARK_HOME="/home/ubuntu_user/spark"
export PYSPARK_SUBMIT_ARGS="--master local[2]"
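One thing that stands out in the traceback above is the line "Error: Must specify a primary resource (JAR or Python or R file)". On Spark 1.4 and later (the bundled py4j-0.8.2.1 suggests such a version), PYSPARK_SUBMIT_ARGS is handed to spark-submit, which then demands a primary resource; appending the literal token pyspark-shell is the usual way to satisfy it. A minimal sketch under that assumption, plus a standalone sanity check:

# Sketch, assuming Spark 1.4+: end PYSPARK_SUBMIT_ARGS with "pyspark-shell"
# so spark-submit has a primary resource when launched from Python.
export SPARK_HOME="/home/ubuntu_user/spark"
export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"

# Sanity check outside IPython: this should bring up a working Spark shell.
$SPARK_HOME/bin/pyspark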

I am new to Apache Spark and IPython. How can I solve this issue?

I hit the same exception when my virtual machine did not have enough memory for Java. So I allocated more memory to my VM, and the exception went away.

Steps: shut down the VM -> VirtualBox Settings -> "System" tab -> set the memory.


(This may only be a workaround, though. I suppose the proper way to fix this exception is to configure Spark correctly with respect to Java memory.)
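If memory is indeed the culprit, a less blunt option than resizing the VM might be to cap the driver JVM explicitly via spark-submit's --driver-memory flag. A sketch; the 512m value is an arbitrary example, not a recommendation:

# Sketch: limit the driver JVM so it fits inside the VM's available RAM.
export PYSPARK_SUBMIT_ARGS="--master local[2] --driver-memory 512m pyspark-shell"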

This may be an error in locating the pyspark shell bundled with Spark. Export the following paths:

export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
This works for Spark 1.6.1. If you have a different version, try locating the .zip file and adding its path instead.
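To avoid pinning the py4j version by hand (0.8.2.1 in the question's startup script, 0.9 here), the zip can be discovered with a shell glob. A small sketch for .bashrc, assuming a single py4j zip under $SPARK_HOME/python/lib:

# Sketch: add Spark's Python libs to PYTHONPATH without hard-coding
# the py4j version, which changes between Spark releases.
export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
export PYTHONPATH="$(ls $SPARK_HOME/python/lib/py4j-*-src.zip):$PYTHONPATH"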

Two thoughts: Where is your JDK? I don't see a JAVA_HOME parameter configured in any of your files. Considering:

Error: Must specify a primary resource (JAR or Python or R file)
Second, make sure your port 7770 is open and available to your JVM.
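A sketch of what both checks could look like. The OpenJDK 7 path below is an assumption about a typical Ubuntu 12.04 install; adjust it to where your JDK actually lives:

# Assumed JDK location; verify yours with: readlink -f $(which java)
export JAVA_HOME="/usr/lib/jvm/java-7-openjdk-amd64"
export PATH="$JAVA_HOME/bin:$PATH"

# Check whether anything is already bound to port 7770.
netstat -tln | grep 7770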
