Python 3.x: NameError: name 'spark' is not defined

I am running the code below and getting the error `NameError: name 'spark' is not defined`. I have already installed the package but am still getting the error. Please help me resolve it.
df = spark.createDataFrame([
    (1, 144.5, 5.9, 33, 'M'),
    (2, 167.2, 5.4, 45, 'M'),
    (3, 124.1, 5.2, 23, 'F'),
    (4, 144.5, 5.9, 33, 'M'),
    (5, 133.2, 5.7, 54, 'F'),
    (3, 124.1, 5.2, 23, 'F'),
    (5, 129.2, 5.3, 42, 'M'),
], ['id', 'weight', 'height', 'age', 'gender'])
Try this:

from pyspark.sql.session import SparkSession
spark = SparkSession.builder.getOrCreate()

The interactive `pyspark` shell creates a `SparkSession` named `spark` for you automatically, but in a standalone script or an IDE like Spyder you have to create it yourself before calling `spark.createDataFrame`.