
Spark SQL with Scala: deprecation warning for registerTempTable

Tags: scala, apache-spark, apache-spark-sql

I get the following warning when trying to create a temporary table. How can I resolve it?

scala> df.registerTempTable("df")
warning: there was one deprecation warning; re-run with -deprecation for details


The registerTempTable method is deprecated as of Spark 2.0.

createOrReplaceTempView is the supported replacement.
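As a minimal sketch of the migration (assuming an existing SparkSession named spark and a DataFrame df, as in the question), the change is a one-line rename; the view name "people" here is just an illustrative placeholder:

```scala
// Before (deprecated since Spark 2.0):
// df.registerTempTable("people")

// After: register the DataFrame as a temporary view instead
df.createOrReplaceTempView("people")

// The view is then queryable with plain SQL through the session
val result = spark.sql("SELECT * FROM people WHERE age > 30")
result.show()
```

Unlike the old registerTempTable, createOrReplaceTempView makes the replace-if-exists semantics explicit in the name, so re-running the same cell or job does not fail on an already-registered view.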


The Spark source documents this deprecation with the message:

Use createOrReplaceTempView(viewName) instead.

Example usage of createOrReplaceTempView, demonstrated by joining two sample datasets:

package com.examples

import com.droolsplay.util.SparkSessionSingleton
import org.apache.log4j.{Level, Logger}
import org.apache.spark.internal.Logging
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

/**
  * Join Example and some basics demonstration using sample data.
  *
  * @author : Ram Ghadiyaram
  */
object JoinExamplesv2 extends Logging {
  // switch off unnecessary logs
  Logger.getLogger("org").setLevel(Level.OFF)
  Logger.getLogger("akka").setLevel(Level.OFF)
  //  val spark: SparkSession = SparkSession.builder.config("spark.master", "local").getOrCreate;
  val spark: SparkSession = SparkSessionSingleton.getInstance(Option(this.getClass.getName))

  /**
    * main
    *
    * @param args Array[String]
    */
  def main(args: Array[String]): Unit = {
    import spark.implicits._
    /**
      * create 2 dataframes here using case classes one is Person df1 and another one is profile df2
      */
    val df1 = spark.sqlContext.createDataFrame(
      spark.sparkContext.parallelize(
        Person("Sarath", 33, 2)
          :: Person("Vasudha Nanduri", 30, 2)
          :: Person("Ravikumar Ramasamy", 34, 5)
          :: Person("Ram Ghadiyaram", 42, 9)
          :: Person("Ravi chandra Kancharla", 43, 9)
          :: Nil))


    val df2 = spark.sqlContext.createDataFrame(
      Profile("Spark", 2, "SparkSQLMaster")
        :: Profile("Spark", 5, "SparkGuru")
        :: Profile("Spark", 9, "DevHunter")
        :: Nil
    )

    // alias the dataframes so columns can be referenced by readable prefixes

    val df_asPerson = df1.as("dfperson")
    val df_asProfile = df2.as("dfprofile")
    /** *
      * Example displays how to join them in the dataframe level
      * next example demonstrates using sql with createOrReplaceTempView
      */
    val joined_df = df_asPerson.join(
      df_asProfile
      , col("dfperson.personid") === col("dfprofile.personid")
      , "inner")
    joined_df.select(
      col("dfperson.name")
      , col("dfperson.age")
      , col("dfprofile.name")
      , col("dfprofile.profileDescription"))
      .show

    /// example using sql statement after registering createOrReplaceTempView

    df_asPerson.createOrReplaceTempView("dfperson")
    df_asProfile.createOrReplaceTempView("dfprofile")
    // this is example of plain sql
    val dfJoin = spark.sqlContext.sql(
      """SELECT dfperson.name, dfperson.age, dfprofile.profileDescription
                          FROM  dfperson JOIN  dfprofile
                          ON dfperson.personid = dfprofile.personid""")
    logInfo("Example using sql statement after registering createOrReplaceTempView ")
    dfJoin.show(false)

  }

  // models here

  case class Person(name: String, age: Int, personid: Int)

  case class Profile(name: String, personId: Int, profileDescription: String)

}
Result:

+--------------------+---+-----+------------------+
|                name|age| name|profileDescription|
+--------------------+---+-----+------------------+
|              Sarath| 33|Spark|    SparkSQLMaster|
|     Vasudha Nanduri| 30|Spark|    SparkSQLMaster|
|  Ravikumar Ramasamy| 34|Spark|         SparkGuru|
|      Ram Ghadiyaram| 42|Spark|         DevHunter|
|Ravi chandra Kanc...| 43|Spark|         DevHunter|
+--------------------+---+-----+------------------+

18/11/12 23:03:38 INFO JoinExamplesv2: Example using sql statement after registering createOrReplaceTempView 
+----------------------+---+------------------+
|name                  |age|profileDescription|
+----------------------+---+------------------+
|Sarath                |33 |SparkSQLMaster    |
|Vasudha Nanduri       |30 |SparkSQLMaster    |
|Ravikumar Ramasamy    |34 |SparkGuru         |
|Ram Ghadiyaram        |42 |DevHunter         |
|Ravi chandra Kancharla|43 |DevHunter         |
+----------------------+---+------------------+

Also, try giving the temporary table a name other than "df", so the view name does not shadow the DataFrame variable.