Registering a user-defined function with a SQLAlchemy session
I am using a SQLAlchemy session to run SQL queries against a database in Azure Databricks. My query contains a user-defined function, but when I run it, it returns:

infoMessages=["*org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: Undefined function: my_method

My approach is shown in the example below:
import traceback

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker


class DatabaseQuery(DatabaseLibrary):
    def __init__(self):
        self.table_model = None
        self.column_names = []
        self.conn = None
        self.meta = None
        self.engine = None
        self.session = None
        self.query_list = []

    def connect_database(self, region, token, database, http_path):
        try:
            dbfs_engine = create_engine(
                "databricks+pyhive://token:"
                + token
                + "@"
                + region
                + "xxxxxx/"
                + database,
                connect_args={"http_path": http_path},
                echo=True,
            )
            self._set_metadata_databricks(dbfs_engine)
            Session = sessionmaker(bind=dbfs_engine)
            self.session = Session()
            self.engine = dbfs_engine
            self.conn = dbfs_engine.connect()
        except Exception:
            traceback.print_exc()
            raise
def my_method(name):
    return name.upper()

query = "select my_method(names.NAME) from db.names"
result = self.session.execute(query).fetchall()
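For comparison, SQLAlchemy itself has no generic API for registering Python UDFs with a remote server; the server must already know the function. With an in-process database such as SQLite, however, the Python function can be attached to the underlying DBAPI connection, which illustrates the mechanism being asked about. A minimal sketch (SQLite, not Databricks; the function name `my_upper` is made up for the example):

```python
from sqlalchemy import create_engine, event, text

engine = create_engine("sqlite:///:memory:")

# Register the Python function on every raw DBAPI connection
# that SQLAlchemy opens for this engine.
@event.listens_for(engine, "connect")
def register_udf(dbapi_conn, conn_record):
    dbapi_conn.create_function("my_upper", 1, lambda s: s.upper())

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE names (NAME TEXT)"))
    conn.execute(text("INSERT INTO names VALUES ('alice')"))
    rows = conn.execute(text("SELECT my_upper(NAME) FROM names")).fetchall()
```

This works because SQLite runs inside the Python process; for a remote engine like Databricks the function has to be defined on the server side instead.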
With PySpark I can do this simply by registering the UDF on the spark object:
convert_maximo_date = udf(common.convert_maximo_date)
self.spark.udf.register("convert_maximo_date", convert_maximo_date)
Is it possible to do something similar with a SQLAlchemy connection, so that I can execute queries that contain a user-defined function?