Implementing a FileNotFound exception with PySpark in Databricks (Apache Spark)
I am trying to implement exception handling with PySpark in Databricks, where I need to check whether a file exists at the source location:
df = spark.read.option("inferSchema", "true").csv("mnt/pnt/abc.csv")

try:
    df = open("abc.csv", "rt")
    print("File opened")
except FileNotFoundError:
    print("File does not exist")
except:
    print("Other error")
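For a local file this pattern does work, because Python's built-in open() raises FileNotFoundError; spark.read, by contrast, raises a Py4J-wrapped Java exception, which is why the except clause above never matches. A minimal local sketch (the file name is a placeholder):

```python
def open_local(path: str) -> str:
    # open() raises FileNotFoundError for a missing local file,
    # so this except clause actually fires -- unlike with spark.read.
    try:
        with open(path, "rt") as f:
            f.read()
        return "File opened"
    except FileNotFoundError:
        return "File does not exist"
    except Exception:
        return "Other error"

print(open_local("no-such-file.csv"))  # File does not exist
```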
I would like something similar to the snippet above, but I could not make that approach work. I would appreciate any help.

You cannot catch the java.io error directly, but you can do the following:
def read_file(path):
    try:
        dbutils.fs.ls(path)
        return spark.read.option("inferSchema", "true").csv(path)
    except Exception as e:
        if 'java.io.FileNotFoundException' in str(e):
            print('File does not exist')
        else:
            print('Other error')

read_file('mnt/pnt/abc.csv')
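The string match on the wrapped Java exception is the key part of the answer, and it can be unit-tested without a cluster. A minimal sketch with a hypothetical classify_read_error helper and simulated exception messages shaped like what Py4J propagates:

```python
def classify_read_error(exc: Exception) -> str:
    """Coarsely classify a Spark read failure by inspecting its message.

    On Databricks a missing path surfaces as a Py4J error whose text
    contains the underlying Java class name, so substring matching is
    the practical way to distinguish it from other failures.
    """
    if "java.io.FileNotFoundException" in str(exc):
        return "File does not exist"
    return "Other error"

# Simulated messages (illustrative, not captured from a real cluster):
missing = Exception("java.io.FileNotFoundException: mnt/pnt/abc.csv")
other = Exception("AnalysisException: cannot resolve column")

print(classify_read_error(missing))  # File does not exist
print(classify_read_error(other))    # Other error
```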