MySQL: how to handle backslashes and quotes in Spark SQL?


A Spark SQL table has a field kv that contains a JSON string; for example

How can I select this row with SQL?

I have tried:

select * from table_a where get_json_object(kv, "$.ad_report_status") = '1\\"}'
select * from table_a where get_json_object(kv, "$.ad_report_status") = '1\"}'
select * from table_a where get_json_object(kv, "$.ad_report_status") = '1\\\"}'
None of the above SQL statements works.
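A likely reason the attempts behave confusingly is that Spark SQL, with its default parser settings (spark.sql.parser.escapedStringLiterals=false), interprets backslash escape sequences inside single-quoted string literals, much like MySQL does. As a sketch of what each literal actually evaluates to under those default settings:

```sql
-- Under Spark's default literal parsing, \\ becomes \ and \" becomes ",
-- so two of the attempted literals collapse to the same string:
SELECT '1\\"}'  AS a;  -- evaluates to: 1\"}
SELECT '1\"}'   AS b;  -- evaluates to: 1"}
SELECT '1\\\"}' AS c;  -- evaluates to: 1\"}  (same as a)
```

So if the stored value is not exactly one of those results (for instance, if it contains two literal backslashes), none of the comparisons can match.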


So how can I match the value of the "ad_report_status" field?
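One way to sidestep literal-escaping ambiguity entirely is to build the comparison string with chr(92), the backslash character, so the literal contains no escape sequences at all. This is a sketch, assuming the stored value is the four characters 1\"} :

```sql
-- chr(92) is '\'; concat assembles 1\"} without any escape sequences
-- appearing in a quoted literal
SELECT *
FROM table_a
WHERE get_json_object(kv, '$.ad_report_status') = concat('1', chr(92), '"}');
```

Alternatively, running SET spark.sql.parser.escapedStringLiterals=true; makes the parser keep backslashes in string literals as-is (the pre-Spark-2.0 behavior), after which '1\"}' would be taken literally; note this changes literal parsing for the whole session.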

You can store and retrieve the data by encoding it on write and decoding it on read, which avoids having to escape special characters such as \" at query time.
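The encode/decode idea can be sketched with Spark SQL's built-in base64 and unbase64 functions (table_b and kv_b64 here are hypothetical names for illustration):

```sql
-- hypothetical: persist the raw JSON base64-encoded, so no character
-- in it ever needs escaping
INSERT INTO table_b
SELECT base64(kv) AS kv_b64 FROM table_a;

-- decode on read; unbase64 returns binary, so cast back to string
-- before applying get_json_object
SELECT *
FROM table_b
WHERE get_json_object(CAST(unbase64(kv_b64) AS STRING), '$.ad_report_status') = '1';
```

The trade-off is that the encoded column is no longer directly readable or filterable without decoding it first.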