PostgreSQL JDBC sink raises error: null (ARRAY) type doesn't have a mapping to the SQL database column type


I'm having a problem replicating a database with the Kafka JDBC sink connector. When the sink writes to a table that contains an array data type, it raises this error:

...
Caused by: org.apache.kafka.connect.errors.ConnectException: null (ARRAY) type doesn't have a mapping to the SQL database column type
...
I want to keep the array type as-is; I don't want to cast it to a string the way I did for SQL Server (since SQL Server doesn't allow array data types).
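For context, here is a minimal sketch (a hypothetical record, not taken from the original post) of how an array column arrives at the sink: with JSON converters and schemas enabled, the field carries Connect type "array", and the stock JDBC dialects have no SQL column mapping for it, which is what the ConnectException reports:

```python
import json

# Hypothetical Kafka Connect envelope (schemas.enable=true) for a row
# whose "tags" column is a PostgreSQL text[] array.
record = json.loads("""
{
  "schema": {
    "type": "struct",
    "fields": [
      {"field": "id",   "type": "int32"},
      {"field": "tags", "type": "array", "items": {"type": "string"}}
    ]
  },
  "payload": {"id": 1, "tags": ["a", "b"]}
}
""")

# The sink walks the schema to build DDL/DML; any field of type "array"
# has no column-type mapping in the stock JDBC dialects.
array_fields = [f["field"] for f in record["schema"]["fields"]
                if f["type"] == "array"]
print(array_fields)  # ['tags']
```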

Here is my sink connector configuration:

{"name" :"pgsink_'$topic_name'",
    "config":{"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
            "tasks.max":"1",
            "topics":"'$table'",
            "connection.url":"jdbc:postgresql://",
            "connection.user":"",
            "connection.password":"",
            "transforms":"unwrap",
            "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
            "transforms.unwrap.drop.tombstones": "false",
            "delete.handling.mode":"drop",
            "auto.create":"true",
            "auto.evolve":"true",
            "insert.mode":"upsert",
            "pk.fields":" '$pk'",
            "pk.mode":"record_key",
            "delete.enabled":"true",
            "destination.table.format":"public.'$table'",
            "connection.attempts":"60",
            "connection.backoff.ms":"100000"

}}
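The config above is pasted into a shell command with `'$topic_name'`, `'$table'`, and `'$pk'` substituted at run time. One quick way to sanity-check the substituted JSON before posting it to the Connect REST API is to build and serialize it in Python (the placeholder values below are assumptions, not from the post):

```python
import json

# Placeholder values standing in for the shell variables in the post.
topic_name, table, pk = "mytopic", "mytable", "id"

config = {
    "name": f"pgsink_{topic_name}",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics": table,
        "insert.mode": "upsert",
        "pk.mode": "record_key",
        "pk.fields": pk,
        "auto.create": "true",
        "auto.evolve": "true",
    },
}

# json.dumps fails fast on anything that is not JSON-serializable,
# which catches quoting/substitution mistakes before they reach the API.
payload = json.dumps(config)
print(json.loads(payload)["name"])  # pgsink_mytopic
```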
My Kafka source is Debezium. Because I want to keep the original data types, I don't apply an SMT on the source side. Here is the source configuration:

{
  "name": "pg_prod",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "wal2json_streaming",
    "database.hostname": "",
    "database.port": "",
    "database.user": "",
    "database.password": "",
    "database.dbname": "",
    "database.server.name": "",
    "database.history.kafka.bootstrap.servers": "",
    "database.history.kafka.topic": "",
    "transforms": "unwrap,reroute",
    "table.whitelist": "public.table",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.delete.handling.mode": "drop",
    "transforms.unwrap.drop.tombstones": "false",
    "decimal.handling.mode": "double",
    "time.precision.mode": "connect",
    "transforms.reroute.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.reroute.regex": "postgres.public.(.*)",
    "transforms.reroute.replacement": "$1",
    "errors.tolerance": "all",
    "errors.log.enable": true,
    "errors.log.include.messages": true,
    "kafkaPartition": "0",
    "snapshot.delay.ms": "1000",
    "schema.refresh.mode": "columns_diff_exclude_unchanged_toast"
  }
}
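The `reroute` transform above strips the `postgres.public.` prefix from topic names. For this pattern the regex behaves the same in Python (Java's `$1` back-reference is Python's `\1`), so the rename can be checked like this:

```python
import re

# Same pattern/replacement as the RegexRouter config
# (Java back-reference "$1" written as Python's r"\1").
regex, replacement = r"postgres.public.(.*)", r"\1"

print(re.sub(regex, replacement, "postgres.public.customers"))  # customers
```

Topics that don't match the pattern pass through unchanged, just as with RegexRouter.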