SQL Server: how to capture errors when inserting/updating bulk XML data with a MERGE statement


I want to know how to capture the error when a record's data is bad while we are inserting/updating bulk data with a MERGE statement.

Please see the sample XML:

<?xml version = "1.0" encoding="UTF-8" standalone="yes"?>
<document>
  <employee>
  <id>1</id>
  <name>test1</name>
  <salary>2000</salary>
  </employee>
  <employee>
  <id>2</id>
  <name>test2</name>
  <salary>4000</salary>
  </employee>
  <employee>
  <id>3A</id>
  <name>test3</name>
  <salary>8000</salary>
  </employee>
</document>

The problem here is 3A: id is of type INT, and we are trying to insert an alphanumeric value, so the row cannot be inserted. I would like to write a stored procedure that saves a detailed error log, so that later, when we look at the log, we can easily understand what the problem is and where it occurred.

Please see the example below:

CREATE TABLE [employee]
(
    [id]        INT,
    [name]      NVARCHAR(100),
    [salary]    INT
)
GO

DECLARE @XML XML ='<?xml version = "1.0" encoding="UTF-8" standalone="yes"?>
<document>
  <employee>
  <id>1</id>
  <name>test1</name>
  <salary>2000</salary>
  </employee>
  <employee>
  <id>2</id>
  <name>test2</name>
  <salary>4000</salary>
  </employee>
  <employee>
  <id>3</id>
  <name>test3</name>
  <salary>8000</salary>
  </employee>
</document>'
MERGE employee AS [target]
USING   
(
    SELECT   
         tab.col.value('id[1]','int') as id
        ,tab.col.value('name[1]','nvarchar(100)') as name
        ,tab.col.value('salary[1]','int') as salary              
    FROM @xml.nodes('//employee') AS tab(col) 
) 
 AS [source] (id,name,salary) ON ([target].[id] = [source].[id])
        WHEN MATCHED THEN 
        UPDATE 
        SET 
            [target].[name]     = [source].[name],
            [target].[salary]   = [source].[salary]              
        WHEN NOT MATCHED THEN       
            INSERT (id,name,salary) 
                VALUES ([source].id,[source].name,[source].salary);

Please guide me through this in detail. Thank you.

One way is to wrap the code in a TRY...CATCH block and log the error data into an error table:

Structure:

CREATE TABLE #employee
(
    [id]        INT,
    [name]      NVARCHAR(100),
    [salary]    INT
);

CREATE TABLE #error_log(ID INT IDENTITY(1,1),
                        create_date DATETIME NOT NULL DEFAULT GETDATE(),
                        message NVARCHAR(1000));


DECLARE @XML XML ='<?xml version = "1.0" encoding="UTF-8" standalone="yes"?>
<document>
  <employee>
  <id>1</id>
  <name>test1</name>
  <salary>2000</salary>
  </employee>
  <employee>
  <id>2</id>
  <name>test2</name>
  <salary>4000</salary>
  </employee>
  <employee>
  <id>3A</id>
  <name>test3</name>
  <salary>8000</salary>
  </employee>
</document>';
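The TRY...CATCH wrapper itself appears to have been lost from the answer; a minimal reconstruction, reusing the question's MERGE against the #employee and #error_log tables defined above, might look like this:

```sql
BEGIN TRY
    MERGE #employee AS [target]
    USING
    (
        SELECT
             tab.col.value('id[1]','int') as id
            ,tab.col.value('name[1]','nvarchar(100)') as name
            ,tab.col.value('salary[1]','int') as salary
        FROM @XML.nodes('//employee') AS tab(col)
    ) AS [source] (id,name,salary) ON ([target].[id] = [source].[id])
    WHEN MATCHED THEN
        UPDATE SET
            [target].[name]   = [source].[name],
            [target].[salary] = [source].[salary]
    WHEN NOT MATCHED THEN
        INSERT (id,name,salary)
        VALUES ([source].id,[source].name,[source].salary);
END TRY
BEGIN CATCH
    -- Any failure in the MERGE (here, the conversion of '3A' to INT)
    -- lands in the CATCH block; ERROR_MESSAGE() returns the error text.
    INSERT INTO #error_log(message)
    VALUES (ERROR_MESSAGE());
END CATCH
```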

Output:

╔════╦═════════════════════╦═════════════════════════════════════════════╗
║ ID ║     Create_date     ║                   message                   ║
╠════╬═════════════════════╬═════════════════════════════════════════════╣
║  1 ║ 2015-12-26 11:04:12 ║ Conversion failed when converting the       ║
║    ║                     ║ nvarchar value '3A' to data type int.       ║
╚════╩═════════════════════╩═════════════════════════════════════════════╝

Nice, but can we capture which row of the data has the error? – Thomas

@Thomas Not without writing custom code to validate the data before the insert, no. Note that the SQL Server error message only gives details about the offending value; you can search the supplied XML for the value 3A. Keep in mind that MERGE is all-or-nothing: if one error occurs, no records are inserted/updated at all.
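As the comment says, pinpointing the bad row requires custom validation before the MERGE. One possible sketch (an assumption, not from the original answer; requires SQL Server 2012+ for TRY_CAST): shred every element as raw text first, so nothing can fail, then log any row whose numeric fields do not convert:

```sql
-- Shred each <employee> as plain text so no conversion error can occur yet
;WITH shredded AS
(
    SELECT
         tab.col.value('id[1]','nvarchar(100)')     AS id_raw
        ,tab.col.value('name[1]','nvarchar(100)')   AS name
        ,tab.col.value('salary[1]','nvarchar(100)') AS salary_raw
    FROM @XML.nodes('//employee') AS tab(col)
)
-- TRY_CAST returns NULL instead of raising an error for invalid values,
-- so this logs exactly the rows that would break the MERGE
INSERT INTO #error_log(message)
SELECT 'Bad row: id=' + id_raw + ', name=' + name + ', salary=' + salary_raw
FROM shredded
WHERE TRY_CAST(id_raw AS INT) IS NULL
   OR TRY_CAST(salary_raw AS INT) IS NULL;
```

The MERGE can then use the same CTE filtered the other way around (both TRY_CASTs non-NULL), so the good rows still go through while the bad ones are only logged.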