
Unable to create a dataset in Azure Data Factory using the Azure SDK for Python


I am trying to create a dataset in ADF using the Azure SDK for Python, but unfortunately I am running into an error. I am not sure what is wrong with the code below:

dsOut_name = 'POC_DatasetName'
ds_ls ="AzureBlobStorage"
output_blobpath = '/tempdir'
df_name = 'pipeline1'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print_item(dsOut)

Please help.

I can reproduce your issue. This line

ds_ls = "AzureBlobStorage"

is wrong; it should be

ds_ls = LinkedServiceReference(reference_name=ls_name)
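Applied to your snippet, it looks like this (a minimal sketch, assuming a linked service named storageLinkedService already exists in the factory and that adf_client, rg_name and print_item are defined as in your code):

dsOut_name = 'POC_DatasetName'
ls_name = 'storageLinkedService'  # name of an existing linked service, not its type
ds_ls = LinkedServiceReference(reference_name=ls_name)
output_blobpath = '/tempdir'
df_name = 'pipeline1'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print_item(dsOut)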

You can refer to my complete working sample below.

Make sure your service principal has an RBAC role (e.g. Owner or Contributor) under Access control (IAM) on the data factory, and that you have completed all the required setup steps.
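If the role assignment is missing, you can add it in the portal, or with the Azure CLI (a sketch; all ids are placeholders, and the scope can be narrowed or widened as needed):

az role assignment create --assignee <client id of the service principal> --role Contributor --scope /subscriptions/<subscription-id>/resourceGroups/<group-name>/providers/Microsoft.DataFactory/factories/<datafactory-name>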

My package versions:

azure-mgmt-datafactory  0.6.0
azure-mgmt-resource  3.1.0
azure-common  1.1.23
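To pin the same versions (a sketch; newer releases of these packages may expose a different API surface, so the pinned versions are safest for this sample):

pip install azure-mgmt-datafactory==0.6.0 azure-mgmt-resource==3.1.0 azure-common==1.1.23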

Code:

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *


subscription_id = '<subscription-id>'
ls_name = 'storageLinkedService'
rg_name = '<group-name>'
df_name = '<datafactory-name>'

# Authenticate with a service principal
credentials = ServicePrincipalCredentials(client_id='<client id of the service principal>',
                                          secret='<secret of the service principal>', tenant='<tenant-id>')
resource_client = ResourceManagementClient(credentials, subscription_id)
adf_client = DataFactoryManagementClient(credentials, subscription_id)


# Create the Azure Storage linked service
storage_string = SecureString('DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>')

ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)

# Reference the linked service by its name, not by a type string
ds_ls = LinkedServiceReference(reference_name=ls_name)


# Create an Azure blob dataset (output)
dsOut_name = 'ds_out'
output_blobpath = '<container name>/<folder name>'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print(dsOut)
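To confirm the dataset was actually created, you can fetch it back with the same client (a quick check, not part of the original sample):

ds_check = adf_client.datasets.get(rg_name, df_name, dsOut_name)
print(ds_check.name)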