
vsts-publish-adf's People

Contributors

dbrtly, liprec


vsts-publish-adf's Issues

Issue with delete action

Hi,

I just tried your delete action.
I don't fill in the linked services filter, but the job deletes all linked services anyway.

YAML:

steps:
- task: liprec.vsts-publish-adf.delete-adf-items.delete-adf-items@2
  displayName: 'Delete items from '
  inputs:
    azureSubscription: 'aaa'
    ResourceGroupName: 'bbb'
    DatafactoryName: ccc
    PipelineFilter: '*'
    DatasetFilter: '*'
    TriggerFilter: '*'
    Throttle: 1

Log:

2020-05-13T08:36:59.8124795Z Found 13 linked service(s).
2020-05-13T08:36:59.8126590Z Delete linked service 'LS_SQL'.
2020-05-13T08:37:00.6369964Z Delete linked service 'LS_SFDC'.
2020-05-13T08:37:00.9948543Z Delete linked service 'LS_KV_'.
2020-05-13T08:37:01.3383580Z ##[error]Error deleting 'LS_KV_' linked service : {"error":{"code":"BadRequest","message":"The document cannot be deleted since it is referenced by LS_ASQL_dusqlsrv_datacatalog.","target":"/Microsoft.DataFactory/factories/dudfqa01/linkedservices/LS_KV_","details":null}}
2020-05-13T08:37:01.3403222Z Delete linked service 'LS_FILE_'.
2020-05-13T08:37:01.3403740Z 13 linked service(s) deleted.
2020-05-13T08:37:01.3407453Z ##[error]Error deleting 'LS_KV_' linked service : {"error":{"code":"BadRequest","message":"The document cannot be deleted since it is referenced by LS_ASQL_.","target":"/Microsoft.DataFactory/factories/dudfqa01/linkedservices/LS_KV_","details":null}}
2020-05-13T08:37:02.1480163Z Delete linked service 'LS_FILE_'.
2020-05-13T08:37:03.3605479Z Delete linked service 'LS_FILE_'.
2020-05-13T08:37:03.3606066Z Delete linked service 'LS_DBR_'.
2020-05-13T08:37:03.6289694Z Delete linked service 'LS_DB2_'.
2020-05-13T08:37:04.3672617Z Delete linked service 'LS_ASQL_'.
2020-05-13T08:37:04.8071223Z Delete linked service 'LS_ASQL_'.
2020-05-13T08:37:05.2073642Z Delete linked service 'LS_ADLS_'.
2020-05-13T08:37:05.7605143Z Delete linked service 'LS_ADLS_'.
2020-05-13T08:37:06.0828130Z Delete linked service 'LS_ABLB_'.

Best regards,
Florian


Trigger pipeline task doesn't find pipelines when there are many pipelines.

Hi, thanks for providing these tasks, they are saving me a lot of work!

We have a DTAP ADF setup in DevOps with a release pipeline that deploys from dev -> test -> acc -> prod. One issue I have experienced is that, at any point in time, we may have manually enabled/disabled some triggers in test, acc or prod, and we want the status of the triggers, i.e. whether they are started or stopped, to be preserved when a deployment happens.

I have implemented this in a bit of a round-about way, as I don't yet know much about creating DevOps tasks. I have created two ADF pipelines. A "Push triggers" pipeline uses the ADF REST API to get the current state of all the triggers in the specified data factory and writes it to blob storage. Then I have an ADF "Pop triggers" pipeline, which reads the preserved state and starts/stops the triggers as appropriate to restore that state.

I then planned to use your DevOps task to trigger the appropriate ADF pipeline at the appropriate places in my release pipeline. Unfortunately, it doesn't find my ADF pipelines. I have had a quick look at your code but don't really have time to dig into it; my guess as to the cause is that we have rather a lot of pipelines in our data factory, the REST API that gets the pipeline info returns them in pages, and I suspect your code only ever processes the first page returned.
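
For context, a minimal sketch of the "push" half of that approach, assuming the Az PowerShell module is available on the agent (the service connection, resource group, and factory names are illustrative):

steps:
- task: AzurePowerShell@5
  displayName: 'Push trigger states'
  inputs:
    azureSubscription: 'my-service-connection'   # illustrative name
    azurePowerShellVersion: LatestVersion
    ScriptType: InlineScript
    Inline: |
      # Capture the current runtime state (Started/Stopped) of every trigger
      Get-AzDataFactoryV2Trigger -ResourceGroupName 'my-rg' -DataFactoryName 'my-adf' |
          Select-Object Name, RuntimeState |
          ConvertTo-Json |
          Set-Content trigger-states.json
      # trigger-states.json can then be copied to blob storage or published as an artifact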

Hope you can help. Thanks, Nick.

Toggle trigger error (v2)

Hi, I am using the v1 extension to set triggers off and on. Today I tried v2, but it ends up with errors. Maybe I have set up something incorrectly. Do you have any idea what is wrong? Thanks.

This is a screenshot of my setup (screenshot omitted).

Errors I have:

2018-10-26T07:35:04.7287123Z ##[section]Starting: Toggle triggers in jarda-factory-tst - STOP
2018-10-26T07:35:04.7291371Z ==============================================================================
2018-10-26T07:35:04.7291513Z Task : Azure Data Factory (V2) Trigger
2018-10-26T07:35:04.7291600Z Description : Start/stop an Azure Data Factory Trigger
2018-10-26T07:35:04.7292131Z Version : 2.0.3
2018-10-26T07:35:04.7292226Z Author : Jan Pieter Posthuma
2018-10-26T07:35:04.7292333Z Help : More Information
2018-10-26T07:35:04.7292451Z ==============================================================================
2018-10-26T07:35:06.0205559Z Found 2 trigger(s).
2018-10-26T07:35:06.0214550Z Toggle 'DWJA_Daily' to 'undefined'.
2018-10-26T07:35:06.0226756Z Toggle 'DWJA_Daily_NEW' to 'undefined'.
2018-10-26T07:35:06.1410428Z ##[error]Unhandled: Cannot create property 'request' on string 'An error occurred while deserializing the responseBody
<title>404 - File or directory not found.</title> <style type="text/css"> </style>
Server Error
404 - File or directory not found.
The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.
. The error is: {}.'
2018-10-26T07:35:06.1697802Z ##[error]Unhandled: Cannot create property 'request' on string 'An error occurred while deserializing the responseBody
<title>404 - File or directory not found.</title> <style type="text/css"> </style>
Server Error
404 - File or directory not found.
The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.
. The error is: {}.'
2018-10-26T07:35:06.1795642Z ##[section]Finishing: Toggle triggers in jarda-factory-tst - STOP
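
For what it's worth, the 'undefined' in the log suggests the desired trigger state never reached the task, so the start/stop setting is worth double-checking. A hedged YAML sketch of how the v2 task configuration might look; the task name and the TriggerStatus input are assumptions based on the task description above, not confirmed against the task manifest:

steps:
- task: toggle-adf-trigger@2                     # assumed task name
  displayName: 'Toggle triggers in jarda-factory-tst - STOP'
  inputs:
    azureSubscription: 'my-service-connection'   # illustrative name
    ResourceGroupName: my-rg
    DatafactoryName: jarda-factory-tst
    TriggerFilter: 'DWJA_Daily'
    TriggerStatus: stop                          # assumed input carrying the start/stop choice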

QUESTIONS RE: Azure Data Factory Delete Items & Azure Data Factory Deployment Tasks

Delete
Do the filters for the delete task take either * (for all), nothing (for none), or the explicit names of the resources you wish to delete? Is there any way to do an 'intelligent' filter that determines which resources are no longer in the ARM template and deletes those, vs. having to be specific?
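
For what it's worth, the filter inputs in the YAML snippets elsewhere on this page take either * or an explicit value, e.g.:

    PipelineFilter: '*'            # match all pipelines
    DatasetFilter: 'ds_Staging'    # match one explicitly named dataset (name illustrative)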

Deploy
When using the Microsoft-recommended Resource Group Update task, we can override template parameters. It doesn't look like your deploy task allows for this. Would one generate and transform the ARM templates for CI/CD through a different means and then consume those with this task?

Thanks!

Error in TypeData "Microsoft.Azure.Commands.Common.Authentication.Abstractions.IAzureContextContainer": The TypeConverter was ignored because it already occurs.

We're trying to use the Azure Data Factory Deployment task from our on-premises agent, and we receive the following error: ADFOnPremisesError.txt

When we run the same step on the Hosted VS2017 agent, it works successfully. Here's the output:
ADFHosted2017.txt

One thing we noticed is that on-premises is running Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM\5.7.0\AzureRM.psd1 -Global
...and hosted 2017 is running Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM\2.1.0\AzureRM.psd1 -Global

Could the newer version of AzureRM be incompatible?

We've looked at the version of AzureRM that we have installed on our on-prem agent following the guidance here, https://github.com/Azure/azure-powershell/blob/preview/documentation/troubleshoot-module-load.md#potential-reason-1-older-version-stored-in-user-or-global-scope, and we do not have multiple versions on our on-prem agent.

We're able to use our hand-rolled PowerShell scripts to do the deploy; however, we like your VSTS extension and would like to switch to it. Our current policy says that we must use the on-prem agent.

Can you think of what may be causing this? Any help is very much appreciated.

Thanks

Azure Data Factory Delete Items removes Linked Services

Hi,

I used the Azure Data Factory Delete Items task with Azure DevOps to copy an Azure Data Factory from one environment to another. My intention is to delete triggers but not linked services. Below is the YAML configuration of the task:

steps:
- task: liprec.vsts-publish-adf.delete-adf-items.delete-adf-items@2
  displayName: 'Delete items from xxxxxxxxxxx'
  inputs:
    azureSubscription: 'XXXXXXXXX'
    ResourceGroupName: XXXXXXXXX
    DatafactoryName: XXXXXXXXX
    TriggerFilter: '*'

Even though I left the linked services filter blank, it seems to delete all of them. When I provide a value for it, as follows:

ServiceFilter: TESTINGTESTING

it works as expected and does not delete the linked services.

My Task Version is 2.* (can't find if 2.0 or 2.2)

Thanks for the help!

Task fails when agent is behind proxy

Hello,
We are experiencing an issue running this task from Azure DevOps.

Task : Azure Data Factory Trigger
Description : Start/stop an Azure Data Factory Trigger
Version : 2.2.0
Author : Jan Pieter Posthuma
Help : More Information

We are using a private agent to run the whole pipeline, but when we reach the "stop ADF triggers" task it fails with the following error:
2020-04-03T11:55:39.5962047Z ##[debug]a8353143-a5bb-4b8a-bc30-78206609c459 auth param serviceprincipalid = ***
2020-04-03T11:55:39.5964210Z ##[debug]a8353143-a5bb-4b8a-bc30-78206609c459 auth param serviceprincipalkey = ***
2020-04-03T11:55:39.5965894Z ##[debug]a8353143-a5bb-4b8a-bc30-78206609c459 data environmentAuthorityUrl = https://login.windows.net/
2020-04-03T11:55:39.5967922Z ##[debug]a8353143-a5bb-4b8a-bc30-78206609c459 auth param tenantid = ***
2020-04-03T11:55:39.5969647Z ##[debug]a8353143-a5bb-4b8a-bc30-78206609c459=https://management.azure.com/
2020-04-03T11:55:39.5971122Z ##[debug]Parsed task inputs
2020-04-03T11:55:39.6195880Z (node:35240) Warning: Use Cipheriv for counter mode of aes-256-ctr
2020-04-03T11:55:39.6851206Z ##[error]Error login in to Azure. Please check the Service Configuration. Error: Failed to acquire token for application with the provided secret.
Error: tunneling socket could not be established, cause=socket hang up.
2020-04-03T11:55:39.6863609Z ##[debug]Processed: ##vso[task.issue type=error;]Error login in to Azure. Please check the Service Configuration. Error: Failed to acquire token for application with the provided secret. %0AError: tunneling socket could not be established, cause=socket hang up.
2020-04-03T11:55:39.6865325Z ##[debug]task result: Failed
2020-04-03T11:55:39.6866464Z ##[error]Unhandled: Azure clients require credentials.
2020-04-03T11:55:39.6867714Z ##[debug]Processed: ##vso[task.issue type=error;]Unhandled: Azure clients require credentials.
2020-04-03T11:55:39.6869986Z ##[debug]Processed: ##vso[task.complete result=Failed;]Unhandled: Azure clients require credentials.
2020-04-03T11:55:39.6870808Z ##[debug]task result: Failed
2020-04-03T11:55:39.6872330Z ##[debug]Processed: ##vso[task.complete result=Failed;]
2020-04-03T11:55:39.6891126Z ##[section]Finishing: Stops trigger to be deployed

We have tested the configuration from DevOps, the agent, and access to Azure, and it all works fine, but we still get the above error.

How can we solve it?
Alejandro Mejías
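
One avenue worth checking, given the "tunneling socket could not be established" failure: a self-hosted agent only routes task traffic through a proxy when the agent itself was registered with one, e.g. ./config.cmd --proxyurl http://proxy.example.com:8888 --proxyusername "user" --proxypassword "pass" (URL and credentials illustrative), and even then only for tasks that honor the agent-level proxy settings.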

Data Factory V2: Storage account Linked Service input malformed

I have a dataset in ADF v2 connected through a storage account linked service. The storage account linked service is connected via the storage connection string stored in Key Vault.
The following linked services are being deployed:

  1. Storage account connected via Azure Key Vault
  2. Azure Key Vault

The pipeline runs successfully via the UI. When I try to deploy it to a new Data Factory V2, it throws the following error:

2019-05-12T08:03:43.3146714Z Found 2 Linked Service files
2019-05-12T08:03:44.5026347Z ##[error]Error deploying 'FInanceBlob.json' (HTTP Status Code: BadRequest
Error Code: InputIsMalformedDetailed
Error Message: Input is malformed. Reason: Could not get integration runtime details for B9AEBE78-33DD-4F91-9A2B-19ACF9BFB7DE-NewIR
Request Id: 11a3d7f6-4ffd-4c9c-9c58-ba04895ad560

To test, I tried deploying only the Azure Key Vault linked service, i.e. with the pipeline and the storage account dataset and linked service removed. It was successfully deployed.

I have attached the logs:
ReleaseLogs_24.zip

Azure Data Factory Deployment: fails silently (ADF instance not found, login failed, ...)

The Azure Data Factory Deployment task fails silently when, for example, the ADF instance cannot be found or login fails. Only adding the System.Debug = True variable to the pipeline reveals this information (by default this variable is False).

I suspect that also with other errors this would happen.

It would be nicer to at least show the error message, so you don't have to enable debug mode on the pipeline.
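
For reference, debug logging can also be switched on from YAML; this is documented Azure Pipelines behavior:

variables:
  system.debug: true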

Example log (anonymized):

[...]
2019-12-24T11:02:26.1822500Z ##[debug]00000000-0000-0000-0000-000000000000 data subscriptionid = xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
2019-12-24T11:02:26.1823926Z ##[debug]00000000-0000-0000-0000-000000000000 data subscriptionname = mysubscription
2019-12-24T11:02:26.1824823Z ##[debug]00000000-0000-0000-0000-000000000000 auth param serviceprincipalid = ***
2019-12-24T11:02:26.1825327Z ##[debug]00000000-0000-0000-0000-000000000000 auth param serviceprincipalkey = ***
2019-12-24T11:02:26.1825853Z ##[debug]00000000-0000-0000-0000-000000000000 data environmentAuthorityUrl = https://login.windows.net/
2019-12-24T11:02:26.1826612Z ##[debug]00000000-0000-0000-0000-000000000000 auth param tenantid = ***
2019-12-24T11:02:26.1828020Z ##[debug]00000000-0000-0000-0000-000000000000=https://management.azure.com/
2019-12-24T11:02:26.1829181Z ##[debug]Parsed task inputs
2019-12-24T11:02:26.3913447Z ##[debug]Azure client retrieved.
2019-12-24T11:02:26.4680068Z ##[debug]Datafactory not found: myadfinstance.
2019-12-24T11:02:26.4683165Z ##[debug]task result: Failed
2019-12-24T11:02:26.4685051Z ##[debug]Processed: ##vso[task.complete result=Failed;]

Add runId output to trigger jobs

Hello,

First of all thank you for the plugin!
I'm currently trying to monitor the status of some pipelines that I have triggered, but I need the runId generated at trigger time.
What would it take to catch the body from the createRun endpoint you are using and then pass it as an output variable? The complex part would be handling multiple outputs when several pipelines are triggered at the same time.

(here is the endpoint doc https://docs.microsoft.com/fr-fr/azure/data-factory/quickstart-create-data-factory-rest-api#create-pipeline-run)
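
As a point of reference while such an output doesn't exist, a minimal sketch of capturing the runId outside the extension, using Invoke-AzDataFactoryV2Pipeline (which returns the run ID) and the documented task.setvariable logging command; connection and resource names are illustrative:

steps:
- task: AzurePowerShell@5
  name: trigger
  displayName: 'Trigger pipeline and capture runId'
  inputs:
    azureSubscription: 'my-service-connection'   # illustrative name
    azurePowerShellVersion: LatestVersion
    ScriptType: InlineScript
    Inline: |
      # Invoke-AzDataFactoryV2Pipeline returns the run ID of the started run
      $runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName 'my-rg' `
                   -DataFactoryName 'my-adf' -PipelineName 'MyPipeline'
      # Expose it to later steps/jobs as an output variable
      Write-Host "##vso[task.setvariable variable=adfRunId;isOutput=true]$runId"

Later steps can then read the value as $(trigger.adfRunId).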

Azure Data Factory Deployment task is not working in release pipeline

Last week it completed successfully (version 2.2.1).

This week we are getting the error below (current version 2.3.2):

##[error]Error deploying '' trigger : {"error":{"code":"TriggerEnabledCannotUpdate","message":"Cannot update enabled Trigger; the trigger needs to be disabled first. ","target":null,"details":null}}

Handling dependent pipelines

We have multiple pipelines that depend on other pipelines, which requires the dependencies to be deployed before the dependents. Is there a way this can be handled through the "Deploy JSON files" Azure Data Factory task in Azure DevOps?

Fail when deploying LinkedServices

We are getting an error when trying to deploy our linked services; it has this message:

2020-09-08T07:20:20.8859602Z ##[error]Error deploying 'LS_SQL_Archive_SalesforceRetailer' linked service : {"error":{"code":"InputIsMalformedDetailed","message":"Input is malformed. Reason: Could not get integration runtime details for IR-twdbx007dsa","target":null,"details":null}}

AzureRM.DataFactoryV2 module is older version

Hi,
When using the folder property in a pipeline JSON file, pipelines are not created in the given folder. When I tried using an updated AzureRM.DataFactoryV2, I was able to deploy to the required folder.
Therefore, it would be great if you updated all referenced PowerShell modules so that we can continue to use this task.

{
    "name": "FileToSQLTableTemplate",
    "properties": {
        .....
        "folder": {
            "name": "Test Pipelines"
        }
    }
}

Thanks
Syedhamjath Syedmansur

Pipeline parameters are not passed to ADF pipeline

I am triggering the execution of an ADF pipeline from Azure DevOps as follows:

    steps:
    - task: liprec.vsts-publish-adf.trigger-adf-pipeline.trigger-adf-pipeline@2
      displayName: 'Trigger ADF Training pipeline'
      inputs:
        azureSubscription: aml-workspace-serviceconnection
        ResourceGroupName: ms-mlops-rg
        DatafactoryName: adf-workspace-mlops
        PipelineFilter: Train_Pipeline
        PipelineParameter: azdo-variables-train.json

The content of azdo-variables-train.json:

{
    "pipeline_type": "Train",
    "days_qty": "10",
    "receiver": "[email protected]"
}

In ADF, I created empty parameters for the pipeline Train_Pipeline. These parameters have the same names: pipeline_type, days_qty and receiver.

I expect the values from azdo-variables-train.json to be passed to these parameters. However, this does not happen.

If, however, I define the values of parameters of the pipeline Train_Pipeline inside ADF, then obviously everything works fine.
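
A possible cause, judging by the trigger snippet in a later issue on this page: when the parameters come from a file rather than inline JSON, the task appears to need the parameter type set to Path. A sketch reusing the inputs above plus the two inputs from that later snippet:

    steps:
    - task: liprec.vsts-publish-adf.trigger-adf-pipeline.trigger-adf-pipeline@2
      displayName: 'Trigger ADF Training pipeline'
      inputs:
        azureSubscription: aml-workspace-serviceconnection
        ResourceGroupName: ms-mlops-rg
        DatafactoryName: adf-workspace-mlops
        PipelineFilter: Train_Pipeline
        PipelineParameterType: Path                    # tell the task the parameter is a file
        PipelineParameterPath: azdo-variables-train.json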

ADF V1 Json files not deployed

I created a release pipeline in Visual Studio Online and added the Azure Data Factory Deployment task downloaded from here, but the JSON files are not deployed even though the release succeeded.
@liprec

Delete Partially succeeded - (But No objects are deleted)

Dear Author,

I ran the Azure Data Factory Delete Items task (version 2.0) to completely delete all my pipelines, linked services, datasets, and triggers. I placed * in all four filters. While the task is executing, it detects all the objects in each category and logs "Delete <object>" messages.

Finally, the task ends with "partially succeeded", and no objects are removed from the Data Factory.

Below is the log message.

2019-11-21T18:50:35.7556392Z ##[section]Starting: Delete items from xx-xx-xx
2019-11-21T18:50:35.7662168Z ==============================================================================
2019-11-21T18:50:35.7662332Z Task : Azure Data Factory Delete Items
2019-11-21T18:50:35.7662564Z Description : Delete Azure Data Factory V2 items, like Datasets, Pipelines, Linked Services or Triggers
2019-11-21T18:50:35.7662670Z Version : 2.1.0
2019-11-21T18:50:35.7662744Z Author : Jan Pieter Posthuma
2019-11-21T18:50:35.7662848Z Help : More Information
2019-11-21T18:50:35.7663099Z ==============================================================================

2019-11-21T18:50:37.5956567Z Found 1 trigger(s).

2019-11-21T18:50:37.5965870Z Delete trigger 'blob_
2019-11-21T18:50:37.9696337Z Found 9 pipeline(s).
2019-11-21T18:50:37.9697144Z Delete pipeline 'pl_c
2019-11-21T18:50:37.9706031Z Delete pipeline 'pipe
2019-11-21T18:50:37.9801815Z Delete pipeline 'pl_c
2019-11-21T18:50:37.9804089Z Delete pipeline 'pl_m
2019-11-21T18:50:37.9805233Z Delete pipeline 'Test
2019-11-21T18:50:38.1055973Z Delete pipeline 'Stag
2019-11-21T18:50:38.1106869Z Delete pipeline 'pl_D
2019-11-21T18:50:38.1209029Z Delete pipeline 'pl_D
2019-11-21T18:50:38.1243555Z Delete pipeline 'pl_D
2019-11-21T18:50:38.4469909Z Found 34 dataset(s).
2019-11-21T18:50:38.4470681Z Delete dataset 'ds_in
2019-11-21T18:50:38.4480124Z Delete dataset 'ds_ou
2019-11-21T18:50:38.4489029Z Delete dataset 'ds_lk
2019-11-21T18:50:38.4496875Z Delete dataset 'ds_ad
2019-11-21T18:50:38.4503872Z Delete dataset 'dial_
2019-11-21T18:50:38.5902643Z Delete dataset 'log_t
2019-11-21T18:50:38.5927317Z Delete dataset 'Event
2019-11-21T18:50:38.6050612Z Delete dataset 'Read_
2019-11-21T18:50:38.6156342Z Delete dataset 'Delim
2019-11-21T18:50:38.6317178Z Delete dataset 'Src_B
2019-11-21T18:50:38.7104017Z Delete dataset 'Trigg
2019-11-21T18:50:38.7366317Z Delete dataset 'BO_On
2019-11-21T18:50:38.7401143Z Delete dataset 'Delim
2019-11-21T18:50:38.7700259Z Delete dataset 'log_t
2019-11-21T18:50:38.7785957Z Delete dataset 'sqldb
2019-11-21T18:50:38.8490319Z Delete dataset 'src_M
2019-11-21T18:50:38.8596609Z Delete dataset 'ds_sq
2019-11-21T18:50:38.8614034Z Delete dataset 'src_M
2019-11-21T18:50:38.9162993Z Delete dataset 'ds_sq
2019-11-21T18:50:38.9183386Z Delete dataset 'Meta_
2019-11-21T18:50:38.9721417Z Delete dataset 'Meta_
2019-11-21T18:50:38.9832812Z Delete dataset 'ds_Ev
2019-11-21T18:50:39.0016196Z Delete dataset 'ds_In
2019-11-21T18:50:39.0568957Z Delete dataset 'ds_v_
2019-11-21T18:50:39.0693441Z Delete dataset 'ds_Pe
2019-11-21T18:50:39.1022572Z Delete dataset 'ds_Pe
2019-11-21T18:50:39.1123709Z Delete dataset 'SQLDB
2019-11-21T18:50:39.1476313Z Delete dataset 'ds_pe
2019-11-21T18:50:39.1921451Z Delete dataset 'ds_St
2019-11-21T18:50:39.1953091Z Delete dataset 'ds_ET
2019-11-21T18:50:39.2402893Z Delete dataset 'ds_SQ
2019-11-21T18:50:39.2535789Z Delete dataset 'ds_sq
2019-11-21T18:50:39.2671677Z Delete dataset 'ds_po
2019-11-21T18:50:39.3124775Z Delete dataset 'ds_Tr
2019-11-21T18:50:39.7755642Z Found 29 linked service(s).
2019-11-21T18:50:39.7757124Z Delete linked service
2019-11-21T18:50:39.7763369Z Delete linked service
2019-11-21T18:50:39.7767753Z Delete linked service
2019-11-21T18:50:39.7772085Z Delete linked service
2019-11-21T18:50:39.7776120Z Delete linked service
2019-11-21T18:50:39.9168692Z Delete linked service
2019-11-21T18:50:39.9168874Z Delete linked service
2019-11-21T18:50:39.9168999Z Delete linked service
2019-11-21T18:50:39.9169112Z Delete linked service
2019-11-21T18:50:39.9169212Z Delete linked service
2019-11-21T18:50:40.0340951Z Delete linked service
2019-11-21T18:50:40.0362220Z Delete linked service
2019-11-21T18:50:40.0386744Z Delete linked service
2019-11-21T18:50:40.0432193Z Delete linked service
2019-11-21T18:50:40.0465508Z Delete linked service
2019-11-21T18:50:40.1696214Z Delete linked service
2019-11-21T18:50:40.1723678Z Delete linked service
2019-11-21T18:50:40.1756685Z Delete linked service
2019-11-21T18:50:40.1797109Z Delete linked service
2019-11-21T18:50:40.1878150Z Delete linked service
2019-11-21T18:50:40.2916248Z Delete linked service
2019-11-21T18:50:40.3008706Z Delete linked service
2019-11-21T18:50:40.3034396Z Delete linked service
2019-11-21T18:50:40.3130337Z Delete linked service
2019-11-21T18:50:40.3220047Z Delete linked service
2019-11-21T18:50:40.4101383Z Delete linked service
2019-11-21T18:50:40.4183395Z Delete linked service
2019-11-21T18:50:40.4246196Z Delete linked service
2019-11-21T18:50:40.4283415Z Delete linked service
2019-11-21T18:50:40.5932389Z ##[section]Finishing: Delete items from opsadp-d-02-df

Thanks and Regards,
Vijay

how to check if df exists

Before stopping a trigger, I need to check if the data factory exists. What approach should we take to ensure that we only stop the trigger if the specified data factory exists?

I'm not understanding what we are supposed to do on the initial deployment of a data factory. Is this necessarily always a manual step?
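
One minimal sketch for the existence check, assuming an agent with the Az PowerShell modules (the service connection and resource names are illustrative):

steps:
- task: AzurePowerShell@5
  displayName: 'Check whether the data factory exists'
  inputs:
    azureSubscription: 'my-service-connection'   # illustrative name
    azurePowerShellVersion: LatestVersion
    ScriptType: InlineScript
    Inline: |
      # Get-AzDataFactoryV2 errors when the factory is missing; suppress that and test for $null
      $adf = Get-AzDataFactoryV2 -ResourceGroupName 'my-rg' -Name 'my-adf' -ErrorAction SilentlyContinue
      Write-Host "##vso[task.setvariable variable=adfExists]$([bool]$adf)"

Any trigger-stopping step can then be gated with condition: and(succeeded(), eq(variables['adfExists'], 'True')).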

Stop/Start triggers task fails intermittently

Hi,
I was using this task for some time without any issues.
Recently it started throwing errors intermittently (sometimes it works and sometimes it doesn't):
##[error]Unable to find environment with name 'AzureCloud'
##[error]There was an error with the service principal used for the deployment.

Also, we are unable to move this task into a task group.

Could this be fixed at the earliest, please?
Regards,
Prashanth PS

I've faced an issue with "Use Cipheriv for counter mode of aes-256-ctr" for azure-pipelines-task-lib above 2.8.0

Dear Jan,
I've faced the issue with Cipheriv.
The solution is to use crypto.createCipheriv(), not crypto.createCipher().
Could you please make the change?
Logs:

Task         : Azure Data Factory Delete Items
Description  : Delete Azure Data Factory V2 items, like Datasets, Pipelines, Linked Services or Triggers
Version      : 2.1.0
Author       : Jan Pieter Posthuma
Help         : [More Information](https://github.com/liprec/vsts-publish-adf)
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr
(node:3464) Warning: Use Cipheriv for counter mode of aes-256-ctr

Azure Data Factory Deployment with Integration Service as Parameter

We have been trying to queue a build to deploy a Data Factory V2 that uses an integration runtime. When we create this build, it asks for parameters like Subscription, Resource Group Name, Data Factory Name, Version, Path to Linked Services, Path to Datasets, and Path to Pipelines.

When we queue this build, it throws the error below:
Error Code: InputIsMalformedDetailed
Error Message: Input is malformed. Reason: Could not get integration runtime details for integrationRuntimeSFTP2

This is the Integration Runtime we use for a SFTP Linked Service.

Can someone advise how to do this, or is there another plugin we can use?

Deploy fails when dependent LS does not exist yet

When I try to deploy new linked services to an empty Data Factory (all need to be created), the process fails the first time with the following error message:

Found 3 linked service(s) definitions.

Deploy linked service 'LS_ADLS'.
Deploy linked service 'LS_AzureKeyVault'.
Deploy linked service 'LS_SQL_Stackoverflow'.
##[error]Error deploying 'LS_ADLS' linked service : {"error":{"code":"BadRequest","message":"The document creation or update failed because of invalid reference 'ls_azurekeyvault'.","target":"/subscriptions/0278080f-e1af-4ee8-98b3-881a286350aa/resourceGroups/rg-blog-uat/providers/Microsoft.DataFactory/factories/adf-blog-uat/linkedservices/LS_ADLS","details":null}}

I have 3 linked services, where 'LS_AzureKeyVault' should be deployed first, as the others use it to get secrets.
Changing the name (to 'AzureKeyVault') did not help; I thought it might be a sorting problem.
When I reviewed the code, I saw that all tasks are run in 5 parallel threads.
This can be a serious issue when deploying dependent ADF objects.

Let me know if you couldn't reproduce the issue.
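
Until deployment order is configurable, one hedged workaround sketch is to run the deploy task twice, keeping the Key Vault linked service definition in its own folder. The task name and the ServicePath input name below are assumptions by analogy with the other snippets on this page:

steps:
- task: liprec.vsts-publish-adf.deploy-adf-json.deploy-adf-json@2   # assumed task name
  displayName: 'Deploy Key Vault linked service first'
  inputs:
    azureSubscription: 'my-service-connection'   # illustrative name
    ResourceGroupName: rg-blog-uat
    DatafactoryName: adf-blog-uat
    ServicePath: linkedServices/keyvault         # assumed input name

- task: liprec.vsts-publish-adf.deploy-adf-json.deploy-adf-json@2
  displayName: 'Deploy the linked services that reference it'
  inputs:
    azureSubscription: 'my-service-connection'
    ResourceGroupName: rg-blog-uat
    DatafactoryName: adf-blog-uat
    ServicePath: linkedServices/other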

Incomplete deploy when missing resource

When you configure the task to deploy Data Flows, Datasets, and Pipelines, but your source contains no Data Flow artifacts, only the Datasets are deployed and the Pipelines are skipped.
For an automated build/release, you have to be able to configure all the possibilities up front, and this issue makes that impossible.

Deployment of new pipeline is failing first time

Hi,
The first-time deployment of a new pipeline using the ADF deployment task fails; however, the same task works fine on the second attempt.
[error]Error deploying '' pipeline : {"error":{"code":"BadRequest","message":"The document creation or update failed because of invalid reference ''
Thanks
Aman

Azure Data Factory Delete Items

Hi

The delete task - Azure Data Factory Delete Items - behaves randomly:
1. It was not able to find some pipelines.
2. For one pipeline filter, it deleted only 16 pipelines in one run, 11 in the next run, and only 6 in yet another run, out of more than 50 pipelines.
3. The delete task can delete only 50 datasets in one run.

Regards
Rajeev

ADF Pipeline run - Failing - missing file

When running a pipeline from Azure DevOps using the plugin, I get this error:

2019-01-16T16:18:01.3901657Z ##[section]Starting: Trigger pipeline run - Mount Points
2019-01-16T16:18:01.3907565Z ==============================================================================
2019-01-16T16:18:01.3907891Z Task         : Azure Data Factory Trigger Pipeline
2019-01-16T16:18:01.3908095Z Description  : Trigger Azure Data Factory V2 Pipelines
2019-01-16T16:18:01.3908258Z Version      : 2.0.4
2019-01-16T16:18:01.3908407Z Author       : Jan Pieter Posthuma
2019-01-16T16:18:01.3908598Z Help         : [More Information](https://github.com/liprec/vsts-publish-adf)
2019-01-16T16:18:01.3908820Z ==============================================================================
2019-01-16T16:18:01.4179068Z ##[error]File not found: 'C:\vstsagent\A1\_work\_tasks\trigger-adf-pipeline_da9d5e69-e1b0-4af0-8f4b-a236c0c1d55d\2.0.4\./dist/deleteadfitems.js'
2019-01-16T16:18:01.4184965Z ##[section]Finishing: ***

Config: (screenshot omitted)

Stop / Start triggers Syntax Issue

Getting the message below while stopping the triggers:

Starting: Toggle trigger(s) in xxx(stop)

Task : Azure Data Factory Trigger
Description : Start/stop an Azure Data Factory Trigger
Version : 2.3.2
Author : Jan Pieter Posthuma
Help : More Information

(node:6056) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): SyntaxError: Invalid regular expression: /*/: Nothing to repeat
Finishing: Toggle trigger(s) in xxx(stop)
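
The "nothing to repeat" error indicates the filter string is compiled directly as a regular expression, and a bare * is not a valid regex. A hedged workaround, assuming the v2 tasks treat filters as regular expressions, is the equivalent match-everything pattern:

    TriggerFilter: '.*'   # regex equivalent of the glob '*'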

Error in ADF v2 deployment plugin - Unable to deserialize the response

Hi @liprec ,

I used the plugin to deploy ADF v2 in a release pipeline. I created the build and provided the artifact path in the pipeline configuration. Yet, after deleting the existing pipelines and datasets, the new ones are not getting deployed.
Instead, we are getting the error "Unable to deserialize the response".

What is the reason behind this error? Could you please assist?

Hoping for a swift response

Azure Data Factory Pipelines Manage - not able to find the data factory under the resource group

I'm trying to use the "Azure Data Factory Pipelines Manage" deployment step; however, it fails to find the data factory under the resource group, even though the ADF is actually present in that particular resource group.

2019-03-19T13:59:13.8880345Z ##[error]HTTP Status Code: NotFound
Error Code: ResourceNotFound
Error Message: The Resource 'Microsoft.DataFactory/dataFactories/*******' under resource group 'DEV*****RG' was not found.
Request Id: d88a1d8c-6812-49a4-aa01-a9b23d7f0987
Timestamp (Utc):03/19/2019 13:59:13
2019-03-19T13:59:13.9038066Z ##[section]Finishing: Set pipelines to 'suspend' in intplfadf

Suspend is bugged

Copy from marketplace Q&A

Suspend is bugged - when I use this extension, pipelines do not pause, while using an automation script (AzureRM.DataFactories) works.
By George 6/9/2017

##[error]Error deploying 'adla_retaillink_ls.json' ('Value' cannot be null.)

Hi liprec

When deploying ADF artifacts from a GitHub repo, I get the following error.

2018-10-02T01:19:59.7270721Z ##[section]Starting: ADF Deploy JSON files to $(datafactory)
2018-10-02T01:19:59.7275505Z ==============================================================================
2018-10-02T01:19:59.7275634Z Task         : Azure Data Factory Deployment
2018-10-02T01:19:59.7275757Z Description  : Deploy Azure Data Factory Datasets, Pipelines and/or Linked Services using JSON files
2018-10-02T01:19:59.7275848Z Version      : 1.1.11
2018-10-02T01:19:59.7275912Z Author       : Jan Pieter Posthuma
2018-10-02T01:19:59.7276000Z Help         : [More Information](https://github.com/liprec/vsts-publish-adf)
2018-10-02T01:19:59.7276081Z ==============================================================================
2018-10-02T01:20:05.6474402Z ##[command]Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM\2.1.0\AzureRM.psd1 -Global
2018-10-02T01:20:10.1393756Z ##[warning]The names of some imported commands from the module 'AzureRM.Websites' include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
2018-10-02T01:20:10.1670681Z ##[warning]The names of some imported commands from the module '
AzureRM' include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
2018-10-02T01:20:10.2353547Z ##[command]Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM.profile\2.1.0\AzureRM.profile.psd1 -Global
2018-10-02T01:20:10.3036133Z ##[command]Add-AzureRMAccount -ServicePrincipal -Tenant *** -Credential System.Management.Automation.PSCredential -Environment AzureCloud
2018-10-02T01:20:12.4787770Z ##[command]Select-AzureRMSubscription -SubscriptionId 1824bc1e-b99a-4dab-9a84-b0d5f05f83c7 -TenantId ***
2018-10-02T01:20:16.3696856Z Cleared all existing Trigger
2018-10-02T01:20:17.8295335Z Cleared all existing Pipeline
2018-10-02T01:20:19.4243194Z Cleared all existing Dataset
2018-10-02T01:20:20.8037971Z Cleared all existing Linked Service
2018-10-02T01:20:20.8195900Z Start deploying Linked Service
2018-10-02T01:20:20.8222896Z Found 8 Linked Service files
2018-10-02T01:20:21.0838768Z ##[error]Error deploying 'adla_retaillink_ls.json' ('Value' cannot be null.)
2018-10-02T01:20:21.1218503Z ##[section]Finishing: ADF Deploy JSON files to $(datafactory)

Looking at the contents of the ADLA linked service, I don't see anything wrong. Should there be a more informative error to assist validation/research? (Keep in mind I replace some of the values using a JSON patch in my build pipeline; I've masked some of the sensitive parts with **** (prefix kept).)

{
    "name": "adla_retaillink_ls",
    "properties": {
        "type": "AzureDataLakeAnalytics",
        "typeProperties": {
            "accountName": "azueus2devadlaretaillink",
            "servicePrincipalId": "*****4-c442-4e4b-b0ba-6c31be2d657d",
            "servicePrincipalKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "kvlt_dataplatform_ls",
                    "type": "LinkedServiceReference"
                },
                "secretName": "azueus2-adla-adladatalake-spnkey"
            },
            "tenant": "*******-d7f2-403e-b764-0dbdcf0505f6",
            "subscriptionId": "*******-b99a-4dab-9a84-b0d5f05f83c7",
            "resourceGroupName": "azu-eus2-dev-rg-IngestRetailLink"
        }
    }
}

Is the deploy doing a check to see if the keys exist in Key Vault?

Build failing

We are curious about the failing test build. Do you have any more information about this that you can give us, for future use of this extension?

Artifacts use file name, not json name property

Hello,

When creating artifacts, e.g. LinkedServices, it uses the file name.

Line 380 in deploy-adf-json.psm1 - $linkedServiceName = ($JsonFile.Name).Replace('.json', '')

Ideally, it should use the name property from within the JSON:

$linkedServiceName = (Get-Content $JsonFile -Raw | ConvertFrom-Json).name

Unfortunately, due to linked service nesting (when using Azure Key Vault), we need to deploy Key Vault first as it's a dependency. The way we have of doing this is to put a number at the start of each file name in the VSTS repo to enforce the order in which they are deployed, but we do not want this numbering to be included in the deployed artifacts.

Many Thanks,
Mike.

Databricks Linked service fail

We get an "invalid linked service payload" error when deploying a Databricks linked service. More information from the error: the typeProperties nested in the payload is null.

Using file pattern with Azure Data Factory Deployment

Hello,
I'm trying to see if it's possible to use a file pattern to catch only the files ending with WaterMark.json.

I tried different ways, but none seems to work:

$(System.DefaultWorkingDirectory)/_build_datafactory/drop/pipeline/*WaterMark.json

##[error]Error: Task failed while initializing. Error: Not found PipelinePath: d:\a\r1\a_build_datafactory\drop\pipeline*WaterMark.json.

Any ideas?
Thank you.

https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/file-matching-patterns?view=azure-devops#asterisk_examples

Cannot execute pipelines

Dear Jan,

Could you please confirm that the triggering of pipelines was not updated in the last version 2.2, and it works as earlier?
The problem is that starting from v2.2 I cannot trigger DataFactory pipelines using my code (see below). I do not get any error though. Just DataFactory pipelines are not executed.


Please see the triggering code below.
Please note that I can still execute the pipelines directly from Data Factory.

Could you please point me to a possible reason for this issue?

Thanks.

  jobs:

  - job: "Training"
    displayName: "Training"

    steps:
    - task: liprec.vsts-publish-adf.trigger-adf-pipeline.trigger-adf-pipeline@2
      displayName: 'Trigger ADF Training pipeline'
      inputs:
        azureSubscription: aml-workspace-serviceconnection
        ResourceGroupName: dev-ml-rg
        DatafactoryName: dev-adf-ws
        PipelineFilter: BA_Train_Pipeline
        PipelineParameterType: Path
        PipelineParameterPath: azdo-variables-train.json
