
Handling Bearer Tokens in Azure Pipeline for HTTP Objects

In Azure Data Factory, I have a pipeline with an HTTP object set up to copy data from an API. It used to rely on Basic authentication (username and password), but the API now requires a bearer token to authorize calls. I've coded a working solution in Python, but I don't know how to get Azure Data Factory to handle this authentication process in the Copy activity.

Is there a way to request the bearer token earlier in the pipeline and then pass it in as the HTTP linked service password?

Python Script:

import http.client
import json

conn = http.client.HTTPSConnection("www.url.com")

# Request an access token using the Basic credentials
headers = {
    'authorization': "Basic [removed]",
    'cache-control': "no-cache",
}
conn.request("GET", "/v1/oauth2/accesstoken?grant_type=client_credentials", headers=headers)
res = conn.getresponse()
data = res.read()

# Extract the bearer token from the JSON response
datajson = json.loads(data.decode("utf-8"))
headers = {
    'authorization': "Bearer " + datajson["access_token"],
    'cache-control': "no-cache",
}

# Call the data endpoint with the bearer token
conn.request("GET", "/data?data-date=2018-12-09", headers=headers)
res = conn.getresponse()
data = res.read()

print(data.decode("utf-8"))


1 Answer

Unfortunately, according to Copy data from an HTTP endpoint by using Azure Data Factory, the HTTP connector only supports the following authentication methods: Anonymous, Basic, Digest, Windows, or ClientCertificate.

However, you might be able to work around this by using the additionalHeaders property of the dataset to pass the bearer token to the HTTP endpoint, as in the sketch below.
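A rough sketch of what such an HTTP dataset definition might look like, assuming a hypothetical dataset parameter named BearerToken that the pipeline supplies at runtime; the dataset name, linked service name, and relative URL are illustrative:

{
    "name": "ApiHttpDataset",
    "properties": {
        "type": "HttpFile",
        "linkedServiceName": {
            "referenceName": "ApiHttpLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "BearerToken": { "type": "String" }
        },
        "typeProperties": {
            "relativeUrl": "/data?data-date=2018-12-09",
            "requestMethod": "GET",
            "additionalHeaders": {
                "value": "@concat('Authorization: Bearer ', dataset().BearerToken)",
                "type": "Expression"
            }
        }
    }
}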

To obtain the token (and possibly even the data itself), you could use a Web activity in Azure Data Factory to perform the HTTP requests; see the sketch after this paragraph.
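A minimal sketch of a Web activity that requests the token, mirroring the first request in the Python script; the activity name and URL are illustrative:

{
    "name": "GetToken",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://www.url.com/v1/oauth2/accesstoken?grant_type=client_credentials",
        "method": "GET",
        "headers": {
            "authorization": "Basic [removed]"
        }
    }
}

A downstream Copy activity (or the BearerToken dataset parameter shown above) could then reference the token with an expression such as @activity('GetToken').output.access_token.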

Hope it helps!
