11-28-2023 01:55 AM
02-04-2024 11:02 AM
I am receiving the same error.
I performed the following steps:
1. Created a service principal (SP) on Azure.
2. Granted it the Contributor role on the Azure resource where the Databricks workspace is created.
3. Generated a token using: az account get-access-token --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d
4. Used this token as the bearer token when calling the Databricks token-create API.
5. But in the response I get status code 200 and an HTML body.
In my scenario I am calling a PowerShell inline script in an Azure CI pipeline. I also tried to replicate the same process via Postman and receive the same error.
Note: If I use all of the above information and execute the command locally in PowerShell, it works fine and returns a token.
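For reference, the steps above can be sketched roughly as the shell snippet below. The token values are placeholders, and the az/curl calls are left commented out because they require live Azure credentials and a reachable workspace host:

```shell
# Sketch of the AAD-token -> Databricks-PAT flow from the steps above.
# Placeholders and the hypothetical "ci-pipeline" comment are assumptions.

# Fixed AAD application ID for the Azure Databricks resource (step 3):
DATABRICKS_RESOURCE_ID="2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

# Step 3: get an AAD access token for the service principal.
# AAD_TOKEN=$(az account get-access-token --resource "$DATABRICKS_RESOURCE_ID" \
#             --query accessToken --output tsv)
AAD_TOKEN="<aad-access-token>"   # placeholder for the real token

# Step 4: use it as the bearer token against the workspace's token API.
AUTH_HEADER="Authorization: Bearer ${AAD_TOKEN}"
echo "$AUTH_HEADER"
# curl --request POST "https://${DATABRICKS_HOST}/api/2.0/token/create" \
#      --header "$AUTH_HEADER" \
#      --data '{"lifetime_seconds": 3600, "comment": "ci-pipeline"}'
```

A 200 response whose body is HTML typically means the request was redirected to a sign-in page instead of reaching the REST API endpoint.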
02-04-2024 08:28 PM
You need to set DATABRICKS_TOKEN in the Authorization header:
curl --request GET "https://${DATABRICKS_HOST}/api/2.0/clusters/get?cluster_id=1234-567890-a12bcde3" \
--header "Authorization: Bearer ${DATABRICKS_TOKEN}"
02-04-2024 11:01 PM
Hello @feiyun0112 ,
Thanks for the response. I am already using the header in my cURL call, and I also tried the following command using PowerShell.
02-18-2024 12:47 PM
@TestuserAva
Can you please share your approach to tackling this problem, in case you solved it?
For me it works using PowerShell Invoke-WebRequest but not using Postman.
02-28-2024 11:30 AM
Hey all! I'm having the exact same problem. Did you manage to make it work @Abhishek10745 @TestuserAva ?
Could you please share the solution if you did? Thanks!
02-28-2024 01:06 PM
Hello @SJR ,
In the scenario I mentioned in my previous comment, my CI pipeline was using a pool/scale set which did not have network access to the Azure Databricks service. Hence, when my service principal tried to create a PAT token using the Databricks API (running on the incorrect scale set/pool), it could not log in, so I could not perform any operation using the Databricks API.
When I used the correct scale set/pool, which was given access to the Azure Databricks service, it worked.
Hope this helps you investigate in the correct direction.
02-29-2024 07:28 AM
Hello @Abhishek10745
It was just like you said! We have a completely private instance of Databricks, and the DevOps pipeline I was using didn't have access to the private VNet. Switching pools solved the problem. Thanks for all the help!