- 1016 Views
- 1 replies
- 1 kudos
Resolved! Databricks Apps on AWS
Context: Migrating from Azure Databricks (Premium) to AWS Databricks (Premium) in eu‑west‑2 (London) with Unity Catalog attached. On Azure, Databricks Apps are available (Compute → Apps and New → App). Goal: run the same Streamlit apps on AWS. What we...
[Resolved]. The root cause was that Databricks Apps (including Serverless App) are not available in all AWS regions. My primary workspace was created in eu-west-2 (London), where this feature is not supported. After creating a new workspace in eu-west...
- 1131 Views
- 5 replies
- 1 kudos
Cross-region serverless compute network access to Azure storage account
Given we have Unity Catalog and an Azure Databricks workspace, both in the Azure West US region, and we want to allow serverless compute to access data in catalogs that use external locations on an Azure Storage account in West US 3, how can we get t...
Turns out we don't have a Databricks-managed NAT Gateway because our workspace is deployed in our own VNet and we have SCC enabled. I opened a ticket with Microsoft Support and will be working with them today; if we get it figured out I'll share the i...
- 1864 Views
- 6 replies
- 2 kudos
Resolved! Delta Sharing Error from Azure Databricks - "received more than two lines"
Hello, I am trying to query a Delta table located on AWS S3 from Azure Databricks using Delta Sharing. My setup includes a Delta Sharing server running on AWS Fargate. The server itself is running correctly, and I can successfully query it from my loca...
I was able to connect after changing the delta-sharing-server version to 1.1.0.Thank you for your kind help!
- 1242 Views
- 2 replies
- 0 kudos
Terraform Databricks Integration - especially for Unity Catalog in AWS S3
We are attempting to provision Unity Catalog using Terraform, but we're encountering issues with establishing authentication with AWS through IAM Roles and Policies. For EC2/Cluster instances, the instance profile works fine with a trust relationship ...
Is this work tested? I'm getting an error:
Error: Self-referential block
  on index.tf line 31, in resource "aws_iam_role" "reader":
  31: "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/${aws_iam_role.reader.name}"
Conf...
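For anyone hitting the same error: Terraform cannot reference a resource's own attributes inside that resource, which is exactly what `${aws_iam_role.reader.name}` inside the reader role's own trust policy does. One workaround is to fix the role name in a local and build the self-referencing ARN as a plain string. A rough sketch under that assumption (the role and local names here are hypothetical):

```hcl
locals {
  # Fix the name up front so the ARN can be built without
  # referencing aws_iam_role.reader inside its own definition.
  reader_role_name = "uc-reader" # hypothetical name
}

data "aws_caller_identity" "current" {}

resource "aws_iam_role" "reader" {
  name = local.reader_role_name

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect = "Allow"
      Action = "sts:AssumeRole"
      Principal = {
        AWS = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/${local.reader_role_name}"
      }
    }]
  })
}
```

Note that IAM validates trust-policy principals at create time, so a role that trusts itself may still need two applies (create the role first, then add the self-trust); as far as I recall, the Unity Catalog setup docs describe this two-step pattern.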
- 1020 Views
- 5 replies
- 5 kudos
Resolved! Recent Databricks UI issue with mouse
For the past week or so, I've been having a weird UI issue with Databricks. Frequently, when I try to select text within a notebook, by dragging the mouse, it behaves as if I have clicked on a notebook within the LHS explorer pane, and loads it. So, ...
Thanks for the tip! I'm a bit surprised that I've only just started triggering this - and am now seeing it all the time. But yes, closing the explorer window is probably the way to go if it can't be fixed
- 3982 Views
- 14 replies
- 1 kudos
Running jobs as service principal, while pulling code from Azure DevOps
In our Dataplatform, our jobs are defined in a dataplatform_jobs.yml within a Databricks Asset Bundle, and then pushed to Databricks via an Azure DevOps Pipeline (Azure DevOps is where our codebase resides). Currently, this results in workflows looki...
Hi @LuukDSL, have you tried the solution I provided above?
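For anyone searching later: a bundle can declare a top-level `run_as` so deployed jobs run under a service principal, while the job still pulls code from Azure DevOps through `git_source`. A minimal, hedged sketch (the organisation, repo, and application ID below are placeholders):

```yaml
# databricks.yml fragment; all names and IDs below are placeholders
bundle:
  name: dataplatform

run_as:
  service_principal_name: "00000000-0000-0000-0000-000000000000"

resources:
  jobs:
    example_job:
      name: example_job
      git_source:
        git_url: https://dev.azure.com/my-org/my-project/_git/my-repo
        git_provider: azureDevOpsServices
        git_branch: main
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: notebooks/main  # path relative to the repo root
```

The service principal also needs its own credential to reach the DevOps repo, typically a Microsoft Entra ID token or a DevOps PAT registered for it.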
- 876 Views
- 4 replies
- 4 kudos
Resolved! %run command gives error on free edition
Hi, I'm testing out running one Notebook from another using the %run magic command in the Databricks Free Edition. Just really simple test stuff but get the following error: Failed to parse %run command: string matching regex '\$[\w_]+' expected but 'p...
Hello @JonnyData! This parsing error typically appears when the notebook path isn't in the expected format. Could you share the exact %run command you're using? Also, please ensure the path is a workspace path, either absolute (starting with /) or rel...
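For reference, this parse error usually means the cell contains more than just the path, or the path is quoted. A sketch of the expected shape (the notebook name is hypothetical, and %run has to be the only command in its cell):

```
# Relative to the calling notebook (path unquoted):
%run ./helpers

# Or an absolute workspace path:
%run /Workspace/Users/someone@example.com/helpers
```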
- 920 Views
- 2 replies
- 1 kudos
Databricks SDK version installed in serverless compute differs
Hello, I've encountered an issue with my Python notebook where app.list() is failing in some serverless compute clusters but works fine in others. After investigating further, I noticed the following version differences: Working Cluster: SDK version 0...
Oh, I found the doc and confirmed that the list() method was added in version 0.27.0: "Added list() method for w.apps workspace-level service." https://github.com/databricks/databricks-sdk-py/blob/v0.40.0/CHANGELOG.md Now, how can I update this to newer versi...
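If others run into this, a notebook-scoped upgrade is usually the quickest fix on serverless; something along these lines:

```
%pip install --upgrade "databricks-sdk>=0.27.0"

# In the next cell, restart Python so the new version is imported
dbutils.library.restartPython()
```

On serverless you can also pin the dependency in the notebook's Environment side panel instead of per-cell installs.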
- 4958 Views
- 4 replies
- 2 kudos
Databricks runtime and Java Runtime
The Databricks runtime is shipped with two Java Runtimes: JRE 8 and JRE 17. While the first one is used by default, you can use the environment variable JNAME to specify the other JRE: JNAME: zulu17-ca-amd64. FWIW, AFAIK JNAME is available since DBR 1...
@Witold Thanks for the original post here. Any luck with jdk-21 on DBR-17? I'm using some java-17 features in the code alongside spark-4.0.0 which I wanted to run on DBR-17. Sadly the generic jname=zulu21-ca-amd64 did not work for me. I also tried oth...
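For reference, JNAME is set as a cluster environment variable (cluster settings, Advanced options, Spark, Environment variables), along the lines of:

```
JNAME=zulu17-ca-amd64
```

As far as I can tell, JNAME can only select a JDK that is actually bundled in the DBR image, which would explain why zulu21-ca-amd64 fails on images that only ship 8 and 17.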
- 3338 Views
- 1 replies
- 0 kudos
How can I install maven coordinates using init script?
Hi, I need to install the below Maven coordinates on the clusters using Databricks init scripts.
1. coordinate: com.microsoft.azure:synapseml_2.12:0.11.2 with repo https://mmlspark.azureedge.net/maven
2. coordinate: com.microsoft.azure:spark-mssql-conne...
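Cluster-scoped Maven libraries are the simpler route for coordinates, since they resolve transitive dependencies for you. If it must be an init script, one approach is to download the jars into /databricks/jars at startup; a rough sketch with an illustrative URL (verify the exact artifact path against the repo before relying on it):

```bash
#!/bin/bash
set -euo pipefail

# Databricks adds jars in this directory to the driver/executor classpath
DEST=/databricks/jars

# synapseml from its custom repo; illustrative path, verify before use
wget -q -P "$DEST" \
  "https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml_2.12/0.11.2/synapseml_2.12-0.11.2.jar"
```

Note this fetches only the listed jar, not its dependency tree, which is why library-based installation is usually preferable for packages like synapseml.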
- 1350 Views
- 4 replies
- 2 kudos
Asset Bundle on Free Edition
I am trying to use Databricks Asset Bundles in the workspace in my Free Edition account. I can deploy the bundle from my own computer, but I want to use the feature within Databricks. I am able to clone an empty repo to a Git folder and create the bu...
Hi @daboncanplay, Databricks Asset Bundles rely on Terraform under the hood to manage the Databricks resources the bundle defines. It automatically downloads the binaries it needs. If it can't, you'll get an error like the one you see. This looks like...
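If the download is blocked (restricted network), the bundle tooling can reportedly be pointed at pre-staged binaries via environment variables; treat the exact names and versions below as something to confirm against the CLI docs:

```
export DATABRICKS_TF_VERSION=1.5.5
export DATABRICKS_TF_EXEC_PATH=/usr/local/bin/terraform
export DATABRICKS_TF_PROVIDER_VERSION=1.62.0
export DATABRICKS_TF_CLI_CONFIG_FILE="$HOME/.databricks/bundle/tf.rc"
```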
- 433 Views
- 1 replies
- 0 kudos
Delta Live Tables not enabled, nor is feature enablement in account settings
I am the account admin of a Premium workspace and I need Delta Live Tables enabled. I created a support request, but it was closed automatically due to lack of a support contract, despite being a Premium customer. There is no way in the UI to enable th...
Hi @Anonymous, are you sure? If you have a Premium workspace you shouldn't need to do anything; DLT should already be there. But it was renamed recently, so it's now called Lakeflow Declarative Pipelines. Check whether you can see the ETL Pipelines button in the UI. Go to:
- Jobs...
- 2037 Views
- 6 replies
- 4 kudos
Requesting Assistance with /billing-profiles API Endpoint Access - 503 TEMPORARILY_UNAVAILABLE
Hi, I am trying to use the `/billing-profiles` endpoint from the Accounts API to retrieve billing profile and currency information. I am authenticated using OAuth2 client credentials, but I am consistently getting the following error: Status Code: 503 ...
Hi @Advika, yes, I am able to access the workspace endpoint using the OAuth2 setup. Our Databricks account is in the ap-southeast-2 region. Please let me know if any additional information is needed.
- 830 Views
- 2 replies
- 0 kudos
Azure Databricks databricks-cli authentication with M2M using environment variables
Which environment variables do I have to set to use the databricks-cli with M2M OAuth using Microsoft Entra ID managed service principals? I already added the service principal to the workspace. I found the following documentation, but I am still conf...
Hi @ashraf1395, I'm trying to use environment variables to configure the Databricks CLI, instead of relying on the .databrickscfg file. @Chris2794 Have you found a way to authenticate using Machine-to-Machine (M2M) OAuth with Microsoft Entra ID manage...
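For what it's worth, with a Microsoft Entra ID service principal the CLI's azure-client-secret auth is driven by environment variables roughly like these (the host and IDs below are placeholders):

```
export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
export ARM_TENANT_ID="<entra-tenant-id>"
export ARM_CLIENT_ID="<service-principal-application-id>"
export ARM_CLIENT_SECRET="<client-secret>"
export DATABRICKS_AUTH_TYPE="azure-client-secret"  # optional, makes the method explicit

# Quick sanity check
databricks current-user me
```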
- 6531 Views
- 33 replies
- 5 kudos
Databricks to SFTP: Connection Fails Even with Whitelisted NAT Gateway IP
Hi community, I'm experiencing a strange issue with my connection from Databricks to an SFTP server. I provided them with an IP address created for Databricks via a NAT gateway, and that IP is whitelisted on their side. However, even though I have the ...
Hi @szymon_dybczak, thanks for the suggestion. I tried to attach the IP address to the NAT, but that requires the NAT itself to have an internet routing preference, and Azure doesn't allow that. Quite a unique scenario indeed. We have created a workaround n...
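When debugging this kind of failure, it helps to confirm from a notebook on the affected cluster both the egress IP the server actually sees and raw reachability of port 22 (the hostname below is a placeholder):

```
%sh
# Which public IP does outbound traffic use? Compare it with the whitelisted NAT IP.
curl -s https://ifconfig.me && echo

# Is the SFTP port reachable at all?
nc -vz sftp.example.com 22
```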
| Tag | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 59 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |