- 6744 Views
- 4 replies
- 3 kudos
Resolved! [VNET injection] Container and container subnet
Hi, I was researching everywhere and could not find the answer. I understand that when a workspace is created, it has 2 subnets, host and container. The VM, which runs the Databricks container, is in the host subnet, which logically means the container is a...
Hi @data_bricklayer, Public Subnet (host): The public subnet is typically used for resources that need to communicate with the internet or other Azure services. In Azure Databricks, this subnet is used for driver nodes of the clusters that require ou...
- 6946 Views
- 3 replies
- 2 kudos
Connecting external location from a different tenant in Azure
Hi, we have a setup with 2 different Azure tenants. In tenant A we have a storage account that we want to connect as an external location to a Databricks workspace in tenant B. For that we have established a private endpoint from the storage accou...
It would be superb to connect between two tenants with an Azure Databricks Access Connector.
- 4732 Views
- 1 replies
- 0 kudos
Resolved! Prevent service principal UUID from appearing on job name
Hello! I am using a service principal ID to authenticate my Databricks bundle. But when the job runs, this ID is automatically appended to both the name and tags column on the job runs page. In my databricks.yml file I have name: "[${var.environment}]" ...
Hi! Sounds like the "development" mode. DAB will automatically prefix your job name with <env> <user name> if you set "mode" to "development" in the databricks.yml file. The name lookup for service principals apparently doesn't work nicely and you get ...
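For reference, a minimal sketch of the two target modes in a databricks.yml (bundle and target names here are hypothetical; the prefixing behavior described above applies to "development" mode):

```yaml
# Hypothetical databricks.yml sketch: in "development" mode, DAB prefixes
# resource names with the target and user name; "production" mode does not.
bundle:
  name: my_bundle

targets:
  dev:
    mode: development   # job names get a "[dev <user name>]" style prefix
  prod:
    mode: production    # job names deploy exactly as written
```

Switching the service principal's deployments to a production-mode target is one way to avoid the UUID showing up in the job name.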
- 3419 Views
- 4 replies
- 5 kudos
Why all workspace users can see my user folder
Hi, I am a Databricks account admin user with admin access to our workspace. My user folder for some reason is visible to all workspace users. I have checked permissions settings where possible and cannot see anything that would indicate fully shared...
The workspace is visible to all users by default. You have to make changes in the Admin console; you will find the setting there to disable it.
- 2790 Views
- 1 replies
- 1 kudos
Resolved! How to Pass Azure variable to databricks.yml file
Hello, I would like to find a way to pass a variable from my Azure variables to my databricks.yml file. For example, I would like to pass the variable BUNDLE_TARGET to the location in this databricks.yml file. Is there a way to do something like this?...
Hi @TheManOfSteele, Here are some examples of how to achieve that. I think the simplest way would be to set environment variables. azure devops - How can I pass parameters to databricks.yml in Databricks Asset Bundles? - Stack Overflow; Databricks Asset ...
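As a sketch of the environment-variable approach (variable name here is a placeholder): declare the variable in the bundle, then have the pipeline export it as `BUNDLE_VAR_<variable name>` before running the deploy step.

```yaml
# Hypothetical databricks.yml fragment: a bundle variable with a default,
# overridable from the environment at deploy time.
variables:
  environment:
    description: Deployment environment name
    default: dev

# In the Azure DevOps step, before `databricks bundle deploy`:
#   export BUNDLE_VAR_environment="$(BUNDLE_TARGET)"
```

The CLI reads `BUNDLE_VAR_environment` and substitutes it wherever `${var.environment}` appears in the bundle configuration.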
- 3972 Views
- 0 replies
- 0 kudos
Azure Pipeline Release Bundle Validate Failure
Hello, I am trying to create a CI/CD pipeline. After the build pipeline, I am trying to create a release to Databricks that runs a notebook. I am trying to run this as a service principal. During the bundle validate step I am getting this error: "Erro...
- 2596 Views
- 3 replies
- 0 kudos
Resolved! Workspace selector not working
On the top-right of the Databricks GUI is a selector to select workspaces from. Since yesterday morning (approximately the same time the Microsoft outage happened) that selector stopped working. Instead of a dropdown of workspaces, we only get a spin...
Glad to know the issue has stopped occurring, but in case it recurs in the future, we could collect some backend logs to understand what is causing the slowness and resolve the issue on the service side. I would request you to involve the Azure team...
- 4030 Views
- 2 replies
- 0 kudos
Azure Network Settings in regards to Databricks Table Monitoring
I have set up my Unity Catalog on an Azure Data Lake which uses the company's virtual network to allow access. I have all privileges on my account, so I am able to create, alter or delete catalogs, schemas and tables. I can do these things either usin...
Hi Kaniz, Thanks for the response and for identifying the problem. I would like some steps on how to adjust the network settings, as everything I have tried so far hasn't seemed to work.
- 1970 Views
- 1 replies
- 0 kudos
databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE
I am running into an issue with trying to use a 14.3 cluster with databricks-connect 14.3. My cluster config: { "autoscale": { "min_workers": 2, "max_workers": 10 }, "cluster_name": "Developer Cluster", "spark_version": "14.3.x-scala2...
Are you running the latest version of Databricks Connect?
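Worth checking the access mode too: Databricks Connect for DBR 13+ requires a Unity Catalog-enabled cluster access mode, so `data_security_mode` set to `NONE` (no isolation shared) is itself a likely cause here. A hedged sketch of a compatible cluster config (all values are placeholders, not the poster's actual setup):

```json
{
  "cluster_name": "Developer Cluster",
  "spark_version": "14.3.x-scala2.12",
  "data_security_mode": "USER_ISOLATION",
  "autoscale": { "min_workers": 2, "max_workers": 10 }
}
```

`SINGLE_USER` (assigned) access mode should also work with Databricks Connect; `NONE` will not.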
- 4164 Views
- 2 replies
- 3 kudos
Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)
Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from a base code on Databricks' own GitHub (https://github.com/databrick...
What's the solution for this? I'm facing the same issue.
- 3990 Views
- 2 replies
- 0 kudos
Databricks dashboard programmatically
Hi, How can I create a Databricks dashboard, filters and visuals programmatically (API, Terraform, SDK, CLI...)? Thanks, Pawel
Maybe slightly late (maybe because development was late :P), but hopefully it will also help others. 1. There seems to be support added to the newest Terraform Databricks provider - 1.49.0 - here. 2. Another solution would be to use the Databricks CLI (e.g. `d...
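For the REST route, a hedged sketch of building a dashboard-create request body (the endpoint path and payload shape are assumptions based on the Lakeview API; check the current Databricks REST API reference before relying on them):

```python
import json

def build_dashboard_payload(display_name: str, datasets: list, pages: list) -> dict:
    """Assemble a JSON body for a dashboard-create request.

    The dashboard definition itself travels as a serialized JSON string,
    which is how Lakeview dashboards are represented.
    """
    return {
        "display_name": display_name,
        "serialized_dashboard": json.dumps({"datasets": datasets, "pages": pages}),
    }

# Hypothetical dataset/page content for illustration only.
payload = build_dashboard_payload(
    "Sales overview",
    datasets=[{"name": "sales",
               "query": "SELECT region, SUM(amount) AS total FROM sales GROUP BY region"}],
    pages=[{"name": "main", "layout": []}],
)

# In practice you would POST this to the workspace, e.g.:
#   requests.post(f"{host}/api/2.0/lakeview/dashboards",
#                 headers={"Authorization": f"Bearer {token}"}, json=payload)
```

The Terraform provider and the CLI ultimately drive the same API, so the payload shape is the common denominator across all three options.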
- 1756 Views
- 1 replies
- 2 kudos
List deployed Databricks asset bundles (DABs)?
Is there a databricks cli command or REST API to list all the DABs that have been deployed to a workspace?
Hi @Schofield, Unfortunately, I don't think there is an out-of-the-box command that will provide you this information yet. As a workaround, you can try writing some code that extracts this information from the REST API. For example, you can use /api/2.1/j...
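A sketch of that workaround: jobs deployed by a bundle carry a `deployment` block with kind `BUNDLE` in their settings (an assumption based on what the Jobs API returns for DAB-managed jobs; verify against your own `/api/2.1/jobs/list` output). The filtering step looks roughly like:

```python
def bundle_jobs(jobs: list) -> list:
    """Return only the jobs whose settings mark them as bundle-deployed."""
    return [
        j for j in jobs
        if j.get("settings", {}).get("deployment", {}).get("kind") == "BUNDLE"
    ]

# Sample payload shaped like a jobs/list response (paths are hypothetical).
sample = [
    {"job_id": 1, "settings": {"name": "adhoc job"}},
    {"job_id": 2, "settings": {
        "name": "[dev] etl",
        "deployment": {
            "kind": "BUNDLE",
            "metadata_file_path": "/Workspace/Users/someone/.bundle/etl/metadata.json",
        },
    }},
]
print([j["job_id"] for j in bundle_jobs(sample)])  # -> [2]
```

In practice you would page through `/api/2.1/jobs/list` (and similarly pipelines) and apply this filter, grouping by the metadata file path to recover which bundle each job belongs to.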
- 1986 Views
- 2 replies
- 2 kudos
Resolved! delta sharing issue after enabling predictive optimization
Some of our Delta Sharing tables are not working. It may be related to this, or maybe not: we enabled predictive optimization on all tables a few days ago. They are not working any more, but any new tables created work fine after setting this: SET TBLPROPERTIES (de...
After some debugging, I found a very unique cause: we used a JSON string in a column comment, and it makes sense that a JSON string in a column comment breaks Delta Sharing. Example: column COMMENT {"key": "primary_key", "is_identity": "true"}. The erro...
- 4831 Views
- 2 replies
- 1 kudos
Delta Lake S3 multi-cluster writes - DynamoDB
Hi there! I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood. I understand that a DynamoDB table is used to provide mutual exclusion, but the question is: where is the table located? Is it in...
Hi, could you please help me here? How can I use this configuration in Databricks? So I will maintain my transaction logs there, and in parallel I can use the Delta-RS job. spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3Dynam...
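For context, the full set of Spark conf entries for the DynamoDB-backed log store looks roughly like this (key names are taken from the open-source Delta Lake storage docs; the table name and region are placeholders, and the DynamoDB table lives in your own AWS account, shared by every writer):

```
spark.delta.logStore.s3a.impl                             io.delta.storage.S3DynamoDBLogStore
spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName   delta_log
spark.io.delta.storage.S3DynamoDBLogStore.ddb.region      us-east-1
```

All writers (Spark clusters and, if supported by its version, delta-rs) must point at the same DynamoDB table for the mutual exclusion to hold; a writer that skips the log store config can still corrupt concurrent commits.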
- 5909 Views
- 1 replies
- 0 kudos
Technical Architecture - Feedback
Hello Members, I have designed a Technical Architecture (image attached). I would like some feedback on the current design (especially from 5.1 and onwards) and maybe some more ideas or anything else I can use instead of Azure Service Bus and Cosmos DB...
In step 3 you will want to consider using Databricks Workflows for orchestration. The ADF Databricks notebook activity is not actively developed by Microsoft, and the API it uses is considered legacy by Databricks, so neither vendor is actively supporting t...
Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (79), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)