- 5402 Views
- 4 replies
- 0 kudos
Override default Personal Compute policy using terraform / disable Personal Compute policy
I want to programmatically make some adjustments to the default Personal Compute policy, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access), when deploying a new workspace usi...
The only way I got it working was by importing the pre-existing policy into Terraform and doing an overwrite, as already mentioned by @jsimonovic. The full code example looks like this: import { id = "001BF0AC280610B4" # Policy ID of the pre-existing person...
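The truncated snippet above can be fleshed out into a minimal sketch, assuming the Databricks Terraform provider; the override values are illustrative, and the policy ID must be the one actually shown in your workspace:

```hcl
# Sketch: adopt the built-in Personal Compute policy into Terraform state
# and override it via its policy family. Override values are placeholders.
import {
  id = "001BF0AC280610B4" # Policy ID of the pre-existing Personal Compute policy
  to = databricks_cluster_policy.personal_compute
}

resource "databricks_cluster_policy" "personal_compute" {
  name             = "Personal Compute"
  policy_family_id = "personal-vm"
  policy_family_definition_overrides = jsonencode({
    autotermination_minutes = { type = "fixed", value = 60 }
  })
}
```

Because the policy already exists in the workspace, the `import` block is what lets `terraform apply` take ownership of it instead of failing on a name collision.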
- 1042 Views
- 1 replies
- 1 kudos
Resolved! Retention for hive_metastore tables
Hi, I have a notebook that creates tables in the hive_metastore with the following code: df.write.format("delta").mode("overwrite").saveAsTable(output_table_name) What is the retention for the data saved in the Hive metastore? Is there any configurati...
Hi mattiags, as long as you do not delete the data via a notebook or in the data lake, it will not be deleted in any other way. This means that there is no retention time in this sense; or, conversely, it is infinite until you deliberately delete the data...
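To make the reply concrete: current table data is kept indefinitely, and the only retention-like setting on a Delta table governs how long *removed* files (e.g. files replaced by an overwrite) survive for time travel before VACUUM may delete them. A hedged SQL sketch, with a placeholder table name and an illustrative window:

```sql
-- Placeholder table name; the 30-day window is illustrative.
-- This only affects files already removed from the table.
ALTER TABLE my_output_table SET TBLPROPERTIES (
  'delta.deletedFileRetentionDuration' = 'interval 30 days'
);

-- Physically deletes removed files older than the retention window.
VACUUM my_output_table;
```

Nothing here deletes live data; dropping the table or deleting the underlying files remains a deliberate action.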
- 3154 Views
- 3 replies
- 0 kudos
Resolved! Databricks on AWS - Changes to your Unity Catalog storage credentials
Hi. Context: On June 30, 2023, AWS updated its IAM role trust policy, which requires updating Unity Catalog storage credentials. Databricks previously sent an email communication to customers in March 2023 on this topic and updated the documentation a...
Thank you for the response @MoJaMa - we will try it out tomorrow and post an update here.
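For readers hitting the same change: the updated trust relationship for a Unity Catalog storage-credential role generally takes the shape below. This is a hedged sketch; the second principal ARN (the role trusting itself) and the external ID are placeholders you must fill from your own account, and the exact Databricks master-role ARN should be taken from the official docs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL",
          "arn:aws:iam::<your-aws-account-id>:role/<this-role-name>"
        ]
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<databricks-account-id>" }
      }
    }
  ]
}
```

The notable part of the 2023 change is that the role must also trust itself (the second principal), which is why pre-existing credentials needed updating.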
- 4180 Views
- 4 replies
- 1 kudos
Resolved! How to use Databricks CLI as a service principal?
Hi all, I have a question about how to use the Databricks CLI in my local environment as a service principal. I have installed the Databricks CLI and configured the file `.databrickscfg` as shown below. [DEFAULT] host = https://adb-123123123.1.azuredatabr...
Got you. I found a working solution, try this one: [devsp] azure_workspace_resource_id = /subscriptions/bc0cd1..././.../Databricks/workspaces/my-workspace azure_tenant_id = bc0cd1... azure_client_id = fa0cd1... azure_client_secr...
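Untruncated, a profile like the one above looks roughly like this; all IDs are placeholders, and the keys are the Azure service-principal fields the unified CLI authentication understands:

```ini
; Hypothetical service-principal profile in ~/.databrickscfg
[devsp]
azure_workspace_resource_id = /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Databricks/workspaces/my-workspace
azure_tenant_id     = <tenant-id>
azure_client_id     = <application-client-id>
azure_client_secret = <client-secret>
```

Select the profile per command, e.g. `databricks clusters list --profile devsp`, or export `DATABRICKS_CONFIG_PROFILE=devsp` for the session.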
- 2187 Views
- 4 replies
- 0 kudos
Thank you so much! I solved this by reinstalling the Chrome browser. I hit this issue last week and could not solve it even after waiting, clearing the cache, restarting, etc., but it worked in another browser. So I reinstalled Chrome and it worked. Thank you.
- 1600 Views
- 4 replies
- 0 kudos
Interface for Databricks CE
My Databricks CE interface does not have the quick guides or toggles for Data Science & Engineering/Machine Learning. This is what it looks like, and I want to see the quick guides.
- 2293 Views
- 1 replies
- 0 kudos
Resolved! Docs/Info for metastore, artifact blob storage endpoints etc for Azure Databricks
Hi! This is probably a very newbie type of question and my Google skills seem to be lacking, but is there any in-depth documentation/explanation of the metastore, artifact blob storage, system tables storage, log blob storage and event hubs services...
Hi, you can check below: System Tables Storage. Purpose: system tables storage is used to store system-level metadata and configuration data for the Azure Databricks workspace. Data Stored: this includes metadata related to the Unity Catalog, cluster c...
- 4051 Views
- 2 replies
- 3 kudos
Resolved! Databricks Asset Bundles + Artifacts + Poetry
Hello, I've configured DABs on our project successfully. Moreover, I could switch from setuptools to poetry almost successfully. In the project's databricks.yml I configured it as the documentation suggested; I just changed the name of the arti...
Hi @Fiabane, could you first check: do you see your .whl file in your artifacts folder? Could you try to install the package by running this in your notebook: %pip install <path to your wheel> As far as I understand, you want to have a job ...
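For comparison, a minimal artifacts stanza for a poetry-built wheel in databricks.yml might look like this; the artifact key is a placeholder name:

```yaml
# Hypothetical sketch: build the wheel with poetry instead of setuptools.
artifacts:
  my_package:
    type: whl
    build: poetry build
    path: .
```

With `type: whl`, `databricks bundle deploy` runs the `build` command and uploads whatever wheel it produces, so the built .whl should appear in the bundle's artifacts folder in the workspace.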
- 1833 Views
- 2 replies
- 0 kudos
Databricks Kryo setup
I would like to consolidate all our Spark jobs in Databricks. One of those jobs, currently running in Azure HDInsight, is not working properly as a Databricks JAR job. It uses Spark 3.3 RDDs and requires configuring Kryo serialisation. There...
Integrating Spark tasks with Databricks can greatly improve your workflow. For tasks that require Kryo serialization, make sure you configure your Spark session correctly. You may need to adjust the serialization settings in your Spark configuration....
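As a starting point, on Databricks the Kryo settings usually go into the cluster's Spark config (Advanced options, Spark tab) rather than into the job code, because the JAR job attaches to an already-running JVM. A hedged sketch; the class names are placeholders for your own registered types:

```
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.kryo.registrationRequired  true
spark.kryo.classesToRegister     com.example.MyEvent,com.example.MyKey
```

If the HDInsight job registered classes programmatically via a KryoRegistrator, the equivalent here is setting `spark.kryo.registrator` to that class on the cluster config.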
- 892 Views
- 1 replies
- 2 kudos
Azure Databricks Classic Compute Plane Firewall
I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled using NSG rules,...
@Jim-Shady wrote:I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how the permissible traffic inbound and outbound is controlled...
- 5263 Views
- 1 replies
- 0 kudos
Resolved! How to deploy to Databricks Assets Bundle from Azure DevOps using Service Principal?
I have a CI/CD process that, after a Pull Request (PR) to main, deploys to staging. It works using a Personal Access Token in Azure Pipelines. From local, deploying using a Service Principal works (https://community.databricks.com/t5/administration-a...
I needed to deploy a job using CI/CD Azure Pipelines without using OAuth; this is the way: first you need to have the Service Principal configured. For that you need to generate it in your workspace; with this you will have: A host: which is your wo...
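One common way to wire this up (a sketch, not necessarily the poster's exact setup) is to pass the service principal's credentials to the CLI through environment variables in the pipeline step; the `$(...)` names on the right are placeholder pipeline secrets:

```yaml
# Hypothetical Azure Pipelines step: deploy a bundle as a service principal.
- script: databricks bundle deploy -t staging
  displayName: Deploy Databricks Asset Bundle
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)            # workspace URL
    DATABRICKS_CLIENT_ID: $(SP_CLIENT_ID)          # service principal application ID
    DATABRICKS_CLIENT_SECRET: $(SP_CLIENT_SECRET)  # secret generated for the SP
```

The CLI picks these variables up automatically, so no `.databrickscfg` or interactive login is needed on the build agent.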
- 2558 Views
- 5 replies
- 0 kudos
Resolved! Unable to Create Job Task Using Git Provider Invalid Path
I am attempting to create a task in a job using the Git Provider as the source, with GitHub as the provider. The repo is a private repo. Regardless of how I enter the path to the notebook, I receive the same error that the notebook path is invalid and o...
As I said in a previous response, this started working automatically a few days ago with no changes on our end. The developer who was working on this decided to try it one more time and it just worked, with no error this time. I don't know if Databri...
- 1354 Views
- 1 replies
- 0 kudos
Bring data from databricks to sharepoint list using the Power Automate
Good afternoon to all; I am new to this community. We are trying to bring data from Databricks to a SharePoint list using the Power Automate app (create a workflow and trigger it when there is a new record or an existing record is modified in the source table in...
Hi all, can anyone assist me with this request? Thanks in advance.
- 858 Views
- 1 replies
- 0 kudos
Tabs for notebooks
Browsing this page of the documentation, the displayed GIF shows a notebook that is opened in its own tab. I've been looking for how to enable this feature in my own workspace, but cannot find it. Does anyone know how to enable this feature?
Nope. It seems that it is some kind of new version of the UI. In the SQL editor one can open multiple tabs, but for Python notebooks I have no idea.
- 1135 Views
- 1 replies
- 0 kudos
How to generate an Azure Subscription from a Databricks Generated Service Principal?
Hello, I currently have a Service Principal (SP) Client_Id and its associated secret, which I generated directly from my workspace in Databricks. I was following this post: https://github.com/databricks/cli/issues/1722, but I don't know how to generate ...