Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

Sas
by New Contributor II
  • 1489 Views
  • 3 replies
  • 0 kudos

Resolved! Table is being dropped when cluster terminates in Community Edition

Hi Expert, I have created an external table in Databricks Community Edition. But when the cluster is terminated, I am no longer able to query the table. What is the reason, and what do I need to do so that the table is not dropped? Table...

Latest Reply
venkateshgunda
New Contributor III
  • 0 kudos

Separation of Storage and Compute: Databricks separates storage from compute. Data stored in DBFS (Databricks File System) or external storage systems is persistent and not tied to the lifecycle of a cluster. When you create tables and databases in Dat...

2 More Replies
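The storage/compute separation described in the reply above is also why an external table with an explicit `LOCATION` survives cluster termination: only the metastore entry references the files. A minimal sketch (the table name, columns, and path are hypothetical examples, not from the thread) of the DDL you would run in a notebook:

```python
def external_table_ddl(table: str, path: str, columns: str) -> str:
    """Build a CREATE TABLE ... LOCATION statement for an external table.

    Data at `path` outlives any cluster; dropping the table only removes
    the metastore entry, not the underlying files.
    """
    return (
        f"CREATE TABLE IF NOT EXISTS {table} ({columns}) "
        f"USING DELTA LOCATION '{path}'"
    )

ddl = external_table_ddl(
    "my_db.events",          # hypothetical table name
    "dbfs:/mnt/raw/events",  # hypothetical persistent storage path
    "id INT, ts TIMESTAMP",
)
# In a notebook you would then run: spark.sql(ddl)
```

Because the path is explicit, a new cluster can re-register or re-query the same data after the old cluster is gone.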
Frustrated_DE
by New Contributor II
  • 335 Views
  • 1 reply
  • 1 kudos

Delta live table segregation

Hi, I've recently been prototyping on Databricks and was hoping to develop DLT pipelines in a medallion architecture, but with isolation of the bronze, silver, and gold layers in different Unity Catalog catalogs for security purposes. At the moment, is there a lim...

Latest Reply
tyler-xorbix
New Contributor III
  • 1 kudos

Hi @Frustrated_DE, this seems to be a long-requested feature based on this previous post: Solved: Re: DLT pipeline - Databricks Community - 45740. An alternative solution may be to delegate permissions at the table level for this pipeline specifically. ...

thiagoawstest
by Contributor
  • 411 Views
  • 0 replies
  • 0 kudos

Read AWS Secrets Manager secrets with boto3

Hello, I need to read secret values from AWS Secrets Manager; reading via Python using boto3, I can retrieve the secret. I need to run it in a notebook. Is it possible to read the secret without supplying credentials, using only the IAM role? I tried to add...

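For the question above: when a cluster's instance profile grants `secretsmanager:GetSecretValue`, boto3 picks up the IAM role from the environment, so no keys need to appear in the notebook. A sketch under that assumption (region and secret name are hypothetical); the parsing helper follows the response shape the AWS sample code handles:

```python
import base64
import json

def extract_secret(response: dict):
    """Pull the payload out of a Secrets Manager get_secret_value response.

    Secrets Manager returns either a SecretString (often JSON) or a
    base64-encoded SecretBinary.
    """
    if "SecretString" in response:
        try:
            return json.loads(response["SecretString"])
        except json.JSONDecodeError:
            return response["SecretString"]
    return base64.b64decode(response["SecretBinary"])

# On a cluster whose IAM role allows secretsmanager:GetSecretValue,
# boto3 resolves credentials automatically -- no keys in the notebook:
#
#   import boto3
#   client = boto3.client("secretsmanager", region_name="us-east-1")
#   secret = extract_secret(client.get_secret_value(SecretId="my/app/secret"))

# Demonstration with a fake response of the same shape:
fake = {"SecretString": json.dumps({"user": "app", "password": "x"})}
creds = extract_secret(fake)
```

If boto3 still asks for credentials, the usual culprit is that the instance profile is attached to the workspace but not to the specific cluster.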
thiagoawstest
by Contributor
  • 735 Views
  • 2 replies
  • 0 kudos

Create a Databricks scope by reading AWS Secrets Manager

Hi, I have Databricks on AWS, and I created some secrets in AWS Secrets Manager; I need to create scopes based on AWS Secrets Manager. With Azure Key Vault, creating a scope uses the option --scope-backend-type AZURE_KEYVAULT, bu...

Latest Reply
Yeshwanth
Honored Contributor
  • 0 kudos

Hi @thiagoawstest, Step 1: Create Secret Scope. You can create a secret scope using the Databricks REST API as shown below: python import requests import json # Define the endpoint and headers url = "https://<databricks-instance>/api/2.0/secrets/scope...

1 More Replies
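The reply's code is truncated, so here is a sketch of the same idea: building the pieces of a create-scope call against the Secrets API 2.0. The host and token are placeholders you must supply; the endpoint path and payload fields follow the documented API shape:

```python
import json

def create_scope_request(host: str, scope: str, token: str):
    """Build URL, headers, and body for the Databricks Secrets API
    'create secret scope' call (API 2.0)."""
    url = f"{host}/api/2.0/secrets/scopes/create"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {"scope": scope, "initial_manage_principal": "users"}
    return url, headers, payload

url, headers, payload = create_scope_request(
    "https://<databricks-instance>", "aws-secrets", "<token>")

# To actually create the scope:
#   import requests
#   resp = requests.post(url, headers=headers, data=json.dumps(payload))
#   resp.raise_for_status()
```

Note that this creates a Databricks-backed scope; unlike `--scope-backend-type AZURE_KEYVAULT` on Azure, there is no equivalent AWS Secrets Manager backend, so syncing values from Secrets Manager into the scope would be a separate step (e.g. via boto3 plus the `secrets/put` endpoint).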
lurban
by New Contributor
  • 5126 Views
  • 6 replies
  • 0 kudos

Creating Databricks Alerts/Notifications without using the UI

I am trying to build a Databricks notebook that can dynamically create alerts using the Databricks API (documentation linked here: https://docs.databricks.com/api/azure/workspace/alerts/list). I can successfully create an alert; however, what is mis...

Latest Reply
subbaram
New Contributor II
  • 0 kudos

Hi Saugat, is there a way that I can get the owner of a schema/table dynamically and send out a notification to them using the REST API?

5 More Replies
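For the alert-creation part of this thread, a request body can be assembled in the notebook and posted with any HTTP client. The field names below follow the legacy/preview SQL Alerts API shape; treat them as an assumption and check them against the API version your workspace exposes (the query ID and alert name here are placeholders):

```python
def alert_payload(name: str, query_id: str, column: str, op: str, value):
    """Assemble a create-alert request body for the Databricks SQL
    Alerts API (legacy/preview field names -- verify for your workspace)."""
    return {
        "name": name,
        "query_id": query_id,
        "options": {"column": column, "op": op, "value": value},
    }

body = alert_payload("nightly-rowcount", "<query-id>", "row_count", "<", 1)

# POST this to https://<workspace>/api/2.0/preview/sql/alerts with a
# bearer token, e.g.:
#   requests.post(f"{host}/api/2.0/preview/sql/alerts",
#                 headers={"Authorization": f"Bearer {token}"}, json=body)
```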
KrzysztofPrzyso
by New Contributor III
  • 7570 Views
  • 3 replies
  • 1 kudos

databricks-connect, dbutils, abfss path, URISyntaxException

When trying to use `dbutils.fs.cp` in the #databricks-connect context to upload files to Azure Data Lake Gen2, I get a malformed URI error. I have used the code provided here: https://learn.microsoft.com/en-gb/azure/databricks/dev-tool...

Data Engineering
abfss
databricks-connect
Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @KrzysztofPrzyso, It appears that you’re encountering an issue with relative paths in absolute URIs when using dbutils.fs.cp in the context of Databricks Connect to upload files to Azure Data Lake Gen2. Let’s break down the problem and explore po...

2 More Replies
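A `URISyntaxException` with `dbutils.fs.cp` very often means the source or destination path is relative or malformed rather than a fully qualified `abfss://<container>@<account>.dfs.core.windows.net/<path>` URI. A rough pre-flight check (a sketch, not official validation; the example URIs are hypothetical):

```python
from urllib.parse import urlparse

def is_valid_abfss(uri: str) -> bool:
    """Rough check that a path is an absolute abfss:// URI of the form
    abfss://<container>@<account>.dfs.core.windows.net/<path>."""
    p = urlparse(uri)
    return (
        p.scheme == "abfss"
        and p.hostname is not None
        and p.hostname.endswith(".dfs.core.windows.net")
        and "@" in p.netloc          # container@account present
        and p.path.startswith("/")   # absolute path within the container
    )

good = is_valid_abfss("abfss://data@myacct.dfs.core.windows.net/raw/f.csv")
bad = is_valid_abfss("raw/f.csv")  # relative path -> malformed-URI failure
```

Validating both arguments before calling `dbutils.fs.cp(src, dst)` makes the error source obvious when running through Databricks Connect, where local relative paths are easy to pass by accident.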
564824
by New Contributor II
  • 4560 Views
  • 6 replies
  • 0 kudos

Resolved! Why is Photon increasing DBU used per hour?

I noticed that enabling Photon acceleration increases the number of DBUs used per hour, which in turn increases our cost. In light of this, I am interested in gaining clarity on the costing of Photon acceleration, as I was led to believe that Pho...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Well, that depends on what kinds of tests you do. In data warehousing there are different kinds of loads. What have you tested, data transformations or analytical queries? For the latter, Databricks SQL is a better choice than a common Spark ...

5 More Replies
high-energy
by New Contributor III
  • 613 Views
  • 1 reply
  • 0 kudos

Resolved! Accessing a series in a DataFrame

Frequently I see this syntax to access a series in DBX: df['column_name']. However, the output I get from that is Column<'derived_value'>. What's the correct way to access a series?

Latest Reply
high-energy
New Contributor III
  • 0 kudos

I realized I was looking at the wrong DataFrame type. I needed a pandas DataFrame, not a Databricks (Spark) DataFrame.

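The distinction in this resolution can be shown concretely: in PySpark, `df["col"]` is an unevaluated `Column` expression (hence the `Column<'derived_value'>` output), while a pandas DataFrame gives back a `Series` holding actual data. A small sketch (column name taken from the thread; the Spark DataFrame in the comment is hypothetical):

```python
import pandas as pd

# pandas: indexing a column returns a Series with the actual values.
pdf = pd.DataFrame({"derived_value": [1, 2, 3]})
series = pdf["derived_value"]

# PySpark, by contrast, returns a lazy Column expression; to get a
# Series from a Spark DataFrame (hypothetical `spark_df`), collect first:
#   series = spark_df.select("derived_value").toPandas()["derived_value"]
```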
akshayauser
by New Contributor
  • 773 Views
  • 2 replies
  • 1 kudos

Create a table name without backticks when using SET variable

When I tried to create a table name with a variable like this: -- Set a string variable SET table_suffix = 'suffix'; -- Use dynamic SQL to create a table with the variable as a suffix in the table name CREATE TABLE IF NOT EXISTS <dbname>.my_table_${table_su...

Latest Reply
brockb
Valued Contributor
  • 1 kudos

Hi, it's possible that the `IDENTIFIER` clause is what you're looking for (https://docs.databricks.com/en/sql/language-manual/sql-ref-names-identifier-clause.html#identifier-clause). If so, this basic example should work: DECLARE mytab = '`mycatalog`....

1 More Replies
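The reply's example is truncated, so here is a sketch of the same approach from Python: compose the suffixed name as an ordinary string and let the `IDENTIFIER` clause bind it as one identifier, so no manual backticks are needed. The database name and suffix are hypothetical, and the parameterized `spark.sql` form in the comment assumes a runtime that supports named parameters (treat that as an assumption to verify):

```python
def table_ref(db: str, suffix: str) -> str:
    """Compose a table name to pass as a single identifier parameter.
    `db` and `suffix` are placeholder example values."""
    return f"{db}.my_table_{suffix}"

name = table_ref("mydb", "suffix")

# With the IDENTIFIER clause the whole value is treated as one
# identifier -- no backticks required in the composed string:
#   spark.sql("CREATE TABLE IF NOT EXISTS IDENTIFIER(:tbl) (id INT)",
#             args={"tbl": name})
```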
nehaa
by New Contributor II
  • 659 Views
  • 1 replies
  • 0 kudos

Filter in DBX dashboards

How do I add a column from Table1 as a filter to Table2 (also called an on-click action filter) in Databricks dashboards? Both tables get their data through SQL queries.

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

To add a column from Table1 as a filter to Table2 in Databricks Dashboards, you can use the dashboard parameters feature. Here are the steps: Create a visualization for each of your SQL queries. You can do this by clicking the '+' next to the Resul...

high-energy
by New Contributor III
  • 1381 Views
  • 3 replies
  • 2 kudos

Resolved! Union and Column data types

I have three data frames that I create in Python. I want to write all three of these to the same Delta table. In code I bring the three of them together using the union operation. When I do this, the data in the columns is no longer aligned correctly. I...

Latest Reply
high-energy
New Contributor III
  • 2 kudos

Aligning the data types and column order across all three data frames before attempting to union them solved the problem. The snippet below highlights what was happening: data = [[2021, "test", "Albany", "M", 42]] df1 = spark.createDataFrame...

2 More Replies
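The misalignment in this thread comes from positional union: rows are glued together column-by-position, so frames whose columns are in different orders get scrambled. A minimal pure-Python illustration of the failure mode (the column names echo the thread's snippet; the values are made up), with the PySpark fix noted in comments:

```python
# Two "frames" with the same columns in different orders.
cols1 = ["year", "name", "city"]
row1 = [2021, "test", "Albany"]

cols2 = ["name", "city", "year"]
row2 = ["demo", "Troy", 2022]

positional = dict(zip(cols1, row2))  # wrong: labels row2 with cols1's order
aligned = dict(zip(cols2, row2))     # right: labels row2 with its own schema

# In PySpark, the analogous fix is to union by name instead of position:
#   combined = df1.unionByName(df2)
# or reorder explicitly first: df1.union(df2.select(*df1.columns)).
```

Matching data types across the frames matters for the same reason: positional union will happily stack an `int` column onto a `string` column if the schemas line up by position.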
nistrate
by New Contributor III
  • 6905 Views
  • 2 replies
  • 5 kudos

Resolved! Restricting Workflow Creation and Implementing Approval Mechanism in Databricks

Hello Databricks Community, I am seeking assistance in understanding whether and how a workflow restriction mechanism can be implemented in Databricks. Our aim is to promote better workflow management and ensure the quality of the notebooks ...

Latest Reply
Avvar2022
Contributor
  • 5 kudos

I believe this has to happen in two steps. Step 1: Currently admins can't restrict workflow creation in Databricks; any user with workspace access can create workflows. Admins should be able to restrict workflow creation. Databricks doesn't have...

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group