Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

Ayur
by New Contributor II
  • 3892 Views
  • 3 replies
  • 4 kudos

Resolved! Unsupported_operation : Magic commands (e.g. %py, %sql and %run) are not supported with the exception of %pip within a Python notebook. Cells containing magic commands are ignored - DLT pipeline

Hi, I'm trying to use a magic command (to switch to Python in a notebook whose default language is SQL) in a DLT pipeline. When starting the pipeline, cells containing magic commands are ignored, with the warning message below: "Magic commands (e.g. %py, ...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Yassine Dehbi, hope all is well! Just wanted to check in if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...

2 More Replies
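The warning in the thread above says DLT skips every magic-command cell except %pip. As an illustration only (this is not Databricks' actual implementation), the rule can be sketched as a small filter over notebook cells:

```python
# Illustrative sketch (not Databricks' internal code): mimic the DLT rule that
# cells starting with a magic command are ignored, with %pip as the only exception.
def dlt_keeps_cell(cell_source: str) -> bool:
    """Return True if a DLT pipeline would execute this notebook cell."""
    first_line = cell_source.lstrip().splitlines()[0] if cell_source.strip() else ""
    if not first_line.startswith("%"):
        return True                        # plain code in the default language
    return first_line.startswith("%pip")   # %pip is the only allowed magic

cells = ["%sql SELECT 1", "%py print('hi')", "%pip install requests", "import dlt"]
kept = [c for c in cells if dlt_keeps_cell(c)]
# kept == ["%pip install requests", "import dlt"]
```

The practical workaround is therefore to keep one language per notebook in the pipeline, or to call `spark.sql(...)` from Python instead of using `%sql`.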
chanda02
by New Contributor II
  • 1735 Views
  • 2 replies
  • 0 kudos

GRANT permission does not work on column level access control for table in unity catalog

I am trying to run the SQL command below to grant column-specific control to a user, but it throws an error. I have granted USE_CATALOG on the catalog and USE_SCHEMA on the schema for the user, and have set the current catalog and schema. I am using Databricks...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, you can find the best practices here: https://docs.databricks.com/data-governance/unity-catalog/best-practices.html. Also, for cluster access modes for Unity Catalog: https://docs.databricks.com/data-governance/unity-catalog/index.html#cluster-sec...

1 More Replies
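Unity Catalog GRANTs apply at the securable (table/view) level, so the usual pattern for column-level control is a dynamic view that masks columns per group, then granting SELECT on the view only. A hedged sketch that builds such a statement (all catalog, schema, table, and group names are hypothetical placeholders; `is_account_group_member` is the Unity Catalog function for group checks):

```python
# Hedged sketch: column-level control via a dynamic view. Names below are
# placeholders, not from the original post.
def masked_view_sql(catalog: str, schema: str, table: str, group: str) -> str:
    return f"""
CREATE OR REPLACE VIEW {catalog}.{schema}.{table}_masked AS
SELECT
  id,
  CASE WHEN is_account_group_member('{group}') THEN email
       ELSE '***REDACTED***' END AS email
FROM {catalog}.{schema}.{table}
""".strip()

sql = masked_view_sql("main", "sales", "customers", "pii_readers")
# On a cluster: spark.sql(sql), then
# GRANT SELECT ON VIEW main.sales.customers_masked TO `analysts`;
```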
Leszek
by Contributor
  • 909 Views
  • 1 reply
  • 3 kudos

How to handle schema changes in streaming Delta Tables?

I'm using Structured Streaming to move data from one Delta table to another. How do I handle schema changes in those tables (e.g. adding a new column)?

Latest Reply
Murthy1
Contributor II
  • 3 kudos

Hello, I think one way of handling this is to provide the schema to the job through a schema file. The other way is to restart the job to infer the new schema automatically.

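The schema-file approach from the reply can be sketched in miniature: keep the stream's schema serialized as JSON and rebuild it on job (re)start, so adding a column becomes a config change plus a restart rather than a code change. This is a minimal sketch; in a real job you would pass the result of `StructType.fromJson(...)` to `spark.readStream.schema(...)`.

```python
# Minimal sketch of a "schema file" round trip (field names are illustrative).
import json

schema_json = json.dumps({
    "type": "struct",
    "fields": [
        {"name": "id", "type": "long", "nullable": False, "metadata": {}},
        {"name": "value", "type": "double", "nullable": True, "metadata": {}},
    ],
})

# In a real job: pyspark.sql.types.StructType.fromJson(json.loads(schema_json)),
# then spark.readStream.schema(...). Here we only show the round trip.
schema = json.loads(schema_json)
field_names = [f["name"] for f in schema["fields"]]
# field_names == ["id", "value"]
```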
KevSpally
by New Contributor
  • 1210 Views
  • 1 reply
  • 0 kudos

When accessing a view in Unity Catalog; access to underlying tables of the view is also needed.

My goal is to give users access to a view but not its underlying tables; I only want them to see specific columns and rows of the table. When I just grant SELECT permissions on the view, the user gets an error that they also need acc...

Latest Reply
jonathan_ruiz
New Contributor II
  • 0 kudos

I have exactly the same question, did anyone get the right answer?

chanansh
by Contributor
  • 1295 Views
  • 1 reply
  • 0 kudos

Resolved! autoloader documentation does not work

I am trying to follow the documentation here: https://learn.microsoft.com/en-us/azure/databricks/getting-started/etl-quick-start. My code looks like: (spark.readStream .format("cloudFiles") .option("header", "true") #.option("cloudFiles.partitio...

Latest Reply
Murthy1
Contributor II
  • 0 kudos

Hi, it seems like you are writing to a path which is not empty and has some non-Delta files. Also, can you confirm if the path mentioned in the error message "`s3://nbu-ml/projects/rca/msft/dsm09collectx/delta`" is the path you are writing t...

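The reply's diagnosis (a Delta write failing because the target path already holds non-Delta files) can be pre-checked. A directory is, locally at least, a Delta table when it contains a `_delta_log` subdirectory. This helper is a sketch for local paths only; cloud paths like `s3://...` would need `dbutils.fs` or boto3 instead of `os`:

```python
# Hedged sketch: classify a write target before a Delta write, per the reply's
# diagnosis. Local filesystem only.
import os
import tempfile

def delta_write_target_status(path: str) -> str:
    if not os.path.isdir(path) or not os.listdir(path):
        return "empty"        # safe to create a new Delta table
    if os.path.isdir(os.path.join(path, "_delta_log")):
        return "delta"        # existing Delta table: append/overwrite is fine
    return "non-delta"        # pre-existing non-Delta files: the write will fail

# Example with a scratch directory:
with tempfile.TemporaryDirectory() as d:
    status = delta_write_target_status(d)
# status == "empty"
```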
MichaelN1
by New Contributor II
  • 5655 Views
  • 6 replies
  • 8 kudos

How to change the width of the lines for Python Format in a notebook?

Hi, does anyone know how to change the default 80-character line length used when executing "Format Python Code" in a Databricks notebook? These days ultra-wide monitors are the new standard, and an 80-character line width that breaks to a new line is not usable.

Latest Reply
Rajeev_Basu
Contributor III
  • 8 kudos

It may not be possible at present, but it would be worth suggesting to Databricks as feedback.

5 More Replies
Rishabh-Pandey
by Esteemed Contributor
  • 8837 Views
  • 7 replies
  • 8 kudos

Resolved! connect databricks to teradata

Hey, I want to know whether we can connect Databricks to a Teradata database, and if yes, what the procedure is. Help would be appreciated.

Latest Reply
jose_gonzalez
Moderator
  • 8 kudos

Use the JDBC driver from here: https://docs.databricks.com/integrations/jdbc-odbc-bi.html

6 More Replies
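Concretely, the JDBC route means pointing Spark's generic JDBC reader at a Teradata URL. A hedged sketch of the options (host, database, table, and credentials are placeholders; check the Teradata JDBC driver documentation for the exact URL parameters, and install the driver jar on the cluster first):

```python
# Sketch of Spark JDBC options for Teradata. All values are placeholders.
jdbc_options = {
    "url": "jdbc:teradata://td-host.example.com/DATABASE=sales",
    "driver": "com.teradata.jdbc.TeraDriver",
    "dbtable": "sales.orders",
    "user": "svc_user",
    "password": "<secret>",   # prefer dbutils.secrets.get(...) over literals
}

# Usage on a cluster (not run here):
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```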
venkat-bodempud
by New Contributor III
  • 4039 Views
  • 4 replies
  • 7 kudos

Power BI - Databricks Integration using Service Principal

Hello Community, we are able to connect to Databricks (using a personal access token) from Power BI Desktop, and we are able to schedule a Databricks notebook via Data Factory every 10 minutes (as per our requirement). We want to avoid using the pe...

Latest Reply
Prabakar
Esteemed Contributor III
  • 7 kudos

You can generate a token for the service principal and use that. As a security best practice, when authenticating with automated tools, systems, scripts, and apps, Databricks recommends you use access tokens belonging to service principals inste...

3 More Replies
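For an Azure workspace, the service-principal token typically comes from the OAuth2 client-credentials flow against Azure AD. A hedged sketch that only builds the request payload (tenant and client values are placeholders; the scope GUID is the Azure Databricks resource application ID as commonly documented, so verify it against current Microsoft docs):

```python
# Hedged sketch: client-credentials token request payload for a service
# principal. No network call is made here; with `requests` you would POST it.
tenant_id = "<tenant-id>"
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
payload = {
    "grant_type": "client_credentials",
    "client_id": "<sp-client-id>",
    "client_secret": "<sp-client-secret>",
    "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
}
# With requests: requests.post(token_url, data=payload).json()["access_token"]
# Power BI can then authenticate to Databricks with this token instead of a PAT.
```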
chanansh
by Contributor
  • 1356 Views
  • 2 replies
  • 0 kudos

how to compute difference over time of a spark structure streaming?

I have a table with a timestamp column (t) and a list of columns for which I would like to compute the difference over time (v), by some key (k): v_diff(t) = v(t) - v(t-1) for each k independently. Normally I would write: lag_window = Window.partitionBy(C...

Latest Reply
chanansh
Contributor
  • 0 kudos

I found this but could not make it work: https://www.databricks.com/blog/2022/10/18/python-arbitrary-stateful-processing-structured-streaming.html

1 More Replies
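The per-key logic the post asks for, v_diff(t) = v(t) - v(t-1), can be shown in plain Python. In Structured Streaming, the `last` dict below is exactly the per-key state that arbitrary stateful processing (e.g. `applyInPandasWithState`) would carry between micro-batches; this is a sketch of the logic, not streaming code:

```python
# Pure-Python sketch of per-key difference over time; the dict plays the role
# of streaming state.
def diff_over_time(rows):
    """rows: iterable of (key, timestamp, value), assumed ordered by timestamp."""
    last = {}      # key -> previous value (the streaming "state")
    out = []
    for k, t, v in rows:
        prev = last.get(k)
        out.append((k, t, None if prev is None else v - prev))
        last[k] = v
    return out

rows = [("a", 1, 10.0), ("a", 2, 13.0), ("b", 2, 5.0), ("a", 3, 12.0)]
# diff_over_time(rows) == [("a", 1, None), ("a", 2, 3.0), ("b", 2, None), ("a", 3, -1.0)]
```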
Skesaram
by New Contributor II
  • 969 Views
  • 1 reply
  • 0 kudos

Need help to connect to local DB from Data bricks

jdbcHostname = "478"
jdbcPort = 1433
jdbcDatabase = "Onprem_AzureDB"
jdbcUsername = "upendra"
jdbcPassword = "upendrakumar"
jdbcDriver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};databaseName={jdbcDatabase};use...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, could you please verify the network connectivity from Databricks to the SQL Server? Please make sure the SQL Server port is allowed in your firewall rules or security groups.

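A sketch combining the post's URL construction with the reply's advice to verify connectivity first. Hostname and database below are placeholders; note the hostname must be a resolvable address, not a bare number like "478":

```python
# Hedged sketch: build the SQL Server JDBC URL and probe reachability before
# blaming the JDBC driver. Values are placeholders.
import socket

def build_sqlserver_url(host: str, port: int, database: str) -> str:
    return f"jdbc:sqlserver://{host}:{port};databaseName={database}"

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Cheap TCP reachability probe (firewall/security-group check)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

url = build_sqlserver_url("onprem-sql.example.com", 1433, "Onprem_AzureDB")
# url == "jdbc:sqlserver://onprem-sql.example.com:1433;databaseName=Onprem_AzureDB"
```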
lasmali
by New Contributor II
  • 1772 Views
  • 3 replies
  • 0 kudos

Instance Profile creation via the Databricks REST API returns "INVALID_PARAMETER_VALUE"

Problem statement: We need to create instance profiles via the Databricks REST API, but the endpoint returns an "INVALID_PARAMETER_VALUE" error with the message "Syntactically invalid AWS instance profile ARN", even when provided an appropriate ARN...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @lasse Lidegaard, hope all is well! Just wanted to check in if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
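A useful pre-check for that "Syntactically invalid AWS instance profile ARN" error is to validate the ARN shape locally before calling the REST API. The pattern below covers the common form `arn:<partition>:iam::<12-digit-account-id>:instance-profile/<name>`; it is a sketch, not the exact rule Databricks applies:

```python
# Hedged local validation of an instance profile ARN's shape.
import re

ARN_RE = re.compile(
    r"arn:(aws|aws-us-gov|aws-cn):iam::\d{12}:instance-profile/[\w+=,.@/-]+"
)

def looks_like_instance_profile_arn(arn: str) -> bool:
    return ARN_RE.fullmatch(arn) is not None

ok = looks_like_instance_profile_arn(
    "arn:aws:iam::123456789012:instance-profile/my-profile")
bad = looks_like_instance_profile_arn(
    "arn:aws:iam::123456789012:role/my-role")   # a role ARN, not a profile
# ok is True, bad is False
```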
mebinjoy
by New Contributor II
  • 3022 Views
  • 6 replies
  • 8 kudos

Resolved! Certificate not received.

I completed the Data Engineering Associate V3 certification this morning and have yet to receive my certificate. I received an email stating that I had passed and that the certificate would be mailed.

Latest Reply
Anonymous
Not applicable
  • 8 kudos

Hi @Mebin Joy, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly. Regards

5 More Replies
cmilligan
by Contributor II
  • 5746 Views
  • 6 replies
  • 1 kudos

Resolved! Reference a single item tuple using .format() in spark.sql()

I'm trying to pass the elements of a tuple into a SQL query using .format(). This works fine when I have multiple items in my tuple, but when using a single item in a tuple I get an error. tuple1 = (1,2,3) tuple2 = (5,) combo = tuple1 + tuple2 pri...

Latest Reply
Lakshay
Esteemed Contributor
  • 1 kudos

Could you please post the code and the error that you are getting?

5 More Replies
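The likely pitfall behind this post: `str()` of a one-element Python tuple keeps the trailing comma, which is invalid inside a SQL `IN (...)` list. A sketch of the failure and a join-based fix (table and column names are illustrative):

```python
# One-element tuples render with a trailing comma, which breaks SQL IN lists.
tuple1 = (1, 2, 3)
tuple2 = (5,)
combo = tuple1 + tuple2

assert str(tuple2) == "(5,)"          # "IN (5,)" would be a SQL syntax error

def sql_in_clause(values) -> str:
    """Render a tuple as a SQL IN list that is valid even for one element."""
    return "(" + ", ".join(str(v) for v in values) + ")"

query = "SELECT * FROM t WHERE id IN {}".format(sql_in_clause(tuple2))
# query == "SELECT * FROM t WHERE id IN (5)"
```

For real workloads, parameterized queries are safer than string formatting, since joined literals are vulnerable to injection when values come from user input.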
jerry747847
by New Contributor III
  • 5967 Views
  • 6 replies
  • 11 kudos

Resolved! When to increase maximum bound vs when to increase cluster size?

Hello experts, for the question below I am trying to understand why option C was selected instead of B, as B would also have resolved the issue. Question 40: A data analyst has noticed that their Databricks SQL queries are running too slowly. They claim ...

Latest Reply
JRL
New Contributor II
  • 11 kudos

On a SQL server there are wait states. Wait states occur when several processors (vCPUs) are processing and several threads are working through the processors. A longer-running thread that has dependencies can cause a thread that may have begun o...

5 More Replies
190809
by Contributor
  • 1110 Views
  • 2 replies
  • 1 kudos

Resolved! Loading tables to gold, one loads and the other two fail but same process.

Hi team, I am still fairly new to working with Delta tables. I have created a df by reading in data from existing silver tables in my lakehouse. I read the silver tables using SQL into a notebook, do some manipulation, unnest some fields, and then ...

Latest Reply
190809
Contributor
  • 1 kudos

Hi @Pravin Chaubey, thanks for responding. I discovered the issue: I had to load them as unmanaged tables but had previously not specified a path when calling .saveAsTable(), so the two tables that were failing to load were in fact managed tables ...

1 More Replies
