Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

venkat-bodempud
by New Contributor III
  • 4118 Views
  • 4 replies
  • 7 kudos

Power BI - Databricks Integration using Service Principal

Hello Community, we are able to connect to Databricks (using a personal access token) from Power BI Desktop, and we are able to schedule a Databricks notebook using Data Factory every 10 minutes (as per our requirement). We want to avoid using the pe...

Latest Reply
Prabakar
Esteemed Contributor III

You can generate a token for the service principal and use that. As a security best practice, when authenticating with automated tools, systems, scripts, and apps, Databricks recommends you use access tokens belonging to service principals inste...
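A minimal sketch (untested) of creating such a token via the Token Management API's on-behalf-of endpoint, as I understand that API; the workspace URL, admin token, and application ID below are placeholders, not values from this thread:

import requests

workspace_url = "https://<workspace-url>"                  # placeholder
admin_token = "<workspace-admin-access-token>"             # placeholder
sp_application_id = "<service-principal-application-id>"   # placeholder

# Create a Databricks access token on behalf of the service principal, then use
# that token in the Power BI Databricks connector instead of a personal one.
resp = requests.post(
    f"{workspace_url}/api/2.0/token-management/on-behalf-of/tokens",
    headers={"Authorization": f"Bearer {admin_token}"},
    json={
        "application_id": sp_application_id,
        "lifetime_seconds": 3600 * 24 * 90,   # example lifetime
        "comment": "Power BI refresh token",
    },
)
resp.raise_for_status()
print(resp.json()["token_value"])  # paste this token into the Power BI connection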

3 More Replies
chanansh
by Contributor
  • 1398 Views
  • 2 replies
  • 0 kudos

How to compute a difference over time in Spark Structured Streaming?

I have a table with a timestamp column (t) and a list of columns for which I would like to compute the difference over time (v), by some key (k): v_diff(t) = v(t) - v(t-1) for each k independently. Normally I would write: lag_window = Window.partitionBy(C...

Latest Reply
chanansh
Contributor

I found this but could not make it work: https://www.databricks.com/blog/2022/10/18/python-arbitrary-stateful-processing-structured-streaming.html
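For reference, a minimal sketch (untested) of the arbitrary stateful approach from that blog post, assuming Spark 3.4+/DBR 11.3+ and a streaming DataFrame stream_df with columns k, t, and v (all names are assumptions taken from the question):

import pandas as pd
from pyspark.sql.streaming.state import GroupState, GroupStateTimeout
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

# Output rows: key, timestamp, and the per-key difference v(t) - v(t-1).
output_schema = StructType([
    StructField("k", StringType()),
    StructField("t", TimestampType()),
    StructField("v_diff", DoubleType()),
])
# State kept per key across micro-batches: the last seen value of v.
state_schema = StructType([StructField("last_v", DoubleType())])

def diff_per_key(key, pdf_iter, state: GroupState):
    last_v = state.get[0] if state.exists else None
    for pdf in pdf_iter:
        pdf = pdf.sort_values("t")
        diffs = []
        for _, row in pdf.iterrows():
            diffs.append(None if last_v is None else row["v"] - last_v)
            last_v = row["v"]
        yield pd.DataFrame({"k": key[0], "t": pdf["t"].to_numpy(), "v_diff": diffs})
    state.update((last_v,))

diffed = (stream_df                     # assumed streaming DataFrame with columns k, t, v
          .groupBy("k")
          .applyInPandasWithState(diff_per_key, output_schema, state_schema,
                                  "append", GroupStateTimeout.NoTimeout))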

1 More Replies
Skesaram
by New Contributor II
  • 1002 Views
  • 1 reply
  • 0 kudos

Need help connecting to a local DB from Databricks

jdbcHostname="478"jdbcPort=1433jdbcDatabase="Onprem_AzureDB"jdbcUsername="upendra"jdbcPassword="upendrakumar"jdbcDriver="com.microsoft.sqlserver.jdbc.SQLServerDriver"jdbcUrl=f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};databaseName={jdbcDatabase};use...

Latest Reply
Debayan
Esteemed Contributor III

Hi, could you please verify the network connectivity from Databricks to the SQL Server? Please make sure the SQL Server host and port are allowed in your firewall rules or security groups.
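Once connectivity is confirmed (for example with %sh nc -vz <hostname> 1433 in a notebook cell), here is a minimal sketch (untested) of the JDBC read, reusing the variables from the question; the hostname "478" looks like a placeholder itself, and the table name below is hypothetical:

# Hedged sketch: read from the on-prem SQL Server over JDBC in a Databricks notebook.
jdbcUrl = (f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};"
           f"databaseName={jdbcDatabase}")

df = (spark.read
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", "dbo.some_table")   # hypothetical table name
      .option("user", jdbcUsername)
      .option("password", jdbcPassword)
      .option("driver", jdbcDriver)
      .load())
display(df)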

lasmali
by New Contributor II
  • 1840 Views
  • 3 replies
  • 0 kudos

Instance Profile creation via the Databricks REST API returns "INVALID_PARAMETER_VALUE"

# Problem Statement
We need to create Instance Profiles via the Databricks REST API, but the endpoint returns an "INVALID_PARAMETER_VALUE" error with a "Syntactically invalid AWS instance profile ARN" message even when provided an appropriate ARN...
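For context, a minimal sketch (untested) of the call in question against the /api/2.0/instance-profiles/add endpoint; the workspace URL and token are placeholders, and the ARN shows only the shape the API expects (arn:aws:iam::<account-id>:instance-profile/<name>):

import requests

workspace_url = "https://<workspace-url>"   # placeholder
token = "<access-token>"                     # placeholder
arn = "arn:aws:iam::123456789012:instance-profile/my-profile"  # example ARN shape only

# Register the instance profile with the workspace.
resp = requests.post(
    f"{workspace_url}/api/2.0/instance-profiles/add",
    headers={"Authorization": f"Bearer {token}"},
    json={"instance_profile_arn": arn},
)
print(resp.status_code, resp.text)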

Latest Reply
Anonymous
Not applicable

Hi @lasse Lidegaard, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
mebinjoy
by New Contributor II
  • 3102 Views
  • 6 replies
  • 8 kudos

Resolved! Certificate not received.

I completed the Data Engineering Associate V3 exam this morning and have yet to receive my certificate. I received an email stating that I had passed and that the certificate would be mailed.

Latest Reply
Anonymous
Not applicable

Hi @Mebin Joy, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly. Regards

5 More Replies
cmilligan
by Contributor II
  • 5934 Views
  • 6 replies
  • 1 kudos

Resolved! Reference a single item tuple using .format() in spark.sql()

I'm trying to pass the elements of a tuple into a sql query using .format(). This works fine when I have multiple items in my tuple, but when using a single item in a tuple I get an error.
tuple1 = (1,2,3)
tuple2 = (5,)
combo = tuple1 + tuple2
pri...
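The usual culprit (not confirmed in this thread) is that str() renders a one-element tuple with a trailing comma, e.g. "(5,)", which is invalid inside a SQL IN (...) clause. A minimal sketch (untested) of a formulation that works for any tuple length; some_table and id are hypothetical names:

tuple1 = (1, 2, 3)
tuple2 = (5,)
combo = tuple1 + tuple2

# Build the IN-list explicitly instead of relying on the tuple's repr,
# so a single element does not leave a trailing comma behind.
in_list = "({})".format(", ".join(str(x) for x in combo))
df = spark.sql("SELECT * FROM some_table WHERE id IN {}".format(in_list))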

Latest Reply
Lakshay
Esteemed Contributor

Could you please post the code and the error that you are getting?

5 More Replies
jerry747847
by New Contributor III
  • 6115 Views
  • 6 replies
  • 11 kudos

Resolved! When to increase maximum bound vs when to increase cluster size?

Hello experts, for the question below, I am trying to understand why option C was selected instead of B, as B would also have resolved the issue. Question 40: A data analyst has noticed that their Databricks SQL queries are running too slowly. They claim ...

Latest Reply
JRL
New Contributor II

On a SQL server, there are wait states. Wait states occur when several processors (vCPUs) are processing and several threads are working through the processors. A longer-running thread that has dependencies can cause the thread that may have begun o...

5 More Replies
190809
by Contributor
  • 1141 Views
  • 2 replies
  • 1 kudos

Resolved! Loading tables to gold: one loads and the other two fail, despite using the same process.

Hi team, I am still fairly new to working with delta tables. I have created a df by reading in data from existing silver tables in my lakehouse. I read the silver tables using SQL into a workbook, do some manipulation, unnest some fields, and then ...

Latest Reply
190809
Contributor

Hi @Pravin Chaubey, thanks for responding. I discovered the issue: I had to load them as unmanaged tables but had previously not specified a path when doing .saveAsTable(), and so those two tables that were failing to load were in fact managed tables ...
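For anyone hitting the same thing, a minimal sketch (untested) of writing an external (unmanaged) table by supplying an explicit path to .saveAsTable(); the path and table name below are placeholders:

# Supplying a path makes the table external rather than managed.
(df.write
   .format("delta")
   .mode("overwrite")
   .option("path", "abfss://gold@<storage-account>.dfs.core.windows.net/my_table")  # placeholder location
   .saveAsTable("gold.my_table"))  # placeholder table name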

1 More Replies
weldermartins
by Honored Contributor
  • 4205 Views
  • 2 replies
  • 1 kudos

Resolved! How to make spark-submit work on windows?

I have Jupyter Notebook installed on my machine, working normally. I tested running a Spark application with the spark-submit command, and it returns a message that the file was not found. What do you need to do to make it work? Below is a file ...

Latest Reply
Debayan
Esteemed Contributor III

Hi, this is not yet tested in my lab, but could you please check and confirm if this works: https://stackoverflow.com/questions/37861469/how-to-submit-spark-application-on-cmd

1 More Replies
sasidhar
by New Contributor II
  • 7635 Views
  • 4 replies
  • 8 kudos

Custom Python module not found while using dbx on PyCharm

I am new to Databricks and PySpark, and I am building a PySpark application using the PyCharm IDE. I have tested the code locally and want to run it on a Databricks cluster from the IDE itself. Following the dbx documentation, I am able to run a single Python file succes...

Latest Reply
Meghala
Valued Contributor II

I also got this error.

3 More Replies
najmead
by Contributor
  • 2043 Views
  • 2 replies
  • 0 kudos

Error Creating Primary Key Constraint

I am trying to add a primary key constraint to an existing table, and I get the following error: Cannot create or update table because the child column(s) `my_primary_key` of primary key `pk` cannot be set to nullable. Either drop the constraint, or c...
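The error suggests the key column is still nullable. A minimal sketch (untested) of the usual fix, with my_table as a placeholder name and the column/constraint names taken from the error message:

# Make the key column NOT NULL first (this requires that no existing rows
# contain NULLs), then add the primary key constraint.
spark.sql("ALTER TABLE my_table ALTER COLUMN my_primary_key SET NOT NULL")
spark.sql("ALTER TABLE my_table ADD CONSTRAINT pk PRIMARY KEY (my_primary_key)")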

Latest Reply
Debayan
Esteemed Contributor III

Hi, Could you please confirm if you are using the latest databricks-sql-connector ? (https://pypi.org/project/databricks-sql-connector/)

1 More Replies
Bhanu1
by New Contributor III
  • 1256 Views
  • 2 replies
  • 0 kudos

The new horizontal view of tasks *****. Can we please have the option for vertical view of a workflow?

The new horizontal view of tasks *****. Can we please have the option for vertical view of a workflow?

Latest Reply
Bhanu1
New Contributor III

Hi Debayan, this is how workflows used to look before. They are now shown from left to right instead of from top to bottom. It is a pain to scroll through a long workflow now, as mice don't have the capability to scroll left and right.

1 More Replies
data_explorer
by New Contributor II
  • 900 Views
  • 1 reply
  • 0 kudos

Is there any way to execute GRANT and REVOKE statements for a user on an object based on a condition?

SELECT if(
  (select count(*) from information_schema.table_privileges
     where grantee = 'samo@test.com' and table_schema = 'demo_schema' and table_catalog = 'demo_catalog') == 1,
  (select count(*) from demo_catalog.demo_schema.demo_table),
  (select count(*) from...

Latest Reply
Debayan
Esteemed Contributor III

Hi, GRANT and REVOKE assign or remove privileges on a securable object for a principal. A principal is a user, service principal, or group known to the metastore. Principals can be granted privileges and may own securable objects. Also, you can use REVOKE ON S...
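Since GRANT/REVOKE themselves take no condition, one option is to do the check in Python and issue the statement conditionally. A minimal sketch (untested), reusing the names from the question and assuming the current catalog exposes information_schema.table_privileges:

# Grant SELECT only if the user does not already hold a privilege on the table.
already_granted = spark.sql("""
    SELECT count(*) AS n
    FROM information_schema.table_privileges
    WHERE grantee = 'samo@test.com'
      AND table_schema = 'demo_schema'
      AND table_catalog = 'demo_catalog'
""").first()["n"] > 0

if not already_granted:
    spark.sql("GRANT SELECT ON TABLE demo_catalog.demo_schema.demo_table TO `samo@test.com`")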

SaravananPalani
by New Contributor II
  • 21955 Views
  • 8 replies
  • 9 kudos

Is there any way to monitor the CPU, disk and memory usage of a cluster while a job is running?

I am looking for something preferably similar to the Windows Task Manager, which we use for monitoring the CPU, memory, and disk usage on a local desktop.

Latest Reply
hitech88
New Contributor II

Some important info to look at in the Ganglia UI, in the CPU, memory, and server load charts, to spot the problem. CPU chart: User %, Idle %; a high user % indicates heavy CPU usage in the cluster. Memory chart: Use %, Free %, Swap %; if you see a purple line ove...

7 More Replies
najmead
by Contributor
  • 15521 Views
  • 6 replies
  • 13 kudos

How to convert string to datetime with correct timezone?

I have a field stored as a string in the format "12/30/2022 10:30:00 AM". If I use the function TO_DATE, I only get the date part... I want the full date and time. If I use the function TO_TIMESTAMP, I get the date and time, but it's assumed to be UTC, ...

Latest Reply
Rajeev_Basu
Contributor III

Use from_utc_timestamp(to_timestamp("<string>", <format>), <timezone>).
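Applied to the sample value from the question ("12/30/2022 10:30:00 AM"), a minimal sketch (untested); the source column name ts_str and the target timezone are assumptions:

from pyspark.sql import functions as F

# Parse the AM/PM string, then shift it from UTC into the desired zone.
df = df.withColumn(
    "ts_local",
    F.from_utc_timestamp(
        F.to_timestamp("ts_str", "MM/dd/yyyy hh:mm:ss a"),  # assumed source column
        "America/New_York",                                  # assumed target timezone
    ),
)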

5 More Replies
