Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Retko
by Contributor
  • 9381 Views
  • 4 replies
  • 2 kudos

Running a command often gets stuck on "Running Command..."

Hi, when running a command, it often gets stuck and the message below it says: "Running Command..." What can I do about it besides restarting the cluster? I also tried reattaching and clearing state, but that didn't help. Thanks

Latest Reply
Debayan
Databricks Employee
  • 2 kudos

Hi, do you see this while running a command in the notebook? Please tag @Debayan in your next comment, which will notify me. Thanks!

3 More Replies
DennisB
by New Contributor III
  • 5647 Views
  • 4 replies
  • 2 kudos

Resolved! Better Worker Node Core Utilisation

Hi everyone, hoping someone can help me with this problem. I have an embarrassingly parallel workload, which I'm parallelising over 4 worker nodes (of type Standard_F4, so 4 cores each). Each workload is single-threaded, so I believe that only one cor...

Latest Reply
DennisB
New Contributor III
  • 2 kudos

So I managed to get the 1-core-per-executor setup working successfully. The bit that wasn't working was spark.executor.memory -- this was too high, but lowering it so that the sum of the executors' memory was ~90% of the worker node's memory allowed it to w...
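For anyone reproducing this, here is a minimal sketch of the cluster-level Spark config being described; the memory figure is an assumption for a 4-core Standard_F4 worker and should be tuned to your node size:

# Hypothetical cluster "Spark config" for one core per executor on a Standard_F4 worker.
# Four 1-core executors per node; memory sized so their sum stays under ~90% of node RAM.
spark_conf = {
    "spark.executor.cores": "1",        # one task slot per executor
    "spark.executor.memory": "1500m",   # 4 executors x ~1.5g, an assumed figure for this node size
}

# Sketch of a cluster spec using these settings (field names follow the Databricks Clusters API):
cluster_spec = {
    "node_type_id": "Standard_F4",
    "num_workers": 4,
    "spark_conf": spark_conf,
}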

3 More Replies
Priyag1
by Honored Contributor II
  • 2998 Views
  • 2 replies
  • 11 kudos

Query parameters in dashboards

Query parameters in dashboards: Queries can optionally leverage parameters or static values. When a visualization based on a parameterized query is added to a dashboard, the visualization can either be configured to use a: Widget parameter. Widget paramet...

Latest Reply
Natalie_NL
New Contributor II
  • 11 kudos

Hi, I built a dashboard with dashboard parameters; it works pretty easily! The advantage of dashboard parameters is that you do not have to set a default (it can be: all). This is convenient when you need to filter on values that change every time the q...

1 More Replies
The_raj
by New Contributor
  • 5257 Views
  • 0 replies
  • 0 kudos

Error while reading file <file path>. [DEFAULT_FILE_NOT_FOUND]

Hi, I have a workflow with 5 notebooks in it. One of the notebooks is failing with the below error. I have tried refreshing the table but am still facing the same issue. When I try to run the notebook manually, it works fine. Can someone plea...

dvmentalmadess
by Valued Contributor
  • 2276 Views
  • 3 replies
  • 0 kudos

Ingestion Time Clustering on initial load

We are migrating our data into Databricks and I was looking at the recommendations for partitioning here: https://docs.databricks.com/tables/partitions.html. This recommends not specifying partitioning and allowing "Ingestion Time Partitioning" (ITP)...
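For comparison, a minimal sketch of the "no explicit partitioning" initial load the linked docs recommend; the source path and table name are made up:

# Initial load without partitionBy(), letting Databricks apply ingestion-time
# clustering rather than explicit date partitions (hypothetical names).
df = spark.read.parquet("/mnt/legacy/events")

(df.write
   .format("delta")
   .mode("overwrite")
   .saveAsTable("analytics.events"))

# The older explicit-partitioning pattern the docs advise against for smaller tables:
# df.write.format("delta").partitionBy("event_date").saveAsTable("analytics.events")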

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @dvmentalmadess, hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...

2 More Replies
NWIEFInance
by New Contributor
  • 1279 Views
  • 0 replies
  • 0 kudos

Connect to Excel

I'm having a hard time connecting to Excel. Any help connecting Databricks to Excel?
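As the thread has no replies, one common approach, sketched under assumptions, is to read the workbook with pandas (plus openpyxl) and convert it to a Spark DataFrame; the file path and sheet name are hypothetical:

# In a notebook cell, first: %pip install openpyxl
import pandas as pd

pdf = pd.read_excel("/dbfs/FileStore/uploads/finance.xlsx", sheet_name="Sheet1")  # hypothetical path
sdf = spark.createDataFrame(pdf)   # hand the data to Spark for further processing
display(sdf)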

RamozanbekS
by New Contributor III
  • 2468 Views
  • 1 replies
  • 0 kudos

Resolved! Databricks SQL Statement Execution API

I'm trying to follow the example provided here: https://github.com/databricks-demos/dbsql-rest-api/blob/main/python/external_links.py. It fails when it comes to downloading the data chunks. The statement status turns from SUCCEEDED to CLOSED right away ...

Latest Reply
RamozanbekS
New Contributor III
  • 0 kudos

It turns out that if the response is small and fits within the 16 MB limit, then the status check will also provide a single external link to download the data. So I need a condition here. Maybe even something like this: if len(chunks) == 1: external_url = respons...
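A fuller sketch of that condition, loosely following the linked demo script; the host, token, and statement ID are placeholders, and the exact JSON field names are assumptions based on the Statement Execution API:

import requests

HOST = "https://<workspace-host>"        # placeholder
TOKEN = "<personal-access-token>"        # placeholder
STATEMENT_ID = "<statement-id>"          # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

status = requests.get(f"{HOST}/api/2.0/sql/statements/{STATEMENT_ID}", headers=headers).json()
chunks = status["manifest"]["chunks"]

if len(chunks) == 1:
    # Small result: the status response already carries the single external link.
    urls = [link["external_link"] for link in status["result"]["external_links"]]
else:
    # Larger result: fetch each chunk's external link separately.
    urls = []
    for chunk in chunks:
        chunk_data = requests.get(
            f"{HOST}/api/2.0/sql/statements/{STATEMENT_ID}/result/chunks/{chunk['chunk_index']}",
            headers=headers,
        ).json()
        urls += [link["external_link"] for link in chunk_data["external_links"]]

print(urls)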

MadrasSenpai
by New Contributor II
  • 2310 Views
  • 2 replies
  • 2 kudos

How to install cmdstanpy in dbx cluster

I have built an HMC model using cmdstan. On my local machine, I installed cmdstan with the following approach: import cmdstanpy; cmdstanpy.install_cmdstan(). But in Databricks I need to reinstall it every time I train a new model, from the noteb...
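One way to avoid the repeated install, sketched under assumptions: compile CmdStan once into a persistent location (a DBFS path here, which is hypothetical) and point cmdstanpy at it on later runs.

import os
import cmdstanpy

CMDSTAN_DIR = "/dbfs/tools/cmdstan"   # hypothetical persistent location

if not os.path.isdir(CMDSTAN_DIR):
    # One-time build of CmdStan into the persistent directory.
    cmdstanpy.install_cmdstan(dir=CMDSTAN_DIR)

# On later runs, point cmdstanpy at the existing cmdstan-<version> folder.
version_dir = sorted(os.listdir(CMDSTAN_DIR))[-1]
cmdstanpy.set_cmdstan_path(os.path.join(CMDSTAN_DIR, version_dir))

Building directly onto DBFS can be slow; a cluster init script that installs CmdStan to local disk at startup is another option.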

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rajamannar Aanjaram Krishnamoorthy, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

1 More Replies
Asterol
by New Contributor III
  • 1766 Views
  • 1 replies
  • 1 kudos

Creating a test schema - what is the best practice?

Hey, I've created a schema with a few tables of historical data (prod); now I would like to have a dev/testing environment with exactly the same data. What do you recommend? CTAS? Shallow clone? Deep clone? I wonder if a shallow clone would be sufficien...

Data Engineering
Clone ctas
Latest Reply
Tharun-Kumar
Databricks Employee
  • 1 kudos

@Asterol If you would like to have the same data for your Dev/testing environment, I would recommend using Deep Clone. Deep clone copies the metadata and creates an independent copy of the table data. Shallow clone only copies the metadata and will h...
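A minimal sketch of that suggestion from a notebook, with made-up schema and table names:

# Deep clone: copies metadata and data, giving dev an independent copy of the prod table.
spark.sql("CREATE TABLE IF NOT EXISTS dev.sales_orders DEEP CLONE prod.sales_orders")

# A shallow clone (metadata only, data files still referenced from prod) would instead be:
# spark.sql("CREATE TABLE dev.sales_orders SHALLOW CLONE prod.sales_orders")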

NathanSundarara
by Contributor III
  • 2035 Views
  • 0 replies
  • 0 kudos

Sample code to read json from service bus queue in Azure

Hi, I'm looking for a sample notebook or code snippet to read messages from Azure Service Bus queues. I looked for documentation but couldn't find anything. Any help would be appreciated. First we are thinking of batch mode before we move on to streaming. P...

Data Engineering
azure
deltalivetable
messagequeue
servicebus
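Since the thread has no replies, here is a batch-mode sketch under assumptions, using the azure-servicebus Python package; the connection string, queue name, and target table are placeholders, and message bodies are assumed to be JSON.

# In a notebook cell, first: %pip install azure-servicebus
import json
from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<service-bus-connection-string>"   # placeholder
QUEUE_NAME = "<queue-name>"                          # placeholder

records = []
with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        # Pull up to 100 messages, waiting at most 5 seconds for the batch.
        for msg in receiver.receive_messages(max_message_count=100, max_wait_time=5):
            records.append(json.loads(str(msg)))     # body assumed to be JSON text
            receiver.complete_message(msg)           # remove the message from the queue

if records:
    spark.createDataFrame(records).write.format("delta").mode("append").saveAsTable("raw.servicebus_messages")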
Navashakthi
by New Contributor
  • 2728 Views
  • 4 replies
  • 2 kudos

Resolved! Community Edition Sign-up Issue

Hi, I'm trying to sign up for Community Edition for learning purposes. The sign-up page has an issue with selecting the country: the select dropdown doesn't work, and the continue option redirects to the same page. I couldn't complete signup. Kindly help!

Latest Reply
amitdas2k6
New Contributor II
  • 2 kudos

For me it always displays the below error even though I entered the correct username and password. My username: amit.das2k16@gmail.com. The error: "Invalid email address or password. Note: Emails/usernames are case-sensitive."

3 More Replies
Shadowsong27
by New Contributor III
  • 16410 Views
  • 11 replies
  • 4 kudos

Resolved! Mongo Spark Connector 3.0.1 seems not working with Databricks-Connect, but works fine in Databricks Cloud

On the latest DB-Connect==9.1.3 and dbr == 9.1, retrieving data from Mongo using the Maven coordinate of the Mongo Spark Connector, org.mongodb.spark:mongo-spark-connector_2.12:3.0.1 (https://docs.mongodb.com/spark-connector/current/), was working fine previously t...

Latest Reply
mehdi3x
New Contributor II
  • 4 kudos

Hi everyone, the solution for me was to replace spark.read.format("mongo") with spark.read.format("mongodb"). My Spark version is 3.3.2 and my MongoDB version is 6.0.6.
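For anyone hitting the same issue, a sketch of the corrected read with the newer (10.x) connector; the connection URI, database, and collection are placeholders:

df = (spark.read.format("mongodb")                     # "mongo" -> "mongodb" with the 10.x connector
      .option("connection.uri", "mongodb+srv://<user>:<password>@<cluster>/")  # placeholder
      .option("database", "mydb")                      # placeholder
      .option("collection", "mycollection")            # placeholder
      .load())
df.show()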

10 More Replies
erigaud
by Honored Contributor
  • 2429 Views
  • 4 replies
  • 1 kudos

Deploying existing queries and alerts to other workspaces

I have several queries and associated alerts in a workspace, and I would like to be able to deploy them to another workspace, for example a higher environment. Since neither queries nor alerts are supported in repos, what is the way to go to easi...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @erigaud, we haven't heard from you since the last response from @btafur, and I was checking back to see if her suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to others. Also, ...

3 More Replies
dprutean
by New Contributor III
  • 886 Views
  • 0 replies
  • 0 kudos

JDBC DatabaseMetaData.getCatalogs()

Calling DatabaseMetaData.getCatalogs() returns 'spark_catalog' instead of 'hive_metastore' when connected to a traditional Databricks cluster that does not have the uc_catalog tag. Please check this.

VD10
by New Contributor
  • 1612 Views
  • 1 replies
  • 0 kudos

Data Engineering Professional Certificate

I'm on the way to obtaining the certificate. Any preparation tips would be appreciated! Thanks!

Latest Reply
dplante
Contributor II
  • 0 kudos

Disclaimer - I haven't taken this exam yet. A couple of suggestions (from this forum, Google searches, etc.): check out this blog post - https://medium.com/@sjrusso/passing-the-databricks-professional-data-engineer-exam-115cccc90aba#:~:text=I%20recent...

