Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

v_sravan_sai
by New Contributor
  • 434 Views
  • 2 replies
  • 0 kudos

UnsupportedOperationException: com.databricks.backend.daemon.data.client.DBFSV1.resolvePathOnPhysicalStorage

dbutils.fs.mv('dbfs:/FileStore/tables/Employee-2.csv', 'dbfs:/FileStore/backup/Employee-5.csv', recurse=True) is giving the error: UnsupportedOperationException: com.databricks.backend.daemon.data.client.DBFSV1.resolvePathOnPhysicalStorage(path: Path)File <...
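The `recurse=True` flag only matters for directories, and the exception suggests `dbutils.fs.mv` is not supported against that DBFS path on this workspace. A common fallback is copy-then-delete. A minimal local-filesystem sketch of that idea (standard library only; the function name and paths are hypothetical):

```python
import shutil
from pathlib import Path

def move_via_copy(src: str, dst: str) -> None:
    """Fallback 'move': copy the file, verify the size, then delete the source."""
    Path(dst).parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)  # copy data and metadata
    if Path(dst).stat().st_size == Path(src).stat().st_size:
        Path(src).unlink()  # remove the source only after the size check passes
```

On Databricks itself, the equivalent would be `dbutils.fs.cp` followed by `dbutils.fs.rm` on the source path.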

Latest Reply
Rishabh_Tiwari
Community Manager
  • 0 kudos

Hi @v_sravan_sai , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your fe...

1 More Replies
Dorothy80Galvin
by New Contributor II
  • 730 Views
  • 2 replies
  • 1 kudos

How can I resolve QB Desktop Update Error 15225?

I'm encountering QB Desktop update error 15225. What could be causing this issue, and how can I resolve it? It's disrupting my workflow, and I need a quick fix.

Latest Reply
jameshardy602
New Contributor II
  • 1 kudos

Hi @Dorothy80Galvin, to resolve Desktop Update Error 15225, follow these steps. First, verify your Internet Explorer settings by ensuring it is set as the default browser and that SSL settings are enabled. Next, add trusted sites by navigating to Inte...

1 More Replies
paras11
by New Contributor III
  • 710 Views
  • 4 replies
  • 1 kudos

Databricks Data Engineer Associate exam suspended

Hi Team, I recently had a disappointing experience while attempting my first Databricks certification exam. During the exam, I was abruptly directed to Proctor Support. The proctor asked me to show my desk and the room I was in. I complied by showing...

@Cert-Bricks @Cert-Team @Cert-TeamOPS @Kaniz_Fatma
Latest Reply
paras11
New Contributor III
  • 1 kudos

 @Kaniz_Fatma @Cert-Team @Cert-Bricks Requesting you to please look into this and update me since it has not been resolved yet and I am not able to reschedule my exam. 

3 More Replies
TinaN
by New Contributor III
  • 859 Views
  • 3 replies
  • 3 kudos

Resolved! Extracting 'time' from a 'timestamp' datatype in Databricks

We are loading a data source to Databricks that contains columns with 'Time' datatype.  Databricks converts this to 'Timestamp', so I am researching for a way to extract time only. This is what I came up with, but the result isn't quite right.  Is th...

Latest Reply
szymon_dybczak
Contributor
  • 3 kudos

Hi @TinaN, I'll check it in the evening, but try the query below:

SELECT date_format(timestamp_column, 'HH:mm:ss') AS time_part
FROM your_table
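Outside of Spark SQL, the same "keep only the time portion" idea can be sketched in plain Python (illustrative only; the function name is hypothetical):

```python
from datetime import datetime

def time_part(ts: datetime) -> str:
    """Extract only the time-of-day from a timestamp, formatted as HH:MM:SS."""
    return ts.strftime("%H:%M:%S")

# e.g. time_part(datetime(2024, 7, 17, 9, 5, 30)) -> "09:05:30"
```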

2 More Replies
Prasad_Koneru
by New Contributor III
  • 678 Views
  • 4 replies
  • 0 kudos

Deployment of tables and views in Unity Catalog, and repo structuring for catalog objects

We want to create a CI/CD pipeline for deploying Unity Catalog objects in order to improve deployability. But how do we maintain a repository of tables, views, or any other objects created in the catalogs and schemas? Is this possible to do just l...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 0 kudos

Hi @Prasad_Koneru , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your f...

3 More Replies
icyapple
by New Contributor
  • 459 Views
  • 3 replies
  • 0 kudos

hive_metastore schema access control

We are trying to control access to schemas under hive_metastore, only allowing certain users to access the tables under a schema (via SQL, PySpark, and Python...). We have followed these steps on a test schema: 1. Enable workspace table access control. 2. Ru...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 0 kudos

Hi @icyapple , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedba...

2 More Replies
SHASHANK2
by New Contributor III
  • 616 Views
  • 3 replies
  • 1 kudos

cluster termination

Hello All, when I create an all-purpose cluster I get an idle timeout of 3 days, so my cluster only terminates after 3 days. I want my cluster to terminate after 60 minutes of idle time, and I want to set this globally so that in future any cluster created by...
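One way to enforce this for all future clusters is a cluster policy that pins the auto-termination value. A minimal sketch of such a policy definition, expressed here as a Python dict (the field names follow the Databricks cluster-policy schema as I understand it; verify against your workspace before applying):

```python
# Sketch of a cluster policy that forces a 60-minute auto-termination
# on every cluster created under it. The "fixed" type prevents users
# from overriding the value.
policy = {
    "autotermination_minutes": {
        "type": "fixed",
        "value": 60,
    }
}
```

The policy would then be created in the workspace (e.g. via the Cluster Policies UI or API) and assigned to the users or groups whose clusters it should govern.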

Latest Reply
Rishabh_Tiwari
Community Manager
  • 1 kudos

Hi @SHASHANK2 , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...

2 More Replies
alluarjun
by New Contributor
  • 419 Views
  • 2 replies
  • 0 kudos

Databricks asset bundle error: terraform.exe: file does not exist

Hi, I am getting the error below while deploying a Databricks bundle using an Azure DevOps release: 2024-07-07T03:55:51.1199594Z Error: terraform init: exec: "xxxx\\.databricks\\bundle\\dev\\terraform\\xxxx\\.databricks\\bundle\\dev\\bin\\terraform.exe": ...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 0 kudos

Hi @alluarjun , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...

1 More Replies
Yashodip
by New Contributor II
  • 440 Views
  • 2 replies
  • 0 kudos

My Databricks Professional Data Engineer exam was suspended, need help urgently (17/07/2024)

Hello Team, I had a terrible experience while attempting my Databricks Professional Data Engineer certification. The proctor abruptly asked me to show my desk; 30 minutes into the exam they asked multiple times, wasting my time, and then sus...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Yashodip, I'm sorry to hear your exam was suspended. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours to resolve. In the meantime, you can review the following documentation: Room requirements Behavio...

1 More Replies
elsirya
by New Contributor III
  • 481 Views
  • 2 replies
  • 2 kudos

Resolved! unit testing

Currently I am creating unit tests for our ETL scripts, but the test is not able to recognize sc (SparkContext). Is there a way to mock SparkContext for a unit test? Code being tested: df = spark.read.json(sc.parallelize([data])) Error message rec...

Latest Reply
elsirya
New Contributor III
  • 2 kudos

Was able to get this to work. What I had to do was instantiate the "sc" variable in the PySpark notebook. PySpark code: sc = spark.sparkContext. Then in the PyTest script we add a "@patch()" decorator with the "sc" variable and create a "mock_sc" variab...
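The mocking pattern described above can be sketched with the standard library's `unittest.mock`, with no Spark installation needed (the helper name and patch target here are hypothetical; the real test would patch `sc` wherever the ETL module imports it):

```python
from unittest.mock import MagicMock

def load_json(spark, sc, data):
    """ETL helper under test: parallelize the raw record, then read it as JSON."""
    return spark.read.json(sc.parallelize([data]))

# In the unit test, both spark and sc are mocks, so no cluster is required.
mock_sc = MagicMock()
mock_spark = MagicMock()
load_json(mock_spark, mock_sc, '{"a": 1}')

# The mocks record how they were called, which the test can assert on.
mock_sc.parallelize.assert_called_once_with(['{"a": 1}'])
```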

1 More Replies
aditi06
by New Contributor III
  • 290 Views
  • 2 replies
  • 1 kudos

Issue while launching the data engineer exam

I started my exam, but it said it had been suspended due to a technical issue, even though I had passed all prerequisites and system checks. I have already raised a ticket; please resolve this issue as early as possible. Ticket no.: #00504757 #reschedule #issue...

Latest Reply
aditi06
New Contributor III
  • 1 kudos

@Kaniz_Fatma, please resolve this issue as early as possible. I have already raised the ticket.

1 More Replies
Simon_T
by New Contributor III
  • 897 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Bundle Error

I am running this command: databricks bundle deploy --profile DAVE2_Dev --debug. And I am getting this error: 10:13:28 DEBUG open dir C:\Users\teffs.THEAA\OneDrive - AA Ltd\Databricks\my_project\dist: open C:\Users\teffs.THEAA\OneDrive - AA Ltd\Databr...

Latest Reply
Simon_T
New Contributor III
  • 1 kudos

So I found a link to a page that said that the databricks bundle command is expecting python3.exe instead of python.exe. So I took a copy of python.exe and renamed it to python3.exe and that seems to work. Thanks for investigating though.

2 More Replies
InquisitiveGeek
by New Contributor II
  • 386 Views
  • 3 replies
  • 0 kudos

How can I store my cell output as a text file on my local drive?

I want to store the output of my cell as a text file on my local hard drive. I'm getting JSON output, and I need that JSON on my local drive as a text file.
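Writing the JSON out programmatically is a one-liner with the standard library. Note the sketch below writes to the driver's filesystem, not your laptop; you would still download the file afterwards (the function name and path are hypothetical):

```python
import json

def save_json(obj, path: str) -> None:
    """Serialize a cell's result and write it to a text file."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(obj, f, indent=2)

# Usage: save_json({"status": "ok"}, "/tmp/output.json")
```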

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @InquisitiveGeek ,You can do this following below approach: https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-outputs#download-results

2 More Replies
himanmon
by New Contributor III
  • 391 Views
  • 2 replies
  • 1 kudos

Can I move a single file larger than 100 GB using dbutils.fs?

Hello. I have a file over 100 GB. Sometimes it is on the cluster's local path, and sometimes it's on the volume. And I want to send it to another path on the volume, or to an S3 bucket. dbutils.fs.cp('file:///tmp/test.txt', '/Volumes/catalog/schem...

Latest Reply
szymon_dybczak
Contributor
  • 1 kudos

Hi @himanmon, this is caused by S3's limit on part count: the parts of a multipart upload can be numbered only from 1 to 10,000. After setting spark.hadoop.fs.s3a.multipart.size to 104857600, did you RESTART the cluster? Because it'll only work when the clust...
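As a sanity check on those numbers: with the 10,000-part cap, a 100 MiB part size comfortably covers a 100 GiB file:

```python
# S3 multipart uploads allow at most 10,000 parts per object.
part_size = 100 * 1024 * 1024      # 104857600 bytes, the value set above
file_size = 100 * 1024**3          # a 100 GiB file
parts = -(-file_size // part_size) # ceiling division
# parts == 1024, well under the 10,000-part limit
```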

1 More Replies
yurib
by New Contributor III
  • 668 Views
  • 2 replies
  • 0 kudos

Resolved! Error creating token when creating databricks_mws_workspace resource on GCP

 resource "databricks_mws_workspaces" "this" { depends_on = [ databricks_mws_networks.this ] provider = databricks.account account_id = var.databricks_account_id workspace_name = "${local.prefix}-dbx-ws" location = var.google_region clou...

Latest Reply
yurib
New Contributor III
  • 0 kudos

My issue was caused by credentials in `~/.databrickscfg` (generated by the Databricks CLI) taking precedence over the creds set by `gcloud auth application-default login`. Google's application default creds should be used when using the databricks google...

1 More Replies
