Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

clock4eva
by New Contributor
  • 2787 Views
  • 2 replies
  • 2 kudos

Learning via Free Trial Azure

Hi, I am following a Databricks course on Udemy, and the course instructed us to access Databricks via the free trial of Azure. Once I created my account on Azure and loaded Databricks, I tried to create a cluster, but this never succeeds. It takes an extremel...

Latest Reply
RishabhTiwari07
Databricks Employee
  • 2 kudos

Hi @clock4eva , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...

Syleena23
by New Contributor
  • 3093 Views
  • 2 replies
  • 1 kudos

How to Optimize Delta Lake Performance for Large-Scale Data Ingestion?

Hi everyone, I'm currently working on a project that involves large-scale data ingestion into Delta Lake on Databricks. While the ingestion process is functioning, I've noticed performance bottlenecks, especially with increasing data volumes. Could yo...
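For anyone hitting the same bottleneck, the usual levers for ingestion-heavy Delta tables are file compaction and the auto-optimize table properties. A minimal sketch (the table name `events` and the ZORDER column are placeholders; on Databricks each string would be passed to `spark.sql`):

```python
# Illustrative Delta Lake maintenance commands for ingestion-heavy tables.
# "events" and "event_date" are placeholder names, not from the thread.
table = "events"

maintenance = [
    # Compact the many small files produced by frequent ingestion.
    f"OPTIMIZE {table} ZORDER BY (event_date)",
    # Have the writer produce fewer, larger files going forward.
    f"ALTER TABLE {table} SET TBLPROPERTIES ("
    "'delta.autoOptimize.optimizeWrite' = 'true', "
    "'delta.autoOptimize.autoCompact' = 'true')",
    # Clean up files no longer referenced (default retention applies).
    f"VACUUM {table}",
]

for stmt in maintenance:
    print(stmt)
```

This is a sketch of common practice, not a tuning guide; the right ZORDER columns and compaction cadence depend on the query patterns and ingestion rate.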

Latest Reply
RishabhTiwari07
Databricks Employee
  • 1 kudos

Hi @Syleena23 , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...

SainnathReddy_M
by New Contributor II
  • 1894 Views
  • 2 replies
  • 0 kudos

Monitoring Databricks

Good day all, has anyone integrated Databricks monitoring with third-party applications like Grafana or ELK to get infrastructure monitoring (cpu_utilization, memory, job monitoring)? I am able to write Spark code to get cpu_utili...
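One common bridge, assuming the metrics are already being collected somehow, is to expose them in Prometheus text exposition format, which Grafana can then chart via a Prometheus data source. A small illustrative sketch (metric names, values, and the cluster label are made up):

```python
# Sketch: format collected cluster metrics into Prometheus text
# exposition format for scraping; how the metrics are gathered
# (Spark code, agents, etc.) is out of scope here.
def to_prometheus(metrics: dict, labels: dict) -> str:
    label_str = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
    lines = [
        f"{name}{{{label_str}}} {value}"
        for name, value in sorted(metrics.items())
    ]
    return "\n".join(lines)

sample = to_prometheus(
    {"cpu_utilization": 0.72, "memory_used_bytes": 8500000000},
    {"cluster": "etl-prod"},
)
print(sample)
```

The formatting step is the easy part; the design choice that matters is whether metrics are pushed (e.g. to a gateway) or pulled by a scraper with network access to the cluster.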

Latest Reply
RishabhTiwari07
Databricks Employee
  • 0 kudos

Hi @SainnathReddy_M , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your...

Phani1
by Valued Contributor II
  • 3072 Views
  • 2 replies
  • 1 kudos

classic cluster vs serverless cost

Hi Team, can you help me with a cost comparison between classic clusters and serverless?
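The comparison usually comes down to one piece of arithmetic: classic compute bills DBUs plus the cloud provider's VM charges (including idle time), while serverless bills a higher all-in DBU rate with no separate VM line. A sketch with entirely illustrative rates (not published Databricks pricing; check your account's rate card):

```python
# Illustrative only: DBU rates and VM prices below are placeholders.
def job_cost(dbu_per_hour: float, dbu_rate: float, hours: float,
             vm_per_hour: float = 0.0) -> float:
    """Total cost = DBU charges plus (for classic) separate VM charges."""
    return (dbu_per_hour * dbu_rate + vm_per_hour) * hours

# Classic: pay DBUs *and* the VM, for all hours the cluster is up.
classic = job_cost(dbu_per_hour=4, dbu_rate=0.15, hours=10, vm_per_hour=0.50)
# Serverless: higher all-in DBU rate, but only for actual work time.
serverless = job_cost(dbu_per_hour=4, dbu_rate=0.35, hours=7)

print(classic, serverless)
```

The crossover depends heavily on idle time: classic clusters that sit idle keep accruing VM cost, which is where serverless often wins despite the higher rate.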

Latest Reply
RishabhTiwari07
Databricks Employee
  • 1 kudos

Hi @Phani1 , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback...

v_sravan_sai
by New Contributor
  • 1425 Views
  • 1 replies
  • 0 kudos

UnsupportedOperationException: com.databricks.backend.daemon.data.client.DBFSV1.resolvePathOnPhysica

dbutils.fs.mv('dbfs:/FileStore/tables/Employee-2.csv', 'dbfs:/FileStore/backup/Employee-5.csv', recurse=True) is giving the error UnsupportedOperationException: com.databricks.backend.daemon.data.client.DBFSV1.resolvePathOnPhysicalStorage(path: Path) File <...

Latest Reply
RishabhTiwari07
Databricks Employee
  • 0 kudos

Hi @v_sravan_sai , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your fe...

Dorothy80Galvin
by New Contributor II
  • 2475 Views
  • 2 replies
  • 1 kudos

How can I Resolve QB Desktop Update Error 15225?

I'm encountering QB Desktop update error 15225. What could be causing this issue, and how can I resolve it? It's disrupting my workflow, and I need a quick fix.

Latest Reply
jameshardy602
New Contributor II
  • 1 kudos

Hi @Dorothy80Galvin, to resolve Desktop Update Error 15225, follow these steps. First, verify your Internet Explorer settings by ensuring it is set as the default browser and that SSL settings are enabled. Next, add trusted sites by navigating to Inte...

paras11
by New Contributor III
  • 1856 Views
  • 3 replies
  • 1 kudos

Databricks data engineer associate exam Suspended

Hi Team, I recently had a disappointing experience while attempting my first Databricks certification exam. During the exam, I was abruptly directed to Proctor Support. The proctor asked me to show my desk and the room I was in. I complied by showing...

@Cert-Bricks @Cert-Team @Cert-TeamOPS @Kaniz_Fatma
Latest Reply
paras11
New Contributor III
  • 1 kudos

@Kaniz_Fatma @Cert-Team @Cert-Bricks Requesting you to please look into this and update me, since it has not been resolved yet and I am not able to reschedule my exam.

TinaN
by New Contributor III
  • 9994 Views
  • 3 replies
  • 3 kudos

Resolved! Extracting 'time' from a 'timestamp' datatype in Databricks

We are loading a data source to Databricks that contains columns with 'Time' datatype.  Databricks converts this to 'Timestamp', so I am researching for a way to extract time only. This is what I came up with, but the result isn't quite right.  Is th...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Hi @TinaN, I'll check it in the evening, but try the below: SELECT date_format(timestamp_column, 'HH:mm:ss') AS time_part FROM your_table
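The same time-only extraction can be verified locally with Python's datetime; Spark SQL's 'HH:mm:ss' pattern corresponds to '%H:%M:%S' here (the timestamp value is just an example):

```python
# Extract only the time portion of a timestamp, mirroring
# date_format(ts, 'HH:mm:ss') in Spark SQL.
from datetime import datetime

ts = datetime(2024, 7, 30, 14, 5, 9)
time_part = ts.strftime("%H:%M:%S")
print(time_part)  # → 14:05:09
```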

Prasad_Koneru
by New Contributor III
  • 2808 Views
  • 4 replies
  • 0 kudos

Deployment of tables and views in unity catalog and Repo structuring for catalogs objects.

We want to create a CI/CD pipeline for deploying Unity Catalog objects in order to enhance deployment ability. But how do we maintain a repository of tables, views, or any other objects created in the catalogs and schemas? Is this possible to do just l...

Latest Reply
RishabhTiwari07
Databricks Employee
  • 0 kudos

Hi @Prasad_Koneru , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your f...

icyapple
by New Contributor
  • 1879 Views
  • 3 replies
  • 0 kudos

hive_metastore schema access control

We are trying to control access to schemas under hive_metastore, only allowing certain users to access the tables under a schema (via SQL, PySpark, and Python...). We have followed these steps in a testing schema: 1. Enable workspace table access control 2. Ru...
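Once workspace table access control is on, schema-level restriction is typically a pair of grants to the allowed group, with everyone else getting no implicit access. A sketch of the statements involved (schema and group names below are placeholders, and each string would be run as SQL on the cluster):

```python
# Sketch of table-ACL grants restricting a hive_metastore schema
# to a single group; "finance_test" and "finance_readers" are
# placeholder names, not from the thread.
def schema_grants(schema: str, group: str) -> list:
    return [
        # Let the group see and use the schema...
        f"GRANT USAGE ON SCHEMA {schema} TO `{group}`",
        # ...and read its tables. Users outside the group get no
        # implicit access once table access control is enabled.
        f"GRANT SELECT ON SCHEMA {schema} TO `{group}`",
    ]

grants = schema_grants("finance_test", "finance_readers")
for g in grants:
    print(g)
```

Note this applies to clusters running in the table-ACL (shared) mode; single-user clusters can bypass hive_metastore table ACLs, which is worth checking if access still leaks through.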

Latest Reply
RishabhTiwari07
Databricks Employee
  • 0 kudos

Hi @icyapple , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedba...

SHASHANK2
by New Contributor III
  • 2123 Views
  • 3 replies
  • 1 kudos

cluster termination

Hello All, when I create an all-purpose cluster I get an idle timeout of 3 days, so my cluster terminates after 3 days. I want my cluster to terminate after 60 minutes of idle time, and I want to set this globally so that in future any cluster created by...
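One way to enforce this globally is a cluster policy that pins the auto-termination timeout, so every cluster created under the policy inherits it. A minimal sketch of the relevant policy fragment (verify the exact policy JSON against your workspace's cluster-policy documentation):

```json
{
  "autotermination_minutes": {
    "type": "fixed",
    "value": 60
  }
}
```

Users would then be required to create clusters under this policy (rather than with unrestricted cluster creation rights) for the limit to apply everywhere.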

Latest Reply
RishabhTiwari07
Databricks Employee
  • 1 kudos

Hi @SHASHANK2 , Thank you for reaching out to our community! We're here to help you.  To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...

Yashodip
by New Contributor II
  • 1325 Views
  • 1 replies
  • 0 kudos

My Databricks Professional Data Engineer exam has suspended , Need help Urgently (17/07/2024)

Hello Team, I had a frustrating experience while attempting my Databricks Professional Data Engineer certification. Abruptly, the proctor asked me to show my desk; after 30 minutes of the exam he/she asked multiple times, wasted my time, and then sus...

jaybi
by New Contributor III
  • 1599 Views
  • 2 replies
  • 1 kudos

Resolved! error in running the first command

  AssertionError: The Databricks Runtime is expected to be one of ['11.3.x-scala2.12', '11.3.x-photon-scala2.12', '11.3.x-cpu-ml-scala2.12'], found "15.3.x-cpu-ml-scala2.12". Please see the "Troubleshooting | Spark Version" section of the "Version In...

Latest Reply
jaybi
New Contributor III
  • 1 kudos

I got it resolved by changing the cluster's config.

elsirya
by New Contributor III
  • 3032 Views
  • 2 replies
  • 2 kudos

Resolved! unit testing

Currently I am creating unit tests for our ETL scripts, although the test is not able to recognize sc (SparkContext). Is there a way to mock SparkContext for a unit test? Code being tested: df = spark.read.json(sc.parallelize([data])) Error message rec...

Latest Reply
elsirya
New Contributor III
  • 2 kudos

Was able to get this to work. What I had to do was instantiate the "sc" variable in the PySpark notebook. PySpark code: "sc = spark.sparkContext". Then in the PyTest script we add a "@patch()" statement with the "sc" variable and create a "mock_sc" variab...
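The mocking approach described can be sketched with the standard library's unittest.mock, so the ETL function runs without a real cluster (the function and variable names below are illustrative, not from the thread):

```python
# Sketch: stand in for SparkSession/SparkContext with MagicMock so the
# code under test never touches a real Spark cluster.
from unittest.mock import MagicMock

def load_json(spark, sc, data):
    # The pattern under test, as in the question.
    return spark.read.json(sc.parallelize([data]))

spark = MagicMock()
sc = MagicMock()
fake_df = MagicMock(name="df")
spark.read.json.return_value = fake_df

result = load_json(spark, sc, {"id": 1})

# Verify the mocked calls happened as expected.
sc.parallelize.assert_called_once_with([{"id": 1}])
assert result is fake_df
```

This only checks call wiring, not Spark semantics; for tests of actual DataFrame behavior a local SparkSession (e.g. via pytest fixtures) is the usual complement.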

