Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

RahulChaubey
by New Contributor III
  • 2256 Views
  • 1 replies
  • 1 kudos

Resolved! Can we get SQL Serverless warehouses monitoring data using APIs or logs?

I am looking for a possible way to get the autoscaling history data for SQL Serverless Warehouses using an API or logs. I want something like what we see in the monitoring UI.

Latest Reply
artsheiko
Databricks Employee
  • 1 kudos

Hi Rahul, you need to perform two actions: enable the system tables schema named "compute" (see the how-to; take a look at the page, it's quite possible you'll find other schemas useful too), then explore the system.compute.warehouse_events table. Hope this helps. B...

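As a rough follow-on to the reply above, a minimal PySpark sketch of what querying that table can look like once the "compute" schema is enabled. The warehouse ID is a placeholder, and the event types and column names follow the documented warehouse_events schema, so verify them in your workspace.

```python
# Sketch only: pull a warehouse's autoscaling history from the system table.
# Assumes the "compute" system schema is enabled and you have SELECT on it.
warehouse_id = "1234abcd5678efgh"  # placeholder: copy the ID from the SQL warehouse UI or API

scaling_history = spark.sql(f"""
    SELECT event_time, event_type, cluster_count
    FROM system.compute.warehouse_events
    WHERE warehouse_id = '{warehouse_id}'
      AND event_type IN ('SCALED_UP', 'SCALED_DOWN')
    ORDER BY event_time DESC
""")

display(scaling_history)
```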
kp12
by New Contributor II
  • 12498 Views
  • 4 replies
  • 1 kudos

column "id" is of type uuid but expression is of type character varying.

Hello, I'm trying to write to an Azure PostgreSQL flexible server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...

Latest Reply
Student-Learn
New Contributor II
  • 1 kudos

Yes, this Stack Overflow post was my reference too, and adding the option below made the load go through with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified") https://stackoverflow.com/questions/409739...

3 More Replies
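Building on the fix reported in this thread, here is a hedged sketch of a write with the DBR "postgresql" connector plus the stringtype option. The host, database, table, and secret names are placeholders, and whether the option is required depends on your driver and table setup.

```python
# Sketch: write a DataFrame whose target column is uuid-typed in Postgres.
# stringtype=unspecified (the option reported in the accepted answer) lets the
# JDBC driver leave string literals untyped so Postgres can cast them to uuid.
(df.write
   .format("postgresql")
   .option("host", "my-server.postgres.database.azure.com")      # placeholder host
   .option("port", "5432")
   .option("database", "mydb")                                    # placeholder database
   .option("dbtable", "public.my_table")                          # placeholder target table
   .option("user", dbutils.secrets.get("scope", "pg-user"))       # placeholder secret
   .option("password", dbutils.secrets.get("scope", "pg-pass"))   # placeholder secret
   .option("stringtype", "unspecified")                           # the thread's fix
   .mode("append")
   .save())
```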
kankotan
by New Contributor II
  • 2578 Views
  • 1 replies
  • 3 kudos

Regional Group Request for Istanbul

Hello, I kindly request the formation of a regional group for Istanbul/Turkey. I would appreciate your assistance in this matter. Thank you, Can

Get Started Discussions
Istanbul
Request
Turkey
Latest Reply
Sujitha
Databricks Employee
  • 3 kudos

@kankotan Happy to help set it up for you. I have dropped an email for more information! 

NhanNguyen
by Contributor III
  • 2219 Views
  • 2 replies
  • 0 kudos

Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class

I'm executing a notebook and it failed with this error. Sometimes, when I execute some functions in Spark, they also fail with the error 'this class is not whitelisted'. Could anyone help me check on this? Thanks for your help!

Latest Reply
NhanNguyen
Contributor III
  • 0 kudos

Thanks for your feedback. Actually, my cluster was a shared cluster; after I changed to a single-user cluster I could run that method.

1 More Replies
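For context, a minimal sketch of the call behind this error, under the thread's diagnosis: on a cluster in Shared access mode the internal Catalog API is not allowlisted for Unity Catalog, while the same call runs on a single-user (assigned) cluster.

```python
# On a Shared access mode cluster this raises
# "Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on class ..."
# because Unity Catalog blocks non-allowlisted internal Spark APIs there.
# The same call succeeds on a single-user (assigned) cluster, as noted in the reply above.
spark.catalog.clearCache()
```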
EDDatabricks
by Contributor
  • 3881 Views
  • 0 replies
  • 0 kudos

Lakeview dashboard dynamically change filter values

Greetings everyone, We are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of “Select * from test table where timestamp between :Days start and :Days end”. There is also a filter applie...

Get Started Discussions
dashboard
filters
Lakeview
parameters
jx1226
by New Contributor III
  • 6567 Views
  • 1 replies
  • 0 kudos

How to access storage with private endpoint

We know that Databricks with VNET injection (our own VNET) allows us to connect to Blob Storage/ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...

Latest Reply
IvoDB
New Contributor II
  • 0 kudos

Hey @jx1226, were you able to solve this at the customer? I am currently struggling with the same issues here.

dm7
by New Contributor II
  • 1686 Views
  • 0 replies
  • 0 kudos

DLT CDC/SCD - Taking the latest ID per day

Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1, taking the latest record using a datetime column, which works with no issues: @dlt.view def users(): return spark.readStream.table("source_table") dlt.create_streaming_table(...

nikhilmb
by New Contributor II
  • 3738 Views
  • 3 replies
  • 0 kudos

Error ingesting zip files: ExecutorLostFailure Reason: Command exited with code 50

Hi, We are trying to ingest zip files into the Azure Databricks delta lake using the COPY INTO command. There are 100+ zip files with an average size of ~300MB each. Cluster configuration: 1 driver: 56GB, 16 cores; 2-8 workers: 32GB, 8 cores (each). Autoscaling enab...

Latest Reply
nikhilmb
New Contributor II
  • 0 kudos

Although we were able to copy the zip files onto the DB volume, we were not able to share them with any system outside of the Databricks environment. Guess delta sharing does not support sharing files that are on UC volumes.

2 More Replies
FelipeRegis
by New Contributor II
  • 2515 Views
  • 0 replies
  • 1 kudos

Not able to access data registered in Unity Catalog using Simba ODBC driver

Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using Simba ODBC driver version 2.8.0.1002. They mentioned ...

sharma_kamal
by New Contributor III
  • 7262 Views
  • 3 replies
  • 4 kudos

Resolved! Unable to read available file after downloading

After downloading a file using `wget`, I'm attempting to read it with spark.read.json. I am getting the error: PATH_NOT_FOUND - Path does not exist: dbfs:/tmp/data.json. SQLSTATE: 42K03 (File <command-3327713714752929>, line 2). I have checked the file does exist...

Get Started Discussions
dbfs
json
ls
wget
Latest Reply
Ayushi_Suthar
Databricks Employee
  • 4 kudos

Hi @sharma_kamal, good day! Could you please try the code below suggested by @ThomazRossito; it will help you. Also, please refer to the following document to work with files on Databricks: https://docs.databricks.com/en/files/index.html Please l...

2 More Replies
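Since the code referenced in the reply isn't shown in this excerpt, here is a generic hedged sketch of the usual cause and workaround, not necessarily the exact code @ThomazRossito suggested: wget saves to the driver's local filesystem rather than DBFS, so the file has to be copied before the dbfs:/ path will resolve. The download URL is a placeholder.

```python
import subprocess

# wget writes to the driver's local disk (file:/tmp/data.json), not to DBFS,
# which is why dbfs:/tmp/data.json appears missing. Placeholder URL below.
subprocess.run(
    ["wget", "-O", "/tmp/data.json", "https://example.com/data.json"],
    check=True,
)

# Copy the local file into DBFS; the original dbfs:/ path then works.
dbutils.fs.cp("file:/tmp/data.json", "dbfs:/tmp/data.json")
df = spark.read.json("dbfs:/tmp/data.json")
df.printSchema()
```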
Raja_fawadAhmed
by New Contributor
  • 1000 Views
  • 0 replies
  • 0 kudos

Databricks job compute price w.r.t. running time

I have two workflows (jobs) in Databricks (AWS) with the cluster specs below (job cluster, NOT general purpose): Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete; Job 2 takes 50 min to complete. Questions: DBU cost is sam...

Chrispy
by New Contributor
  • 4654 Views
  • 0 replies
  • 0 kudos

Spark Handling White Space as NULL

I have a very strange thing happening. I'm importing a CSV file and nulls and blanks are being interpreted correctly. What is strange is that a column that regularly has a single space character value is having the single space converted to null. I'...

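As a hedged diagnostic sketch (not a confirmed fix), these are the CSV reader options that most often interact with whitespace-only values and are worth setting explicitly; the file path and "suspect_column" are placeholders.

```python
# Make whitespace and null handling explicit, then inspect how the
# space-only value actually lands. Whether this resolves the behavior
# described above depends on the file and runtime.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .option("ignoreLeadingWhiteSpace", "false")   # do not trim leading spaces
      .option("ignoreTrailingWhiteSpace", "false")  # do not trim trailing spaces
      .option("nullValue", "\\N")                   # reserve an explicit token for NULL
      .option("emptyValue", "")                     # keep empty strings as empty, not null
      .load("dbfs:/path/to/file.csv"))              # placeholder path

df.select("suspect_column").distinct().show(truncate=False)  # placeholder column name
```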
