Data Engineering

Forum Posts

Pragati_17
by Visitor
  • 2 Views
  • 0 replies
  • 0 kudos

Setting Date Range Picker to some default date

I have a date range picker filter in a Databricks Lakeview dashboard. When I open the dashboard, no date is selected, and I want to set a default date. Is that possible with Lakeview dashboard filters?

Data Engineering
databricks lakeview dashboard
date range picker filter
default date set
Gilg
by Contributor
  • 9 Views
  • 0 replies
  • 0 kudos

Autoloader - File Notification mode

Hi all, I have set up a DLT pipeline that is using Autoloader in file notification mode. Everything runs smoothly the first time. However, it seems like the next micro-batch did not trigger, as I can see some events coming in the queue. But if I lo...
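A minimal sketch of the Auto Loader options involved, assuming a JSON source; the path, format, and trigger cap below are placeholders, not the poster's actual configuration:

```python
# Hypothetical sketch: Auto Loader options for file notification mode.
# All values here are placeholder assumptions; adjust for your pipeline.
autoloader_options = {
    "cloudFiles.format": "json",
    # File notification mode instead of directory listing:
    "cloudFiles.useNotifications": "true",
    # Optional: cap how many files each micro-batch picks up.
    "cloudFiles.maxFilesPerTrigger": "1000",
}

# Inside a DLT pipeline this would typically be used roughly like this
# (kept as a comment, since it needs a Databricks runtime to run):
# @dlt.table
# def bronze():
#     return (
#         spark.readStream.format("cloudFiles")
#         .options(**autoloader_options)
#         .load("s3://my-bucket/landing/")  # placeholder path
#     )
```

If micro-batches stop triggering while events keep arriving in the queue, comparing the queue's event delivery with the stream's checkpoint progress is usually the first diagnostic step.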

pranathisg97
by New Contributor III
  • 632 Views
  • 3 replies
  • 1 kudos

Resolved! Control query caching using SQL statement execution API

I want to execute this statement using the Databricks SQL Statement Execution API: curl -X POST -H 'Authorization: Bearer <access-token>' -H 'Content-Type: application/json' -d '{"warehouse_id": "<warehouse_id>", "statement": "set us...
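A sketch of the JSON payload the curl command above sends; the `<warehouse_id>` placeholder is kept as in the post, and the exact `SET` statement is an assumption based on the truncated excerpt:

```python
import json

# Hypothetical sketch of the request body for
# POST /api/2.0/sql/statements/ (SQL Statement Execution API).
payload = {
    "warehouse_id": "<warehouse_id>",  # placeholder, as in the post
    # Note: each API call executes a single statement in its own
    # execution context, so a SET issued in one call does not
    # necessarily carry over to a subsequent call.
    "statement": "SET use_cached_result = false",
}

body = json.dumps(payload)
```

The session-scoping caveat in the comment is why controlling caching this way tends to be the crux of the question rather than the curl syntax itself.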

Latest Reply
TimFrazer
New Contributor II
  • 1 kudos

Did you ever find a solution to this problem?

2 More Replies
Brichj
by New Contributor
  • 52 Views
  • 2 replies
  • 0 kudos

%run ../Includes/Classroom-Setup-02.1

I ran the code in the cell as it was given in the presentation, but it failed. Can someone please help? The presentation is the second lesson in the second module of the Data Engineering Associate exam prep.

Latest Reply
Brichj
New Contributor
  • 0 kudos

Thanks Ajay-Pandey! This is the error that I keep getting when I run the following: %run ./Includes/Classroom-Setup-02.3L I have run dbutils.library.restartPython(), but it did not help. Note: you may need to restart the kernel using dbutils.library.restart...

1 More Replies
Brad
by New Contributor III
  • 216 Views
  • 4 replies
  • 0 kudos

Inconsistent behavior when displaying chart in notebook

Hi, I'm trying to create some 3D charts. With the same code and same cluster, sometimes the chart shows and sometimes it doesn't. Previously it would not display, but last week I opened a notebook with a failed run and found the result could be shown by itself (as ...

Latest Reply
Brad
New Contributor III
  • 0 kudos

Also, with the same code and same browser but different workspaces, one works and the other does not. In the notebook with "script error", if I "Export cell", get its iframe HTML, and use displayHTML to display it, it works, so this means the JS and HTML inside is o...

3 More Replies
Sampath_Kumar
by New Contributor
  • 79 Views
  • 2 replies
  • 1 kudos

Volume Limitations

I have a use case to create a table from JSON files. There are 36 million files in the upstream (S3 bucket). I just created a volume on top of it, so the volume has 36M files. I'm trying to form a data frame by reading this volume using the below sp...
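One common way to keep a read over tens of millions of files tractable is to supply an explicit schema (avoiding inference over the whole volume) and to split the read into per-prefix batches. A small sketch of the batching idea; the volume path and one-character prefix layout are assumptions for illustration:

```python
# Hypothetical helper: split a large volume read into per-prefix batches
# so no single job has to list all 36M files at once. The base path and
# prefix scheme below are illustrative assumptions.
def prefix_globs(base_path, prefixes):
    """Return one glob pattern per prefix, e.g. .../0*.json."""
    return [f"{base_path.rstrip('/')}/{p}*.json" for p in prefixes]

batches = prefix_globs("/Volumes/main/default/raw_json", ["0", "1", "2"])

# Each batch would then be read with an explicit schema, skipping
# inference (kept as a comment, since it needs a Spark session):
# df = spark.read.schema(my_schema).json(batches[0])
```

Whether batching is needed at all depends on the driver's listing performance against the volume; the explicit schema is the cheaper, more universally applicable half of the suggestion.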

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Sampath_Kumar, Let’s delve into the limitations and best practices related to Databricks volumes. Volume Limitations: Managed Volumes: These are Unity Catalog-governed storage volumes created within the default storage location of the contain...

1 More Replies
cpd
by Visitor
  • 30 Views
  • 0 replies
  • 0 kudos

Ingesting geospatial data into a table

I'm just getting started with Databricks and wondering if it is possible to ingest a GeoJSON or GeoParquet file into a new table without writing code? My goal here is to load vector data into a table and perform H3 polyfill operations on all the vect...

xhead
by New Contributor II
  • 1700 Views
  • 3 replies
  • 0 kudos

Resolved! Does "databricks bundle deploy" clean up old files?

I'm looking at this page (Databricks Asset Bundles development work tasks) in the Databricks documentation. When repo assets are deployed to a Databricks workspace, it is not clear if "databricks bundle deploy" will remove files from the target wo...

Data Engineering
bundle
cli
deploy
Latest Reply
xhead
New Contributor II
  • 0 kudos

One further question: the purpose of “databricks bundle destroy” is to remove all previously-deployed jobs, pipelines, and artifacts that are defined in the bundle configuration files. Which bundle configuration files? The ones in the repo? Or are ther...

2 More Replies
pokus
by New Contributor III
  • 2083 Views
  • 3 replies
  • 2 kudos

Resolved! use DeltaLog class in databricks cluster

I need to use the DeltaLog class in my code to get the AddFiles dataset. I have to keep the implemented code in a repo and run it on a Databricks cluster. Some docs say to use the org.apache.spark.sql.delta.DeltaLog class, but it seems Databricks gets rid of ...

Latest Reply
dbal
New Contributor
  • 2 kudos

Thanks for providing a solution @pokus. What I don't understand is why Databricks cannot provide the DeltaLog at runtime. How can this be the official solution? We need a better solution for this instead of depending on reflection.

2 More Replies
VGS777
by New Contributor II
  • 43 Views
  • 0 replies
  • 0 kudos

Regarding Cloning dbc file from git

Hey folks, I have a .dbc file in a git repo and cloned it in Databricks. When I tried to open the .dbc file, it says ```Failed to load file. The file encoding is not supported```. Can anyone please advise me on this? #help #beginner

RobsonNLPT
by Contributor
  • 110 Views
  • 3 replies
  • 0 kudos

Resolved! scala-xml : how to move child to another parent node

Hi all, the mandatory rowTag for writing to XML doesn't make sense in my case, as I have the complete nested dataframe schema. I need to implement an extra step to remove that extra node (default: Row) after XML generation. I need some examples ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @RobsonNLPT, Working with XML in Scala using the scala-xml library can be powerful and flexible. Let’s break down your requirements and provide an example of how to achieve this. Removing the “Row” Node: When converting a DataFrame to XML, th...

2 More Replies
LoiNguyen
by New Contributor II
  • 9451 Views
  • 5 replies
  • 2 kudos

The authentication type 10 is not supported

I use the below code to connect to PostgreSQL. df = spark.read.jdbc("jdbc:postgresql://hostname:5432/dbname", "schema.table", properties={"user": "user", "password": "password"}).load() df.printSchema() However, I got the ...
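Two observations, hedged since the full error is truncated: "authentication type 10 is not supported" typically means the PostgreSQL JDBC driver on the cluster is too old to speak SCRAM-SHA-256 (upgrading to a recent org.postgresql:postgresql 42.2.x+ driver usually resolves it), and `spark.read.jdbc(...)` already returns a DataFrame, so the chained `.load()` is a separate bug. A sketch of the corrected pieces; the URL and credentials are the placeholders from the post:

```python
# Hypothetical sketch: connection settings assuming a SCRAM-capable
# PostgreSQL JDBC driver (42.2.x+) is attached to the cluster.
jdbc_url = "jdbc:postgresql://hostname:5432/dbname"  # placeholder from the post
props = {
    "user": "user",          # placeholders from the post
    "password": "password",
    "driver": "org.postgresql.Driver",
}

# spark.read.jdbc(...) returns a DataFrame directly, so no .load()
# (kept as a comment, since it needs a Spark session):
# df = spark.read.jdbc(jdbc_url, "schema.table", properties=props)
# df.printSchema()
```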

Latest Reply
simboss
New Contributor II
  • 2 kudos

But how are we going to do this for those who use Windows?

4 More Replies
satishnavik
by Visitor
  • 62 Views
  • 1 reply
  • 0 kudos

How to connect Databricks Database with Springboot application using JPA

Facing an issue with integrating our Spring Boot JPA application with Databricks. Below are the steps and settings we did for the integration. When we start the Spring Boot application, we get a warning: HikariPool-1 - Driver doe...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @satishnavik, It seems you’re encountering issues while integrating your Spring Boot JPA application with Databricks. Let’s address the warnings and exceptions you’re facing. Warning: Driver Does Not Support Network Timeout for Connections The...

  • 0 kudos
Hertz
by New Contributor
  • 48 Views
  • 1 reply
  • 0 kudos

Structured Streaming Event in Audit Logs

I am trying to monitor when a table is created or updated using the audit logs. I have found that Structured Streaming writes/appends are not captured in the audit logs. Am I missing something? Shouldn't this be captured as a Unity Catalog event? Eith...

Data Engineering
Audit Logs
structured streaming
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Hertz, Monitoring table creation and updates using audit logs is essential for maintaining data governance and security. Let’s explore this further. Databricks, being a cloud-native platform, provides audit logs that allow administrators to t...
