Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

seydouHR
by New Contributor III
  • 4674 Views
  • 3 replies
  • 0 kudos

Resolved! CLONE not supported on delta table with Liquid Clustering

Hello all, We are building a data warehouse on Unity Catalog, and we use the SHALLOW CLONE command to let folks spin up their own dev environments by light-copying the prod tables. We also started using Liquid Clustering on our feature tables, tho...

Latest Reply
seydouHR
New Contributor III
  • 0 kudos

Thanks Kaniz for your reply. I was able to make it work using runtime 14.0. Regards,

2 More Replies
caldempsey
by New Contributor
  • 3658 Views
  • 0 replies
  • 0 kudos

Delta Lake Spark fails to write _delta_log via a Notebook without granting the Notebook data access

I have set up a Jupyter Notebook with PySpark connected to a Spark cluster, where the Spark instance is intended to perform writes to a Delta table. I'm observing that the Spark instance fails to complete the writes if the Jupyter Notebook doesn't have ...

Data Engineering
deltalake
Docker
spark
chrisf_sts
by New Contributor II
  • 2035 Views
  • 0 replies
  • 0 kudos

Can I generate a uuid4 column when I do a COPY INTO command?

I have raw call log data, and the logs don't have a unique id number, so I generate a uuid4 number when I load them using Spark. Now I want to save the records to a table and run a COPY INTO command every day to ingest new records. I am only appendi...

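No reply landed on this thread, but one common pattern (unverified here; check the COPY INTO reference for your runtime) is that COPY INTO accepts a SELECT over the source files, so an expression such as uuid() can be added there. The sketch below only illustrates the uuid4-per-record idea from the question in plain Python; the record fields are hypothetical.

```python
import uuid

# Hypothetical raw call-log records: the source has no unique id column.
rows = [
    {"caller": "alice", "duration_s": 42},
    {"caller": "bob", "duration_s": 7},
]

# Attach a uuid4 surrogate key to each record before appending it to the table.
for row in rows:
    row["id"] = str(uuid.uuid4())

print(len({row["id"] for row in rows}))  # 2 distinct ids
```

Because uuid4 values are random, records generated on different days won't collide, which suits an append-only daily ingest.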
n-riesco
by New Contributor
  • 34021 Views
  • 5 replies
  • 1 kudos

How can I view an exported DBC notebook in my computer?

Is it possible to convert to or export as a .ipynb notebook?

Latest Reply
AlexV
New Contributor II
  • 1 kudos

You can rename somefile.dbc to somefile.zip and open it with the Windows File Explorer; however, the .python files cannot be opened in VS Code or PyCharm.

4 More Replies
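Building on the rename-to-.zip tip above: since a .dbc export is just a zip archive, its contents can also be listed and extracted programmatically with Python's zipfile module, no rename needed. A minimal sketch; the archive here is a stand-in built in memory rather than a real export.

```python
import io
import zipfile

# A .dbc export is a zip archive, so zipfile can read it without renaming.
# Build a stand-in archive in memory instead of reading a real export file.
archive = io.BytesIO()
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("notebook.python", "print('hello')")

# For a real export, pass the file path: zipfile.ZipFile("somefile.dbc")
with zipfile.ZipFile(archive) as dbc:
    names = dbc.namelist()

print(names)  # ['notebook.python']
```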
SureshKumarDV
by New Contributor II
  • 2230 Views
  • 2 replies
  • 1 kudos

Not able to find the DBAcademyDLT policy to create the DLT Pipeline

Hi, greetings of the day! I am preparing for the Databricks Engineering Associate Certification and following the Databricks Academy V3 course. As part of this course, I am trying to create the DLT pipelines but couldn't see the DBAcademyDLT policy...

Latest Reply
ManyPixels
New Contributor II
  • 1 kudos

I had to go to Admin Settings > Advanced (Workspace settings) and enable "Cluster, Pool and Jobs Access Control". Afterwards I was able to create an instance pool.

1 More Replies
dataguru
by New Contributor II
  • 2421 Views
  • 1 reply
  • 0 kudos

Hello, unable to start using the notebook

I get this error when I run the Databricks training notebook: %run ../Includes/Classroom-Setup-01 Resetting the learning environment: | dropping the catalog "***_53sh_da"...(0 seconds) Skipping install of existing datasets to "dbfs:/mnt/dbacademy-da...

Latest Reply
nferran
New Contributor II
  • 0 kudos

Did you manage to fix it? I have the same problem.

NT911
by New Contributor II
  • 1241 Views
  • 0 replies
  • 0 kudos

How to reduce file size in Sedona output

I have shape files with polygon/geometry info. I am exporting the file after Sedona integration with Kepler. The output file is in .html, and I want to reduce the file size. Please suggest if any option is available.

data-warriors
by New Contributor
  • 1260 Views
  • 0 replies
  • 0 kudos

Databricks workspace deletion: recovery

Hi Team, I accidentally deleted our Databricks workspace, which had all our artefacts and control plane and was the primary resource for our team's working environment. Could anyone please help on priority regarding the recovery/restoration mechanis...

Mist3
by New Contributor II
  • 7340 Views
  • 9 replies
  • 4 kudos

Dashboard API - Create a dashboard object doesn't work

I am trying to copy a dashboard object from one workspace to another using the API. I am using Get dashboard objects (/api/2.0/preview/sql/dashboards, GET method), then Retrieve a definition (/api/2.0/preview/sql/dashboards/{dashboard_id}, GET method) and ...

Latest Reply
eason_gao_db
Databricks Employee
  • 4 kudos

Hi @markusk, unfortunately the issue you're running into is a limitation of the outgoing SQL dashboard APIs. You can create blank dashboards, but you cannot programmatically insert existing definitions. The good news is we're currently running a prev...

8 More Replies
SJR
by New Contributor III
  • 9839 Views
  • 4 replies
  • 2 kudos

Resolved! Problem when updating Databricks Repo through DevOps Pipeline

Hello all! I've been working on integrating a Databricks Repos update API call into a DevOps Pipeline so that the Databricks local repo stays up to date with the remote staging branch (the Pipeline executes whenever there's a new commit into the staging br...

Data Engineering
CICD
Data_Engineering
DevOps
pipelines
repo
Latest Reply
SJR
New Contributor III
  • 2 kudos

@BookerE1 I found it! There was already another thread related to this problem, and someone else helped me find the solution (the problem was the pool I was using for the pipeline). This is the link to the other thread: https://community.databricks.co...

3 More Replies
costi9992
by New Contributor III
  • 2059 Views
  • 2 replies
  • 0 kudos

Pipeline API documentation issue

In the List Pipelines API documentation, the response is specified as: statuses (array of objects), the list of events matching the request criteria; and next_page_token (string), if present, a token to fetch the next page of events. But if we retri...

Latest Reply
arpit
Databricks Employee
  • 0 kudos

@costi9992 This has been documented now: https://docs.databricks.com/api/workspace/jobs/listruns

1 More Replies
Hal
by New Contributor II
  • 1833 Views
  • 1 reply
  • 3 kudos

Connecting Power BI on Azure to Databricks on AWS?

Can someone share with me the proper way to connect Power BI running on Azure to Databricks running on AWS?

Latest Reply
bhanadi
New Contributor II
  • 3 kudos

I have the same question. Do we have to take care of any specific tasks to make it work? Has anyone implemented it?

srjchoubey2
by New Contributor
  • 5230 Views
  • 1 reply
  • 0 kudos

How to import excel files xls/xlsx file into Databricks python notebook?

Method 1: using the "com.crealytics.spark.excel" package; how do I import the package? Method 2: using pandas; I tried the possible paths, but it shows "file not found", and while uploading the xls/xlsx file it doesn't show options for importing the dataframe. Help ...

Data Engineering
excel
import
pyspark
python
Latest Reply
vishwanath_1
New Contributor III
  • 0 kudos

import pandas as pd
ExcelData = pd.read_excel("/dbfs" + FilePath, sheet_name=sheetName)  # make sure you prepend /dbfs to FilePath

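The reply above relies on the /dbfs FUSE mount, which exposes DBFS paths to local-file APIs such as pandas. A small sketch of that path mapping; the helper name is hypothetical, not a Databricks API.

```python
# Hypothetical helper: map a dbfs:/ path to its /dbfs FUSE-mount equivalent,
# so local-file APIs such as pandas.read_excel can open the file directly.
def dbfs_local_path(path: str) -> str:
    if path.startswith("dbfs:"):
        path = path[len("dbfs:"):]          # dbfs:/mnt/x -> /mnt/x
    if not path.startswith("/dbfs/"):
        path = "/dbfs" + path               # /mnt/x -> /dbfs/mnt/x
    return path

print(dbfs_local_path("dbfs:/mnt/data/calls.xlsx"))  # /dbfs/mnt/data/calls.xlsx
```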
