Data Engineering

Forum Posts

Karene
by New Contributor
  • 13 Views
  • 1 reply
  • 0 kudos

Databricks Connection to Redash

Hello, I am trying to connect my Redash account with Databricks so that my organization can run queries on the data in Unity Catalog from Redash. I followed the steps in the documentation and managed to connect successfully. However, I am only ...
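
A minimal sketch for verifying the same access path outside Redash, assuming the databricks-sql-connector package and a SQL warehouse; the hostname, HTTP path, and token below are hypothetical placeholders and should be the same values Redash is configured with:

# pip install databricks-sql-connector
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # hypothetical
    http_path="/sql/1.0/warehouses/abc123def456",                  # hypothetical
    access_token="dapiXXXXXXXXXXXX",                               # hypothetical
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SHOW CATALOGS")   # what Unity Catalog exposes to this principal
        print(cursor.fetchall())

If the catalogs are visible here but not in Redash, the gap is more likely in the Redash data source settings than in Unity Catalog grants.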

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Karene, Thanks for reaching out! We'll look into this and get back to you with an answer shortly. Thanks for your patience!

AnkithP
by Visitor
  • 7 Views
  • 0 replies
  • 0 kudos

Infer schema eliminating leading zeros.

Upon reading a CSV file with schema inference enabled, I've noticed that a column originally designated as string datatype contains numeric values with leading zeros. However, upon reading the data into a PySpark DataFrame, it undergoes automatic conver...
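
A minimal sketch of the usual workaround, assuming a hypothetical file path and column names: declare the zero-padded column explicitly as a string instead of relying on schema inference, which casts it to a numeric type and drops the leading zeros.

from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Explicit schema: the zero-padded column stays a string ("00123" is preserved).
schema = StructType([
    StructField("account_code", StringType(), True),
    StructField("amount", IntegerType(), True),
])

df = (spark.read
        .option("header", "true")
        .schema(schema)                 # no inferSchema, so no silent numeric cast
        .csv("/path/to/input.csv"))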

Mutharasu
by New Contributor II
  • 1131 Views
  • 6 replies
  • 6 kudos

SAP Business Object(BO) Integration with Databricks

Hi Team, we are doing an analysis on connecting SAP Business Objects with Databricks to build a report on top of the data in the data lakehouse. In our current architecture we have Delta tables on top of S3 storage. Please let us know any connectors/d...

Latest Reply
bharat4880
Visitor
  • 6 kudos

Hi @HB83, may I know which version of BO you are using? We have a similar requirement.

5 More Replies
Ravikumashi
by Contributor
  • 287 Views
  • 3 replies
  • 0 kudos

Issue with Azure Databricks workspace after we disable public network access

Hi All, we had Azure Databricks workspaces created through Terraform with public network access enabled, and everything was working great. Recently we disabled public network access and started to face issues. Terraform is unable to add us...

Latest Reply
Ravikumashi
Contributor
  • 0 kudos

We use the following code to create the private endpoint, and in the UI we can see the private endpoint connection status as approved.

resource "azurerm_private_endpoint" "example" {
  name     = "example-endpoint"
  location = azurerm_re...

2 More Replies
Anske
by New Contributor
  • 394 Views
  • 1 reply
  • 0 kudos

One-time backfill for DLT streaming table before apply_changes

Hi, absolute Databricks noob here, but I'm trying to set up a DLT pipeline that processes CDC records from an external SQL Server instance to create a mirrored table in my Databricks delta lakehouse. For this, I need to do some initial one-time backfi...

Data Engineering
Delta Live Tables
Latest Reply
Anske
New Contributor
  • 0 kudos

So, since nobody responded, I decided to try my own suggestion and hack the snapshot data into the table that gathers the change data capture. After some straying I ended up with the notebook as attached. The notebook first creates 2 DLT tables (lookup...
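
A minimal sketch of that pattern, with hypothetical table, key, and sequence column names: the one-time snapshot rows are assumed to already sit in the CDC staging table with a sequence value lower than any real CDC record, so apply_changes replays them before the live changes.

import dlt
from pyspark.sql import functions as F

# View over the table that gathers the CDC records (backfilled snapshot rows included).
@dlt.view(name="customer_cdc_feed")
def customer_cdc_feed():
    return spark.readStream.table("cdc_staging.customer_changes")

# Target streaming table mirroring the SQL Server source.
dlt.create_streaming_table("customer_mirror")

dlt.apply_changes(
    target="customer_mirror",
    source="customer_cdc_feed",
    keys=["customer_id"],
    sequence_by=F.col("sys_change_version"),
    apply_as_deletes=F.expr("operation = 'D'"),
)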

dollyb
by New Contributor III
  • 56 Views
  • 0 replies
  • 0 kudos

Differences between Spark SQL and Databricks

Hello, I'm using a local Docker Spark 3.5 runtime to test my Databricks Connect code. However, I've come across a couple of cases where my code works in one environment but not the other. Concrete example: I'm reading data from BigQuery via spark....
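
For reference, a minimal sketch of a BigQuery read of this kind, with a hypothetical project/dataset/table; it assumes the spark-bigquery connector is available on the cluster, which is bundled in Databricks Runtime but has to be added explicitly to a plain Spark 3.5 Docker image, a common source of behaviour differences between the two environments.

df = (spark.read
        .format("bigquery")
        .option("table", "my-project.my_dataset.my_table")   # hypothetical table
        .load())

df.printSchema()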

kiko_roy
by New Contributor III
  • 1426 Views
  • 4 replies
  • 3 kudos

Resolved! Permission error loading DataFrame from Azure Unity Catalog to GCS bucket

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the df or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. On running: df1.write.for...
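
A minimal sketch of the write itself, with a hypothetical Unity Catalog table and bucket name; it assumes the cluster's Spark config already carries the GCP service-account credentials and that the service account has write access to the bucket.

# Hypothetical source table and destination bucket.
df1 = spark.table("main.sales.orders")

(df1.write
     .format("delta")
     .mode("overwrite")
     .save("gs://my-gcs-bucket/exports/orders"))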

Data Engineering
GCS bucket
permission error
Latest Reply
ruloweb
Visitor
  • 3 kudos

Hi, is there any Terraform resource to apply this GRANT, or does this always have to be done manually?

3 More Replies
cubanDataDude
by Visitor
  • 91 Views
  • 1 reply
  • 1 kudos

Job Claiming NotebookNotFound Incorrectly (seemingly)

I have the code captured in the screenshot below. When I run this individually it works just fine; when a JOB runs this it fails out with 'ResourceNotFound' - not sure what the issue is... - Checked the 'main' branch, which is where this job is pulling f...

Latest Reply
cubanDataDude
  • 1 kudos

Figured it out:

ecw_staging_nb_List = ['nb_UPSERT_stg_ecw_insurance', 'nb_UPSERT_stg_ecw_facilitygroups']

Works just fine.

Dom1
by Visitor
  • 46 Views
  • 0 replies
  • 0 kudos

Show log4j messages in run output

Hi, I have an issue when running JAR jobs. I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j is only ...

[screenshot attached: Dom1_0-1713189014582.png]
jp_allard
by Visitor
  • 31 Views
  • 0 replies
  • 0 kudos

Selective Overwrite to a Unity Catalog Table

I have been able to perform a selective overwrite using replaceWhere to a hive_metastore table, but when I use the same code for the same table in a Unity Catalog, no data is written. Has anyone else had this issue, or are there common mistakes that ar...
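
For comparison, a minimal sketch of the selective-overwrite pattern against a Unity Catalog table, with a hypothetical three-level table name, partition column, and date range:

(df.write
    .format("delta")
    .mode("overwrite")
    .option("replaceWhere", "event_date >= '2024-03-01' AND event_date < '2024-04-01'")
    .saveAsTable("main.analytics.events"))

Only rows matching the replaceWhere predicate are replaced; rows outside the range are left untouched.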

dannythermadom
by New Contributor III
  • 2217 Views
  • 6 replies
  • 7 kudos

Dbutils.notebook.run command not working with /Repos/

I have two GitHub repos configured in the Databricks Repos folder. repo_1 is run using a job, and repo_2 is run/called from repo_1 using the dbutils.notebook.run command: dbutils.notebook.run("/Repos/repo_2/notebooks/notebook", 0, args). I am getting the follo...

Latest Reply
cubanDataDude
  • 7 kudos

I am having a similar issue...

ecw_staging_nb_List = ['/Workspace/Repos/PRIMARY/UVVC_DATABRICKS_EDW/silver/nb_UPSERT_stg_ecw_insurance',
                       '/Repos/PRIMARY/UVVC_DATABRICKS_EDW/silver/nb_UPSERT_stg_ecw_facilitygroups']

Adding workspace d...
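
A minimal sketch of two path forms worth trying, with hypothetical repo and notebook names; a path relative to the calling notebook (when both notebooks live in the same repo) or a fully qualified /Workspace/Repos/... path avoids the ambiguity of a bare /Repos/ prefix when run from a job.

args = {"run_date": "2024-04-15"}   # hypothetical parameters

# Path relative to the calling notebook (same repo).
dbutils.notebook.run("./notebook", 0, args)

# Fully qualified workspace path to a notebook in another repo.
dbutils.notebook.run("/Workspace/Repos/repo_2/notebooks/notebook", 0, args)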

5 More Replies