Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

maddan80
by New Contributor II
  • 1824 Views
  • 4 replies
  • 3 kudos

Oracle Essbase connectivity

Team, I wanted to understand the best way of connecting to Oracle Essbase to ingest data into Delta Lake.

Latest Reply
BigRoux
Databricks Employee
  • 3 kudos

I would start by looking at Oracle DataDirect ODBC. It is optimized for Oracle and it supports Essbase. I believe this driver is included with Essbase. Hope this helps. Lou.
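
A minimal sketch of that ODBC route from a Databricks notebook, assuming a DataDirect (or other Essbase-capable) ODBC driver is installed on the cluster and registered under a DSN; the DSN name, credentials, query, and target table below are hypothetical placeholders:

# Read from Essbase over ODBC, then land the result in Delta Lake.
import pyodbc
import pandas as pd

conn = pyodbc.connect("DSN=Essbase;UID=<user>;PWD=<password>")  # hypothetical DSN
pdf = pd.read_sql("SELECT * FROM sales_cube", conn)             # hypothetical query
conn.close()

spark.createDataFrame(pdf).write.format("delta").mode("append").saveAsTable("bronze.essbase_sales")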

3 More Replies
Fraip
by New Contributor
  • 2590 Views
  • 1 reply
  • 0 kudos

Unable to read files from or write to an external S3 location (Databricks Free Trial)

Hi! I'm trying the Databricks free trial and I tried to link it to an S3 bucket I set up, but I get errors related to serverless policies and unauthorized access whether I try to read or write to S3. I have no problem just listing the files that exi...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

Your error may be caused by serverless network policy restrictions and/or missing S3 permissions. In the free trial, you cannot use your own S3 buckets with serverless compute. For full access, use a paid workspace and configure both network policy a...
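
For reference, a minimal sketch of the paid-workspace setup the reply describes, run from a notebook with Unity Catalog privileges; the credential, location, and bucket names are hypothetical:

# Bind an IAM-role storage credential to the bucket, then read through it.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS my_s3_location
    URL 's3://my-bucket/data/'
    WITH (STORAGE CREDENTIAL my_iam_role_credential)
""")

df = spark.read.parquet("s3://my-bucket/data/")
df.show()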

BalaRamesh
by New Contributor II
  • 753 Views
  • 3 replies
  • 0 kudos

Delta Live Tables - if there is no target schema defined, where will live tables be created?

Currently I am working with Delta Live Tables. One of my ex-team members designed the job and left the target schema empty in the destination settings (Settings --> Destination --> Target Schema). Where will Delta Live Tables create the tables if it is empt...

(screenshot attached)
Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@BalaRamesh If you have a catalog specified, there will be a storage location for it, and you will see the MV created there. Refer to this doc to understand storage locations: https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/m...
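
A minimal sketch of how to check that location yourself; the catalog and schema names are placeholders:

# The "Location" row shows the managed storage root where the pipeline's
# materialized views land when no explicit target path is configured.
spark.sql("DESCRIBE SCHEMA EXTENDED my_catalog.my_schema").show(truncate=False)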

2 More Replies
Garrus990
by New Contributor II
  • 1263 Views
  • 2 replies
  • 1 kudos

How to run a Python task that uses Click for CLI operations

Hey, in my application I am using Click to facilitate CLI operations. It works locally, in notebooks, and when scripts are run locally, but it fails in Databricks. I defined a task that, as an entrypoint, accepts the file where the click-decorated functio...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

The SystemExit issue you’re seeing is typical with Click, as it’s designed for standalone CLI applications and automatically calls sys.exit() after running a command. This behavior can trigger SystemExit exceptions in non-CLI environments, like Datab...
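
A minimal sketch of the usual workaround, assuming a hypothetical click-decorated entrypoint named cli: invoking it with standalone_mode=False makes Click return a result instead of calling sys.exit():

import click

@click.command()
@click.option("--name", default="world")
def cli(name):
    print(f"hello {name}")

# standalone_mode=False stops Click from calling sys.exit() afterwards,
# so the command can run inside a Databricks task without raising SystemExit.
cli.main(args=["--name", "databricks"], standalone_mode=False)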

1 More Replies
eyalo
by New Contributor II
  • 1858 Views
  • 1 reply
  • 0 kudos

Ingest from FTP server doesn't work

Hi, I am trying to connect to my FTP server and store the files in a dataframe with the following code:
%pip install ftputil
from ftputil import FTPHost
Host = "92.118.67.49"
Login = "StrideNBM-DF_BO"
Passwd = "Sdf123456"
ftp_dir = "/dwh-reports/"
with FTPHost(...
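
The excerpt is cut off, but a minimal sketch of the usual ftputil pattern looks like this: download to the driver's local disk, then read with Spark (host, credentials, and paths are placeholders):

import os
from ftputil import FTPHost

local_dir = "/tmp/dwh-reports/"
os.makedirs(local_dir, exist_ok=True)

with FTPHost("ftp.example.com", "<user>", "<password>") as ftp_host:
    for name in ftp_host.listdir("/dwh-reports/"):
        remote = "/dwh-reports/" + name
        if ftp_host.path.isfile(remote):
            ftp_host.download(remote, local_dir + name)  # copy to local disk first

df = spark.read.option("header", True).csv("file:" + local_dir)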

Latest Reply
Kayla
Valued Contributor II
  • 0 kudos

I'm afraid I don't have an answer, and I know this is an old post, but if you haven't already, I'd recommend changing the password if that is/was a genuine password.

Sudheerreddy25
by New Contributor II
  • 5954 Views
  • 8 replies
  • 1 kudos

Resolved! Regarding exam suspended in the middle without any reason.

Hi Team, My Databricks Certified Data Engineer Associate (Version 3) exam got suspended on 25th August and it is in an in-progress state. I was continuously in front of the camera when suddenly an alert appeared, and the support person asked me to show the...

Latest Reply
Sneha2
New Contributor II
  • 1 kudos

Hi Team, During the exam I was asked to show my room from all four sides, which I did promptly. There was no one else in my room, no background noise, and no inappropriate behavior or activity of any kind. Despite my compliance, my exam was unexpecte...

7 More Replies
alsetr
by New Contributor III
  • 510 Views
  • 1 reply
  • 0 kudos

Disable Databricks-generated error messages

Since Databricks Runtime 12.2, Databricks has wrapped Spark exceptions in its own exceptions: https://learn.microsoft.com/en-us/azure/databricks/error-messages/ While this might be handy for some users, it is not convenient for our team, as we canno...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Databricks does not use vanilla Spark. They added optimizations like AQE, Unity Catalog, etc., so looking for the error in the Spark source code will not always work (though in a lot of cases it will).
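
If matching on message text is the pain point, a minimal sketch of a more stable alternative, assuming PySpark 3.4+ where exceptions expose an error class:

from pyspark.errors import AnalysisException

try:
    spark.sql("SELECT * FROM nonexistent_table")
except AnalysisException as e:
    # Match on the stable error class rather than the wrapped message text.
    print(e.getErrorClass())          # e.g. "TABLE_OR_VIEW_NOT_FOUND"
    print(e.getMessageParameters())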

deano2025
by New Contributor II
  • 820 Views
  • 2 replies
  • 0 kudos

Resolved! How to create an external location that accesses a public s3 bucket

Hi, I'm trying to create an external location that accesses a public S3 bucket (for open data). However, I'm not having any success. I'm confused about what to specify as the storage credential (IAM role), since it's a public bucket that is out of my contr...

Latest Reply
deano2025
New Contributor II
  • 0 kudos

Thanks @Isi. Now that you've explained external locations, I think it does indeed make sense that they are probably unnecessary in this case. Thanks for clarifying!
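
For completeness, a minimal sketch of reading a public bucket directly, assuming classic (non-serverless) compute where per-bucket Hadoop settings are allowed; the bucket name is a placeholder:

# Use anonymous credentials for this one public bucket only.
spark.conf.set(
    "fs.s3a.bucket.my-public-bucket.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider",
)
df = spark.read.parquet("s3a://my-public-bucket/open-data/")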

1 More Replies
carlos_tasayco
by Contributor
  • 534 Views
  • 1 reply
  • 0 kudos

Materializing tables in custom schemas is not supported.

Hello, I have been reading this: https://www.databricks.com/blog/publish-multiple-catalogs-and-schemas-single-dlt-pipeline DLT pipelines now support multiple schemas; however, it is not working in my case. Did I do something wrong? Thanks in advance.

(screenshots attached)
Latest Reply
MauricioS
New Contributor III
  • 0 kudos

Hi Carlos, hope you are doing well. Did you get any update on this issue? I'm currently running into the same problem.

Anish_2
by New Contributor II
  • 713 Views
  • 2 replies
  • 0 kudos

Delta Live Tables - ignore updates on some columns

Hello Team, I have a scenario where, in apply_changes, I want to ignore updates on one column. Is there any way we can achieve this in Delta Live Tables?

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @Anish_2, yes, you can do that. Here is the doc link: https://docs.databricks.com/aws/en/dlt/cdc?language=Python For Python you can simply add an except_column_list argument like this:
dlt.apply_changes( target = "target", source = "users...
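
A fuller sketch of that pattern, with hypothetical table, key, and column names:

import dlt
from pyspark.sql.functions import col

dlt.create_streaming_table("target")

dlt.apply_changes(
    target="target",
    source="users_cdc",
    keys=["user_id"],
    sequence_by=col("sequence_num"),
    except_column_list=["last_seen_at"],  # this column is excluded from the target
)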

1 More Replies
hims_2021
by New Contributor
  • 399 Views
  • 1 reply
  • 0 kudos

Unable to export object using /api/2.0/workspace/export API

Hi, I was using the /api/2.0/workspace/export API in a Power Automate workflow to export to Excel from Databricks to SharePoint. This functionality was working fine until yesterday. From today it is throwing the below error while calling the API: Action 'HTTP_...

Latest Reply
lingareddy_Alva
Honored Contributor III
  • 0 kudos

@hims_2021 This error indicates an encoding issue when trying to export an Excel file from Databricks to SharePoint via Power Automate. The specific error message about being "Unable to translate bytes [9A] at index 11" suggests that Power Automate i...
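
As a sanity check outside Power Automate, a minimal sketch of calling the export API directly; the workspace URL, token, and notebook path are placeholders. The response carries base64-encoded content that must be decoded as raw bytes, not translated as text:

import base64
import requests

resp = requests.get(
    "https://<workspace-url>/api/2.0/workspace/export",
    headers={"Authorization": "Bearer <token>"},
    params={"path": "/Users/<me>/report", "format": "HTML"},
)
raw = base64.b64decode(resp.json()["content"])
with open("/tmp/report.html", "wb") as f:  # write bytes; no text decoding
    f.write(raw)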

804082
by New Contributor III
  • 5005 Views
  • 8 replies
  • 2 kudos

Resolved! DLT Direct Publishing Mode

Hello, I'm working on a DLT pipeline and have a block of SQL that runs...
USE CATALOG catalog_a;
USE SCHEMA schema_a;
CREATE OR REFRESH MATERIALIZED VIEW table_a AS SELECT ... FROM catalog_b.schema_b.table_b;
Executing this block returns the following...

Latest Reply
Dorsey
New Contributor II
  • 2 kudos

I'm in East US and I don't have that option on my previews page. Also, does it only work with serverless?

7 More Replies
moski
by New Contributor II
  • 12482 Views
  • 9 replies
  • 8 kudos

Databricks shortcut to split a cell

Is there a shortcut to split a cell into two in a Databricks notebook, as in a Jupyter notebook? In Jupyter it is Ctrl+Shift+Minus.

Latest Reply
Harshjot
Contributor III
  • 8 kudos

Hi @mundy Jim / All, attached are two snapshots: the first snapshot shows a single cell which, when you press Ctrl+Alt+Minus, is split into two.

8 More Replies
LearnDB1234
by New Contributor III
  • 1004 Views
  • 3 replies
  • 1 kudos

Resolved! How to Update Identity Column for a Databricks Table

Hi All, I have a Databricks table with the below DDL:
CREATE TABLE default.Test (
  ID BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
  StopFromDateTime TIMESTAMP,
  StopToDateTime TIMESTAMP,
  User STRING
) USING delta TBLPROPERTIE...

Latest Reply
pdiamond
Contributor
  • 1 kudos

If you recreate the table using BIGINT GENERATED BY DEFAULT instead of BIGINT GENERATED ALWAYS, you can manipulate the column values. "When using the clause GENERATED BY DEFAULT AS IDENTITY, insert operations can specify values for the identity column...
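
A minimal sketch of that recreate-and-insert flow, reusing the DDL from the post (note CREATE OR REPLACE drops the existing data, so back it up first):

spark.sql("""
    CREATE OR REPLACE TABLE default.Test (
        ID BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1),
        StopFromDateTime TIMESTAMP,
        StopToDateTime TIMESTAMP,
        User STRING
    ) USING delta
""")

# Explicit IDs are now accepted...
spark.sql("INSERT INTO default.Test (ID, User) VALUES (100, 'alice')")
# ...and omitting the ID still auto-generates one.
spark.sql("INSERT INTO default.Test (User) VALUES ('bob')")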

2 More Replies
ramyav7796
by New Contributor II
  • 758 Views
  • 1 reply
  • 0 kudos

Add custom logs and save them in a logs folder

Hi, I am trying to add custom logging functionality for my code. Please refer to the code I am using. I am trying to save my log files by creating a logs folder in my user workspace. My intent is to store a dynamic custom log file each time I run my n...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Here are some suggestions for your consideration. The issue with your custom logging setup seems to stem from attempting to save the log files in a path under "/Workspace/Users/ramya.v@point32health.org/CD/", which is not directly writable by your ...
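
A minimal sketch of one common workaround: write logs to the driver's local disk, then copy the finished file somewhere durable, here assumed to be a Unity Catalog volume (the volume path is a placeholder):

import logging
import shutil
from datetime import datetime

run_id = datetime.now().strftime("%Y%m%d_%H%M%S")
local_path = f"/tmp/run_{run_id}.log"

# /tmp on the driver is always writable, unlike the workspace folder.
logging.basicConfig(
    filename=local_path,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
logging.info("notebook started")

# Copy the finished log to durable storage at the end of the run.
shutil.copy(local_path, f"/Volumes/<catalog>/<schema>/<volume>/logs/run_{run_id}.log")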

