Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

maddan80
by New Contributor II
  • 998 Views
  • 2 replies
  • 1 kudos

Oracle Essbase connectivity

Team, I wanted to understand the best way of connecting to Oracle Essbase to ingest data into Delta Lake.

Latest Reply
daniel_sahal
Esteemed Contributor
  • 1 kudos

@maddan80 I see that Essbase supports ODBC/JDBC connectors. Try utilizing one of those; a JDBC read sketch follows below.

1 More Replies
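
A minimal sketch of that suggestion, assuming an Essbase JDBC driver is installed on the cluster; the JDBC URL, source object, and secret scope/key names below are placeholders, not values confirmed for Essbase.

    # Sketch only: substitute the URL, driver options, and object names documented
    # for your Essbase JDBC driver; secrets come from a hypothetical "essbase" scope.
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:essbase://essbase-host:1423/Sample")        # placeholder URL
        .option("dbtable", "SAMPLE.BASIC")                               # placeholder source object
        .option("user", dbutils.secrets.get("essbase", "user"))
        .option("password", dbutils.secrets.get("essbase", "password"))
        .load()
    )

    # Land the extract as a Delta table in the bronze layer.
    df.write.format("delta").mode("append").saveAsTable("bronze.essbase_sample")
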
Harsha777
by New Contributor III
  • 1518 Views
  • 3 replies
  • 2 kudos

Resolved! Sub-query behavior in SQL statements

Hi Team, I have a query with the below construct in my project:
SELECT count(*) FROM `catalog`.`schema`.`t_table` WHERE _col_check IN (SELECT DISTINCT _col_check FROM `catalog`.`schema`.`t_check_table`)
Actually, there is no column "_col_check" in the sub-que...

Latest Reply
filipniziol
Esteemed Contributor
  • 2 kudos

Hi @Harsha777, what occurs is called column shadowing: the column names in the main query and the sub-query are identical, and when the Databricks engine does not find the column in the sub-query it resolves it against the main query (illustrated in the sketch below). The simplest way to avoid the...

2 More Replies
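
To make the fix filipniziol describes concrete, here is a short sketch using the table and column names from the post: aliasing both tables forces the inner reference to resolve against the sub-query's own table, so a missing column fails fast instead of silently correlating with the outer query.

    # With explicit aliases, `c._col_check` can only come from t_check_table;
    # if that column does not exist there, the query raises an analysis error
    # rather than reusing `_col_check` from the outer table.
    spark.sql("""
        SELECT count(*)
        FROM `catalog`.`schema`.`t_table` AS t
        WHERE t._col_check IN (
            SELECT DISTINCT c._col_check
            FROM `catalog`.`schema`.`t_check_table` AS c
        )
    """).show()
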
m_weirath
by New Contributor II
  • 898 Views
  • 2 replies
  • 0 kudos

DLT-META requires DDL when using cdc_apply_changes

We are setting up new DLT Pipelines using the DLT-Meta package. Everything is going well in bringing our data in from Landing to our Bronze layer when we keep the onboarding JSON fairly vanilla. However, we are hitting an issue when using the cdc_app...

Latest Reply
dbuser17
New Contributor II
  • 0 kudos

Please check these details: https://github.com/databrickslabs/dlt-meta/issues/90

1 More Replies
Vasu_Kumar_T
by New Contributor II
  • 568 Views
  • 1 reply
  • 0 kudos

Unity Catalog: Metastore 3-Level Hierarchy

I have data files categorized by application and region. I want to know the best way to load them into the Bronze and Silver layers while maintaining proper segregation. For example, in our landing zone, we have a structure of raw files to be loaded usi...

Latest Reply
Shazaamzaa
New Contributor III
  • 0 kudos

If I understand it correctly, you have source files partitioned by application and region in cloud storage that you want to load, and you would like some suggestions on the Unity Catalog structure (one possible layout is sketched below). It will definitely depend on how you want the data to be ...

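As one hedged illustration of that advice (all catalog, schema, table, and path names here are hypothetical): a catalog per application with a schema per layer, keeping region as a partition column rather than a separate container.

    from pyspark.sql.functions import lit

    # Hypothetical names: one catalog per application, one schema per layer.
    spark.sql("CREATE CATALOG IF NOT EXISTS app_sales")
    spark.sql("CREATE SCHEMA IF NOT EXISTS app_sales.bronze")
    spark.sql("CREATE SCHEMA IF NOT EXISTS app_sales.silver")

    # Load one application/region folder from the landing zone into a
    # region-partitioned Bronze table (placeholder storage path).
    (
        spark.read.format("parquet")
        .load("abfss://landing@storageacct.dfs.core.windows.net/app_sales/emea/")
        .withColumn("region", lit("emea"))
        .write.format("delta")
        .mode("append")
        .partitionBy("region")
        .saveAsTable("app_sales.bronze.orders")
    )
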
Sudheer89
by New Contributor
  • 1723 Views
  • 1 reply
  • 0 kudos

Where are the Data tab and DBFS in a Premium Databricks workspace?

Currently I can see a Catalog tab instead of a Data tab in the left-side navigation. I am unable to find the Data tab -> File browser where I would like to upload one sample orders CSV file. Later I want to refer to that path in Databricks notebooks as /FileStore/t...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Sudheer89, by default the DBFS tab is disabled. As an admin user, you can manage your users' ability to browse data in the Databricks File System (DBFS) using the visual browser interface. Go to the admin console. Click the Workspace Settings tab. In ...

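Once the DBFS browser is enabled and the CSV has been uploaded (uploads typically land under /FileStore/tables/), the file can be referenced from a notebook like this; the file name below is a placeholder:

    # Placeholder path: use whatever name the upload receives under /FileStore/tables/.
    orders_df = (
        spark.read.format("csv")
        .option("header", "true")
        .option("inferSchema", "true")
        .load("dbfs:/FileStore/tables/orders.csv")
    )
    display(orders_df)
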
valesexp
by New Contributor II
  • 1401 Views
  • 1 reply
  • 1 kudos

Enforce tags to Jobs

Does anyone know how I can enforce job tags (not the custom tags for clusters)? I want to enforce that jobs have certain tags so we can filter our jobs. We are not using Unity Catalog yet.

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Currently, enforcing job tags is not a built-in feature in Databricks. However, you can add tags to your jobs when creating or updating them and filter jobs by these tags on the jobs list page; a sketch of applying tags through the Jobs API follows below.

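Although enforcement is not built in, tags can at least be applied consistently from automation. A hedged sketch against the Jobs 2.1 update endpoint; the workspace URL, secret scope, job ID, and tag values are placeholders:

    import requests

    # Placeholders: workspace URL, token retrieval, and job_id come from your environment.
    host = "https://<workspace-url>"
    token = dbutils.secrets.get("cicd", "databricks_pat")

    resp = requests.post(
        f"{host}/api/2.1/jobs/update",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "job_id": 123456789,  # placeholder job id
            "new_settings": {"tags": {"team": "data-eng", "cost_center": "1234"}},
        },
    )
    resp.raise_for_status()

Running something like this from CI or a scheduled job over all job IDs is one way to keep tags consistent even without hard enforcement.
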
Nathant93
by New Contributor III
  • 1325 Views
  • 1 reply
  • 0 kudos

Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted.

Hi, I am getting the error "Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted" when using a serverless compute cluster. I have seen in some other articles that this is due to high concurrency - does anyone k...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

The error you're encountering, "Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted", typically arises when using a shared mode cluster. This is because Spark ML is not supported in shared clusters due to se...

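For reference, a minimal Bucketizer example that should run once the workload is moved to compute where Spark ML is supported (for example, single-user / dedicated access mode); the splits and column names are arbitrary:

    from pyspark.ml.feature import Bucketizer

    # Arbitrary example data and bucket boundaries.
    df = spark.createDataFrame([(-0.5,), (0.3,), (1.5,), (7.2,)], ["value"])

    bucketizer = Bucketizer(
        splits=[-float("inf"), 0.0, 1.0, float("inf")],
        inputCol="value",
        outputCol="bucket",
    )
    bucketizer.transform(df).show()
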
Padmaja
by New Contributor II
  • 684 Views
  • 1 reply
  • 0 kudos

Need Help with SCIM Provisioning URL and Automation

Hi Databricks Community, I'm working on setting up SCIM provisioning and need some assistance. SCIM Provisioning URL: Can anyone confirm the correct process to obtain the SCIM Provisioning URL from the Databricks account console? I need to ensure I'm re...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Which provider are you using? You can use the doc for Okta provisioning to guide you through the process https://docs.databricks.com/en/admin/users-groups/scim/okta.html

637858
by New Contributor II
  • 4935 Views
  • 2 replies
  • 3 kudos

How to prevent users from creating personal compute using a notebook?

A Databricks account administrator can disable account-wide access to the Personal Compute default policy using the following steps: Navigate to the Databricks Account Console. Click the “Settings” icon. Click the “Feature enablement” tab. Switch the...

Latest Reply
mggl
New Contributor II
  • 3 kudos

Is there no way to prevent use of the Personal Compute policy from a notebook? Or does my question even make sense? In other words, is it by design (and immutable) to have this policy when creating a notebook?

1 More Replies
DBMIVEN
by New Contributor II
  • 472 Views
  • 1 reply
  • 0 kudos

DLT streaming table showing more "Written records" than it is actually writing to the table

Hi! I have a DLT setup streaming data from incoming parquet files into bronze, silver and gold tables. There is a strange bug where, in the graph GUI, the number of written records for the gold streaming table is far greater than the actual data that ...

Attachments: 1.png, 2.png, graph.png
Latest Reply
DBMIVEN
New Contributor II
  • 0 kudos

Also, after running this for a while I get these errors:

Data_Alchemist
by New Contributor
  • 809 Views
  • 1 reply
  • 0 kudos

Unable to restore table version after enabling columnMapping

I cannot restore my Delta table to a previous version after I have enabled columnMapping. The following error is shown: DeltaColumnMappingUnsupportedException: Changing column mapping mode from 'name' to 'none' is not supported. Any idea how to roll b...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Data_Alchemist, in Databricks Runtime 15.3 and above, you can use the DROP FEATURE command to remove column mapping from a table and downgrade the table protocol. Then you can try to restore the table to a previous version; a sketch follows below. https://docs.databricks.com/en...

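A hedged sketch of that sequence (Databricks Runtime 15.3+); the table name and version number are placeholders, and dropping a table feature may additionally require waiting out the history retention window or truncating history:

    # Placeholder table name and version. Dropping the columnMapping feature
    # downgrades the table protocol; afterwards the restore can be retried.
    spark.sql("ALTER TABLE catalog.schema.my_table DROP FEATURE columnMapping")
    spark.sql("RESTORE TABLE catalog.schema.my_table TO VERSION AS OF 5")
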
NLearn
by New Contributor II
  • 1058 Views
  • 1 reply
  • 0 kudos

Databricks Alerts

Hello everyone, I have a requirement to set up Databricks alerts. I am struggling with the email body template. I was trying to create a button using a custom HTML template but it did not work out. Is there any sample template for alerts which can show in be...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @NLearn, it should look like this:

    <table>
      <tr><th>column1 header</th><th>column2 header</th></tr>
      {{#QUERY_RESULT_ROWS}}
      <tr>
        <td>{{column1}}</td>
        <td>{{column2}}</td>
      </tr>
      {{/QUERY_RESULT_ROWS}}
    </table>

If you're going to use custom templa...

JRDWaghmare
by New Contributor
  • 898 Views
  • 1 reply
  • 0 kudos

Conversion of SQL Server DDL to Databricks DDL

Hello All, I want some tool which can be used in Databricks to convert my SQL Server DDL to Databricks DDL. For example, I have one DDL in SQL Server as below: CREATE TABLE [schema1].[table_1] ([BranchKey] [int] NOT NULL, [Branch Code1] [int] NULL, [...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@JRDWaghmare The Remorph transpiler might be handy here: https://github.com/databrickslabs/remorph

NC
by New Contributor III
  • 1210 Views
  • 4 replies
  • 0 kudos

Moving files from ADLS to Unity Catalog External Volume

Hi All, I am trying to migrate files from ADLS to a newly created UC external volume. I tried using dbutils.fs.cp but it failed. May I know the right way to copy files, without any transformation, from ADLS to a UC external volume? Thank you

Latest Reply
navallyemul
New Contributor III
  • 0 kudos

Instead of copying your files from ADLS to UC volumes, you can create a storage credential and an external location (a sketch follows below). This allows you to access all your ADLS data directly through Catalog Explorer under external locations. For guidance on creating...

3 More Replies
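
A hedged sketch of that suggestion; the credential, external location name, storage URL, and volume path are all placeholders:

    # Register the ADLS container as an external location (placeholder names/URL).
    spark.sql("""
        CREATE EXTERNAL LOCATION IF NOT EXISTS adls_landing
        URL 'abfss://landing@storageacct.dfs.core.windows.net/'
        WITH (STORAGE CREDENTIAL my_storage_credential)
    """)

    # If a physical copy is still wanted, files can then be copied into a UC volume path.
    dbutils.fs.cp(
        "abfss://landing@storageacct.dfs.core.windows.net/raw/",
        "/Volumes/my_catalog/my_schema/my_volume/raw/",
        recurse=True,
    )
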
Yoshe1101
by New Contributor III
  • 4179 Views
  • 3 replies
  • 1 kudos

Resolved! Cluster terminated. Reason: Npip Tunnel Setup Failure

Hi, I have recently deployed a new workspace in AWS and I am getting the following error when trying to start the cluster: "NPIP tunnel setup failure during launch. Please try again later and contact Databricks if the problem persists. Instance bootstrap f...

Latest Reply
Yoshe1101
New Contributor III
  • 1 kudos

Finally, this error was fixed by changing the DHCP configuration of the VPC.

2 More Replies
