Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Sudheer89
by New Contributor
  • 1107 Views
  • 1 reply
  • 0 kudos

Where are the Data tab and DBFS in a Premium Databricks workspace?

Currently I can see a Catalog tab instead of the Data tab in the left-side navigation. I am unable to find the Data tab -> File browser, where I would like to upload a sample orders CSV file. Later I want to refer to that path in Databricks notebooks as /FileStore/t...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Sudheer89, by default the DBFS tab is disabled. As an admin user, you can manage your users' ability to browse data in the Databricks File System (DBFS) using the visual browser interface:
1. Go to the admin console.
2. Click the Workspace Settings tab.
3. In ...

  • 0 kudos
valesexp
by New Contributor II
  • 836 Views
  • 1 reply
  • 1 kudos

Enforce tags to Jobs

Does anyone know how I can enforce job tags (not the custom tags for clusters)? I want to enforce that jobs have certain tags so we can filter our jobs. We are not using Unity Catalog yet.

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Currently, enforcing job tags is not a built-in feature in Databricks. However, you can add tags to your jobs when creating or updating them and filter jobs by these tags on the jobs list page.

  • 1 kudos
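Since there is no built-in enforcement, one workaround is a scheduled check that flags non-compliant jobs. The sketch below is illustrative: the required tag names are made up, and the dictionaries stand in for the `settings` objects a Jobs API listing would return, where tags appear as a key/value map.

```python
# Sketch: approximate job-tag enforcement with a periodic audit.
# REQUIRED_TAGS and the sample payloads are hypothetical; in practice
# you would feed missing_tags() the job settings returned by the
# Databricks Jobs API (GET /api/2.1/jobs/list).

REQUIRED_TAGS = {"team", "cost_center"}

def missing_tags(job_settings: dict, required=frozenset(REQUIRED_TAGS)) -> set:
    """Return the required tag keys absent from a job's settings."""
    tags = job_settings.get("tags") or {}
    return set(required) - set(tags)

# Example payloads shaped like Jobs API job settings (illustrative only)
jobs = [
    {"name": "ingest_orders", "tags": {"team": "data-eng", "cost_center": "42"}},
    {"name": "adhoc_export", "tags": {"team": "data-eng"}},
]

for job in jobs:
    gaps = missing_tags(job)
    if gaps:
        print(f"{job['name']} is missing tags: {sorted(gaps)}")
```

Running this on a schedule (and alerting on the output) gets you most of the way to enforcement until a native feature exists.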
Nathant93
by New Contributor III
  • 672 Views
  • 1 reply
  • 0 kudos

Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted.

Hi, I am getting the error "Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted." when using a serverless compute cluster. I have seen in some other articles that this is due to high concurrency - does anyone k...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

The error you're encountering, "Constructor public org.apache.spark.ml.feature.Bucketizer(java.lang.String) is not whitelisted", typically arises when using a shared mode cluster. This is because Spark ML is not supported in shared clusters due to se...

  • 0 kudos
Padmaja
by New Contributor II
  • 564 Views
  • 1 reply
  • 0 kudos

Need Help with SCIM Provisioning URL and Automation

Hi Databricks Community, I'm working on setting up SCIM provisioning and need some assistance. SCIM provisioning URL: can anyone confirm the correct process to obtain the SCIM provisioning URL from the Databricks account console? I need to ensure I'm re...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Which provider are you using? You can use the doc for Okta provisioning to guide you through the process: https://docs.databricks.com/en/admin/users-groups/scim/okta.html

  • 0 kudos
637858
by New Contributor II
  • 4457 Views
  • 2 replies
  • 3 kudos

How to prevent users from creating Personal Compute clusters using a notebook?

A Databricks account administrator can disable account-wide access to the Personal Compute default policy using the following steps:
1. Navigate to the Databricks Account Console.
2. Click the “Settings” icon.
3. Click the “Feature enablement” tab.
4. Switch the...

Latest Reply
mggl
New Contributor II
  • 3 kudos

Is there no way to prevent using the Personal Compute policy from a notebook? Or does my question make sense? In other words, is it by design/immutable to have this policy when creating a notebook?

  • 3 kudos
1 More Replies
DBMIVEN
by New Contributor II
  • 382 Views
  • 1 reply
  • 0 kudos

DLT streaming table showing more "Written records" than it's actually writing to the table

Hi! I have a DLT setup streaming data from incoming Parquet files into bronze, silver, and gold tables. There is a strange bug where, in the graph GUI, the number of written records for the gold streaming table is far greater than the actual data that ...

Attachments: 1.png, 2.png, graph.png
Latest Reply
DBMIVEN
New Contributor II
  • 0 kudos

Also, after running this for a while I get these errors: 

  • 0 kudos
Data_Alchemist
by New Contributor
  • 582 Views
  • 1 reply
  • 0 kudos

Unable to restore table version after enabling columnMapping

I cannot restore my Delta table to a previous version after enabling columnMapping. The following error is shown: DeltaColumnMappingUnsupportedException: Changing column mapping mode from 'name' to 'none' is not supported. Any idea how to roll b...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Data_Alchemist, in Databricks Runtime 15.3 and above, you can use the DROP FEATURE command to remove column mapping from a table and downgrade the table protocol. Then you can try to restore the table to a previous version: https://docs.databricks.com/en...

  • 0 kudos
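As a rough sketch of that two-step rollback, the helper below just builds the SQL statements (table name and version number are placeholders; verify the exact DROP FEATURE syntax against the Delta docs for your runtime). On Databricks you would pass each statement to spark.sql().

```python
# Sketch only: builds the SQL for dropping column mapping and then
# restoring an earlier table version on DBR 15.3+. The statement forms
# are assumptions to check against the Delta Lake docs.

def rollback_statements(table: str, version: int) -> list:
    """Return the SQL statements, in order, for the rollback."""
    return [
        f"ALTER TABLE {table} DROP FEATURE columnMapping",
        f"RESTORE TABLE {table} TO VERSION AS OF {version}",
    ]

# On Databricks (hypothetical table name):
# for stmt in rollback_statements("main.sales.orders", 12):
#     spark.sql(stmt)
```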
NLearn
by New Contributor II
  • 700 Views
  • 1 reply
  • 0 kudos

Databricks Alerts

Hello everyone, I have a requirement to set up Databricks alerts. I am struggling with the email body template. I was trying to create a button using a custom HTML template, but it did not work out. Is there any sample template for alerts which can show in be...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @NLearn, it should look like this:

<table>
  <tr><th>column1 header</th><th>column2 header</th></tr>
  {{#QUERY_RESULT_ROWS}}
  <tr>
    <td>{{column1}}</td>
    <td>{{column2}}</td>
  </tr>
  {{/QUERY_RESULT_ROWS}}
</table>

If you're going to use custom templa...

  • 0 kudos
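To preview what that {{#QUERY_RESULT_ROWS}} section expands to, here is a small pure-Python sketch that builds the same HTML table from sample rows. It only mimics the expansion locally; the real rendering is done by Databricks when the alert fires, and the column names are made up.

```python
# Illustrative only: reproduces the HTML the custom alert template
# would produce, so you can eyeball the table before wiring up the alert.

def render_alert_table(rows, columns):
    """Build an HTML table like the alert template's row section expands to."""
    header = "".join(f"<th>{c}</th>" for c in columns)
    body = "".join(
        "<tr>" + "".join(f"<td>{r[c]}</td>" for c in columns) + "</tr>"
        for r in rows
    )
    return f"<table><tr>{header}</tr>{body}</table>"

html = render_alert_table(
    rows=[{"column1": "a", "column2": 1}, {"column1": "b", "column2": 2}],
    columns=["column1", "column2"],
)
print(html)
```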
JRDWaghmare
by New Contributor
  • 623 Views
  • 1 reply
  • 0 kudos

Conversion of SQL Server DDL to Databricks DDL

Hello All, I want a tool that can be used in Databricks to convert my SQL Server DDL to Databricks DDL. For example, I have one DDL in SQL Server as below:

CREATE TABLE [schema1].[table_1](
  [BranchKey] [int] NOT NULL,
  [Branch Code1] [int] NULL,
  [...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@JRDWaghmare The ReMorph transpiler might be handy here: https://github.com/databrickslabs/remorph

  • 0 kudos
NC
by New Contributor III
  • 805 Views
  • 4 replies
  • 0 kudos

Moving files from ADLS to Unity Catalog External Volume

Hi All, I am trying to migrate files from ADLS to a newly created UC external volume. I tried using dbutils.fs.cp but it failed. What is the right way to copy files, without any transformation, from ADLS to a UC external volume? Thank you.

Latest Reply
navallyemul
New Contributor III
  • 0 kudos

Instead of copying your files from ADLS to UC volumes, you can create a storage credential and an external location. This allows you to access all your ADLS data directly through the catalog explorer under external locations. For guidance on creating...

  • 0 kudos
3 More Replies
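If you do need to copy rather than reference the data in place, UC volumes are exposed as ordinary paths under /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;, so standard Python file APIs work from a notebook. A minimal sketch, assuming the source is already readable as a local path (the example paths in the comment are hypothetical; dbutils.fs.cp with recurse=True is the other common route):

```python
# Sketch: copy a directory tree into a UC volume path using plain
# Python. Works because UC volumes appear as normal filesystem paths
# in notebooks; the example paths below are hypothetical.
import shutil
from pathlib import Path

def copy_into_volume(src_dir: str, volume_dir: str) -> int:
    """Copy every file under src_dir into volume_dir; return file count."""
    dst = Path(volume_dir)
    dst.mkdir(parents=True, exist_ok=True)
    count = 0
    for f in Path(src_dir).rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src_dir)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
            count += 1
    return count

# e.g. copy_into_volume("/dbfs/mnt/landing/orders", "/Volumes/main/raw/orders")
```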
Yoshe1101
by New Contributor III
  • 3499 Views
  • 3 replies
  • 1 kudos

Resolved! Cluster terminated. Reason: Npip Tunnel Setup Failure

Hi, I have recently deployed a new workspace in AWS and am getting the following error when trying to start the cluster: "NPIP tunnel setup failure during launch. Please try again later and contact Databricks if the problem persists. Instance bootstrap f...

Latest Reply
Yoshe1101
New Contributor III
  • 1 kudos

Finally, this error was fixed by changing the DHCP configuration of the VPC.

  • 1 kudos
2 More Replies
Vishalkhode1206
by New Contributor III
  • 449 Views
  • 1 reply
  • 0 kudos

Certification discount voucher

Hi, could you please provide the information on how to get a free voucher for the Data Engineer Associate Exam, which I am planning to take this week? Thanks in advance.

Latest Reply
florence023
New Contributor III
  • 0 kudos

@Vishalkhode1206 wrote: Hi, could you please provide the information to get a free voucher for the Data Engineer Associate Exam that I am planning for this week? Thanks in advance.

Hello, I'm also looking for information on how to get a free v...

  • 0 kudos
Gubbanoa
by New Contributor II
  • 656 Views
  • 3 replies
  • 0 kudos

Pycharm and Unit Testing

What is currently the best way of doing unit testing from PyCharm against Databricks? I have previously used Databricks Connect. However, after upgrades, and now that even Unity Catalog has become a requirement, it appears quirky. Is it possible to use th...

Latest Reply
florence023
New Contributor III
  • 0 kudos

@Gubbanoa wrote: What is currently the best way of doing unit testing from PyCharm against Databricks? I have previously used Databricks Connect. However, after upgrades, and now that even Unity Catalog has become a requirement, it appears quirky. Is it po...

  • 0 kudos
2 More Replies
guangyi
by Contributor III
  • 957 Views
  • 1 reply
  • 0 kudos

Creating a job under the all-purpose cluster type policy always fails

Here is the policy I just created:

{
  "node_type_id": {
    "defaultValue": "Standard_D8s_v3",
    "type": "allowlist",
    "values": [
      "Standard_D8s_v3",
      "Standard_D16s_v3"
    ]
  },
  "num_workers": {...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @guangyi, this may be related to the fact that DAB does not support this type of cluster. Unfortunately, this is not very well documented, but look at the thread below. This feature has been requested and should be available in the future: Creating All Pu...

  • 0 kudos
luiz_santana
by New Contributor
  • 573 Views
  • 1 reply
  • 0 kudos

Create external location referencing another cloud.

I have an Azure Databricks workspace, but my data lake is in another cloud (AWS). Is it possible to create an external location in Azure Databricks pointing to a container in an S3 bucket?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @luiz_santana, this is currently not supported. However, you can reference data in an S3 bucket using the method below: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/amazon-s3

  • 0 kudos
