Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

jamesdavids
by New Contributor
  • 561 Views
  • 1 reply
  • 0 kudos

Spark Excel Library Insufficient Privileges

Hi, we have a shared access mode cluster in which we have installed a Maven library for reading Excel files into a Spark DataFrame. When using an account with admin rights everything works fine; however, when we run it as a standard user we always get `o...

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

Hello James, this issue seems to be related to shared cluster limitations, as per the docs: Libraries used as JDBC drivers or custom Spark data sources on Unity Catalog-enabled shared compute require ANY FILE permissions. Moving to a single user cluster m...

  • 0 kudos
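As a rough sketch of what the reply points to (assuming the Maven library is the crealytics spark-excel data source and that an admin can grant the ANY FILE privilege), something along these lines may help; the group name and file path below are placeholders, not taken from the thread:

```python
# Sketch only: grant the ANY FILE permission the reply refers to, then read an
# Excel file with the spark-excel data source on the shared cluster.
# The group name and path are examples; run the GRANT as an admin.
spark.sql("GRANT SELECT ON ANY FILE TO `data-engineers`")  # example group

df = (
    spark.read.format("com.crealytics.spark.excel")  # Maven: com.crealytics:spark-excel
    .option("header", "true")
    .option("inferSchema", "true")
    .load("/mnt/raw/report.xlsx")  # example path
)
display(df)
```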
JameDavi_51481
by New Contributor III
  • 479 Views
  • 1 reply
  • 0 kudos

How to escape column comments when adding them programmatically

I would like to add comments to all of our columns, programmatically. The only way I can find to do this is through SQL DDL - e.g. `alter table sometable alter column somecolumn comment 'some comment string'`. However, I want to read this comment stri...

Latest Reply
brockb
Valued Contributor
  • 0 kudos

Hi @JameDavi_51481, I do not believe this is supported currently; please see this line in the Parameter Marker documentation: > You must not reference a parameter marker in a DDL statement... Ref: https://docs.databricks.com/en/sql/language-manual/sql...

  • 0 kudos
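Since parameter markers cannot be used in DDL, a common workaround is to build the ALTER TABLE statement yourself and escape the comment text. A minimal sketch, with hypothetical table/column names and example comment data:

```python
# Sketch: apply comments read from an external source by constructing the DDL
# string directly, since parameter markers are not allowed in DDL statements.
# Doubling single quotes is the standard SQL escape; adjust further if your
# comments may contain backslashes.
comments = {"some_table": {"some_column": "customer's preferred name"}}  # example data

for table, cols in comments.items():
    for column, comment in cols.items():
        escaped = comment.replace("'", "''")
        spark.sql(f"ALTER TABLE {table} ALTER COLUMN {column} COMMENT '{escaped}'")
```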
amit-agarwal4
by New Contributor II
  • 777 Views
  • 2 replies
  • 0 kudos

How to clone a table using column and row level masks

I am working on a use case to clone a table which has column and row level masking implemented. Can I clone the table? Based on the documentation it is not possible. What other alternatives do I have?

Latest Reply
raphaelblg
Honored Contributor
  • 0 kudos

@amit-agarwal4 At the current moment, I'm not aware of any other way than disabling the RLS.     

  • 0 kudos
1 More Replies
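A hedged sketch of the approach the reply implies (temporarily dropping the row filter and column masks, cloning, then reapplying the policies); the table, column, and masking function names are examples only:

```python
# Sketch of the "drop the policies, clone, then reapply" approach.
# Names are hypothetical; reapply the policies on the clone too if required.
spark.sql("ALTER TABLE cat.sch.src_table DROP ROW FILTER")
spark.sql("ALTER TABLE cat.sch.src_table ALTER COLUMN email DROP MASK")

spark.sql("CREATE OR REPLACE TABLE cat.sch.src_table_clone DEEP CLONE cat.sch.src_table")

# Reapply the policies on the source table afterwards.
spark.sql("ALTER TABLE cat.sch.src_table SET ROW FILTER cat.sch.region_filter ON (region)")
spark.sql("ALTER TABLE cat.sch.src_table ALTER COLUMN email SET MASK cat.sch.mask_email")
```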
tejas8196
by New Contributor II
  • 822 Views
  • 2 replies
  • 0 kudos

DAB not updating zone_id when redeployed

Hey folks, facing an issue with zone_id not getting overridden when redeploying the DAB template to the Databricks workspace. The Databricks job is already deployed and has the "ap-south-1a" zone_id. I wanted to make it "auto", so I have made the changes to th...

Latest Reply
davicd2658
New Contributor II
  • 0 kudos

Hello, thanks for the info. I will try to figure out more.

  • 0 kudos
1 More Replies
Prasad_Koneru
by New Contributor III
  • 729 Views
  • 1 reply
  • 0 kudos

Databricks grants update catalog catalog_name --json @privileges.json not updating privileges

Hi Team, I am trying to update the catalog permission privileges using the Databricks CLI grants command by passing a JSON file, but it is not updating the privileges. Please help with the grants update command usage. Command used: databricks grants update c...

Latest Reply
Ravivarma
New Contributor III
  • 0 kudos

Hello @Prasad_Koneru, if the command is not updating the privileges as expected, there could be a few reasons for this. Firstly, ensure that the JSON file is correctly formatted and contains the correct privilege assignments. The privileges.json fi...

  • 0 kudos
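For reference, a sketch of the JSON shape the grants command generally expects; the principal and privilege names below are examples and should be checked against the Unity Catalog grants documentation for your CLI version:

```python
# Sketch: write a privileges.json in the "changes" shape the grants API expects,
# then apply it with the CLI. Principal and privilege spellings are examples;
# verify them against the Unity Catalog grants docs for your CLI version.
import json

changes = {
    "changes": [
        {
            "principal": "data-engineers",
            "add": ["USE_CATALOG", "USE_SCHEMA", "SELECT"],
            "remove": [],
        }
    ]
}

with open("privileges.json", "w") as f:
    json.dump(changes, f, indent=2)

# Then, from a shell:
#   databricks grants update catalog my_catalog --json @privileges.json
#   databricks grants get catalog my_catalog   # confirm the change was applied
```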
JamesY
by New Contributor III
  • 1539 Views
  • 6 replies
  • 0 kudos

Resolved! Retrieve error column/row when writing to sqlmi

Databricks notebook, Scala, .read() .write(), source data CSV. Got this error when trying to write data to SQL MI. I understand the error indicates that one of the source column values exceeded the length of the database column. But the message is...

Community Platform Discussions
Databricks
Scala
SqlMi
Latest Reply
brockb
Valued Contributor
  • 0 kudos

Thanks JamesY, I'm not familiar with the limitations of the SQL Server `nvarchar` data type but is there a way that we can filter out the rows that will fail using spark such as: spark.read.format("csv").option("header", "true").load("/path/to/csvs")...

  • 0 kudos
5 More Replies
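Expanding the reply's suggestion into a runnable sketch: split off the rows whose values would not fit the target nvarchar column and write only the rows that fit. The column name and the 50-character limit are assumptions:

```python
# Sketch of the filtering idea from the reply: keep only rows whose string value
# fits the target nvarchar(N) column, and set aside the rest for inspection.
# Column name and the 50-character limit are examples.
from pyspark.sql import functions as F

df = spark.read.format("csv").option("header", "true").load("/path/to/csvs")

max_len = 50  # e.g. nvarchar(50) in the target SQL MI table
valid = df.filter(F.length(F.col("description")) <= max_len)
rejected = df.filter(F.length(F.col("description")) > max_len)  # log or review these

# valid.write.format("jdbc")...  # then write only the rows that fit
```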
claudiazi
by New Contributor II
  • 1402 Views
  • 5 replies
  • 1 kudos

[TABLE_OR_VIEW_ALREADY_EXISTS] when running create or replace view

When I run `dbt run -s model` for different models in parallel on a Databricks general compute cluster, I'm getting the error: TABLE_OR_VIEW_ALREADY_EXISTS. Also, at the same time, the view/table was not created at all. But when I run them in sequence, ...

Latest Reply
claudiazi
New Contributor II
  • 1 kudos

@raphaelblg yes! I'm the owner. However, these views are inside the hive_metastore catalog. Could that be the reason? Many thanks!

  • 1 kudos
4 More Replies
hukel
by Contributor
  • 2381 Views
  • 2 replies
  • 0 kudos

InconsistentReadException: The file might have been updated during query - CSV backed table

I have some CSV files that I upload to DBFS storage several times a day.   From these CSVs,  I have created SQL tables: CREATE TABLE IF NOT EXISTS masterdata.lookup_host USING CSV OPTIONS (header "true", inferSchema "true") LOCATION '/mnt/masterdata/...

Latest Reply
hukel
Contributor
  • 0 kudos

One approach I'm testing (positive results so far, but still early):
%sql
-- Prep and cleanup
REFRESH TABLE masterdata.lookup_host;
DROP TABLE IF EXISTS t_hosts;
-- Forcibly cache the needed columns before using the data in another query.
CACHE TABLE...

  • 0 kudos
1 More Replies
egndz
by New Contributor II
  • 3429 Views
  • 3 replies
  • 0 kudos

Cluster Memory Issue (Termination)

Hi, I have a single-node personal cluster with 56GB memory (Node type: Standard_DS5_v2, runtime: 14.3 LTS ML). The same configuration is used for the job cluster as well, and the following problem applies to both clusters. To start with: once I start my ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @egndz, It seems like you’re dealing with memory issues in your Spark cluster, and I understand how frustrating that can be. Initial Memory Allocation: The initial memory allocation you’re observing (18 GB used + 4.1 GB cached) is likely a com...

  • 0 kudos
2 More Replies
SamGreene
by Contributor
  • 549 Views
  • 1 reply
  • 0 kudos

String to date conversion errors

Hi, I am getting data from CDC on SQL Server using Informatica, which is writing parquet files to ADLS. I read the parquet files using DLT and end up with the date data as a string such as this: '20240603164746563'. I couldn't get this to convert using m...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @SamGreene, You’re on the right track with using the TO_TIMESTAMP function. However, you might be encountering issues due to the format of your timestamp string. The string ‘20240603164746563’ seems to represent a timestamp down to the millisecond...

  • 0 kudos
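A small sketch of the parsing idea, assuming the string is yyyyMMddHHmmss followed by three millisecond digits; parsing the first 14 characters is the dependable part, and the milliseconds can be re-added separately if they matter. The column name is an example:

```python
# Sketch: parse '20240603164746563' as yyyyMMddHHmmss plus 3 millisecond digits.
# Only the first 14 characters are fed to to_timestamp, which sidesteps
# millisecond-pattern quirks; the trailing digits can be handled separately.
from pyspark.sql import functions as F

df = spark.createDataFrame([("20240603164746563",)], ["cdc_ts"])

parsed = df.withColumn(
    "event_ts",
    F.to_timestamp(F.col("cdc_ts").substr(1, 14), "yyyyMMddHHmmss"),
)
parsed.show(truncate=False)
```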
traillog
by New Contributor
  • 505 Views
  • 1 reply
  • 0 kudos

Unzip multiple zip files in databricks

I have a zip file which in turn has multiple zip files inside it. I tried to write code in a Databricks notebook to unzip all these files at once, but I ran into an error. So I started to unzip these one by one, but the code which worked in unzipping...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @traillog, did you try this?

  • 0 kudos
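In case it helps future readers, a minimal sketch of unzipping a zip-of-zips with Python's standard zipfile module; the paths are placeholders and assume the /dbfs/ local mount is available on the cluster:

```python
# Sketch: extract a zip that itself contains zip files, one level at a time.
# Paths are examples; the /dbfs/... local mount is assumed so that the standard
# zipfile module can read the files directly.
import os
import zipfile


def extract_nested(zip_path: str, dest_dir: str) -> None:
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
    # Recurse into any zip files that were just extracted.
    for root, _, files in os.walk(dest_dir):
        for name in files:
            if name.lower().endswith(".zip"):
                inner = os.path.join(root, name)
                extract_nested(inner, os.path.splitext(inner)[0])


extract_nested("/dbfs/FileStore/uploads/outer.zip", "/dbfs/FileStore/uploads/extracted")
```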
rodrigosanchesz
by New Contributor II
  • 392 Views
  • 0 replies
  • 0 kudos

Change adls gen2 attached to Unity Catalog metastore from premium to standard

Hello, our cloud platform engineer created Azure storage for the production Unity Catalog metastore in our environment, but mistakenly chose the Premium tier instead of the Standard tier. Unfortunately, this decision is impacting our costs on Azure, as...

Ender
by New Contributor II
  • 1188 Views
  • 3 replies
  • 1 kudos

Resolved! Accessing ADLS Gen2 related Hadoop configuration in notebook

I have a cluster in which I have the required configuration to access an ADLS Gen2 account, and it works without any problems. I want to access this storage using the Hadoop filesystem APIs. To achieve this, I am trying to get the Hadoop configuration from th...

Latest Reply
Ender
New Contributor II
  • 1 kudos

By the way, how do you achieve inline code highlighting in the editor? I tried `` but it didn't work.

  • 1 kudos
2 More Replies
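For anyone landing here from search, a sketch of reading the effective Hadoop configuration from a Python notebook; `_jsc` is an internal PySpark handle rather than a public API, and the storage account name below is a placeholder:

```python
# Sketch: read the Hadoop Configuration the cluster settings are merged into.
# `_jsc` is an internal PySpark attribute (not a public API), but it exposes the
# underlying org.apache.hadoop.conf.Configuration object.
hconf = spark.sparkContext._jsc.hadoopConfiguration()

# Cluster Spark confs prefixed with `spark.hadoop.` appear here without the prefix.
auth_type = hconf.get("fs.azure.account.auth.type.mystorageacct.dfs.core.windows.net")
print(auth_type)

# The same Configuration can be handed to the Hadoop FileSystem API, e.g.:
# fs = spark._jvm.org.apache.hadoop.fs.FileSystem.get(
#     spark._jvm.java.net.URI("abfss://container@mystorageacct.dfs.core.windows.net/"),
#     hconf,
# )
```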
Shawn_Eary
by Contributor
  • 559 Views
  • 1 reply
  • 2 kudos

Resolved! DAIS24 Attendee Badge - Oops

I wasn't at DAIS24, but I received one of these emails and it appears to have come from Databricks. I think I was given the badge in error. Can we have it removed? Just wondering, Shawn

Latest Reply
Sujitha
Community Manager
  • 2 kudos

Hi @Shawn_Eary, thank you for bringing this to our attention. We were conducting a test on badges, and some of our community members may have received this email by accident. We have now reverted the change, so please ignore the message.

  • 2 kudos

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group