Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

najmead
by Contributor
  • 5200 Views
  • 2 replies
  • 1 kudos

Spark Settings in SQL Warehouse

I'm running a query, trying to parse a string into a map, and I get the following error: org.apache.spark.SparkRuntimeException: Duplicate map key was found, please check the input data. If you want to remove the duplicated keys, you can set "spark.s...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Nicholas Mead, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedbac...

1 More Replies
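
For anyone hitting the same error, a minimal sketch of the workaround the message points to, assuming the truncated setting is the session parameter spark.sql.mapKeyDedupPolicy (SQL warehouses only allow a limited set of configuration parameters, so this may require an interactive or job cluster):

# Runs in a Databricks notebook or any PySpark session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Default is EXCEPTION, which raises the "Duplicate map key" error quoted above;
# LAST_WIN keeps the last value seen for a repeated key.
spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")

# Example: the duplicate key 'a' no longer fails; the map keeps a -> 3.
spark.sql("SELECT str_to_map('a:1,b:2,a:3', ',', ':') AS m").show(truncate=False)
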
farooqurrehman
by New Contributor
  • 2070 Views
  • 3 replies
  • 2 kudos

Unable to connect/read files from ADLS Gen2 using account key

It gives the error [RequestId=5e57b66f-b69f-4e8b-8706-3fe5baeb77a0 ErrorClass=METASTORE_DOES_NOT_EXIST] "No metastore assigned for the current workspace." when using the following code: spark.conf.set(  "fs.azure.account.key.mystorageaccount.dfs.core.windows.net", ...

Latest Reply
jose_gonzalez
Databricks Employee
  • 2 kudos

Hi @Farooq ur rehman, just a friendly follow-up. Did any of the responses help you to resolve your question? If they did, please mark the best one as the accepted answer. Otherwise, please let us know if you still need help.

2 More Replies
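
For context on this one: the METASTORE_DOES_NOT_EXIST error class typically points at the workspace's Unity Catalog metastore assignment rather than at the storage key itself. A minimal sketch of the documented account-key access pattern being attempted, with placeholder storage-account, container, and secret names:

# Runs in a Databricks notebook, where spark and dbutils are predefined.
storage_account = "mystorageaccount"  # placeholder

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key"),  # avoid hard-coding the key
)

# Read straight from the abfss:// path; container and file path are placeholders.
df = spark.read.text(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/path/to/file.txt")
df.show(truncate=False)
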
KVNARK
by Honored Contributor II
  • 865 Views
  • 1 reply
  • 5 kudos

Accessing a secret from a Spark cluster

Passing Spark configuration to access Blob and ADLS from Data Factory while creating a job cluster. It's working fine, but when we reference a secret in the property it's not working: spark.hadoop.fs.azure.account.auth.type.{{secrets/scope/key}}.dfs.core.wi...

Latest Reply
sher
Valued Contributor II
  • 5 kudos

Check here: https://docs.databricks.com/security/secrets/secrets.html

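
To expand on the reply, a minimal sketch of where the secret reference belongs: the {{secrets/scope/key}} syntax documented at that link is resolved only in the value of a cluster Spark configuration property (or environment variable), not inside the property name; the name segment should be the storage account. The notebook-level equivalent fetches the secret with dbutils.secrets.get. The storage account, application ID, tenant ID, and scope/key names below are placeholders:

# Runs in a Databricks notebook, where spark and dbutils are predefined.
storage_account = "mystorageaccount"                           # placeholder
client_secret = dbutils.secrets.get(scope="scope", key="key")  # placeholder scope/key

# Standard OAuth (service principal) configuration for ADLS Gen2:
# the storage account goes in the property NAME, the secret only in a VALUE.
base = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{base}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{base}", "<application-id>")  # placeholder
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{base}", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{base}",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",                # placeholder tenant
)

In a job cluster's Spark config (the Data Factory case above), the same properties can be listed with {{secrets/scope/key}} placed in the value position.
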
hitesh1
by New Contributor III
  • 6987 Views
  • 1 reply
  • 5 kudos

java.util.NoSuchElementException: key not found

Hello, we are using Azure Databricks with a Standard DS14_v2 cluster on Runtime 9.1 LTS, Spark 3.1.2, and Scala 2.12, and we are facing the below issue frequently when running our ETL pipeline. As part of the operation that is failing there are several joins...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 5 kudos

Hey, please use these configurations in your cluster and it will work:
spark.sql.storeAssignmentPolicy LEGACY
spark.sql.parquet.binaryAsString true
spark.speculation false
spark.sql.legacy.timeParserPolicy LEGACY
If it won't work, let me know what problem...

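
For readers trying the suggestion above, a sketch of the same settings applied at session level in a notebook; whether they are appropriate depends on the pipeline, and they would normally go in the cluster's Spark config so executors pick them up as well:

# Runs in a Databricks notebook, where spark is predefined.
# These mirror the reply above and relax several Spark 3.x behaviour changes.
spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
spark.conf.set("spark.sql.parquet.binaryAsString", "true")
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
# spark.speculation is a core (non-SQL) setting that spark.conf.set will typically
# reject at runtime; put "spark.speculation false" in the cluster's Spark config instead.
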
ramankr48
by Contributor II
  • 2782 Views
  • 2 replies
  • 3 kudos

Issue with identity key column in Databricks?

For the identity key I've used both GENERATED ALWAYS AS IDENTITY (start with 1 increment by 1) and GENERATED BY DEFAULT AS IDENTITY (start with 1 increment by 1), but in both cases, if I'm running my script once then it is fine (identity key is working as...

Latest Reply
lizou
Contributor II
  • 3 kudos

Yes, the BY DEFAULT option allows duplicated values by design. I would avoid this option and use only GENERATED ALWAYS AS IDENTITY. Using the BY DEFAULT option is worse than not using it at all: with BY DEFAULT, if I forget to set the starting value, the ID...

1 More Replies
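
A minimal sketch of the distinction discussed above, using Delta Lake identity-column syntax; the table and column names are placeholders. With GENERATED ALWAYS, the engine assigns every value and rejects explicit inserts into the column, which avoids the duplicate-ID situation described for BY DEFAULT (identity values are unique but may contain gaps):

# Runs in a Databricks notebook, where spark is predefined.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo_identity (
        id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
        name STRING
    ) USING DELTA
""")

# Do not supply id; the engine assigns it.
spark.sql("INSERT INTO demo_identity (name) VALUES ('a'), ('b')")
spark.sql("SELECT * FROM demo_identity ORDER BY id").show()
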
alejandrofm
by Valued Contributor
  • 4333 Views
  • 3 replies
  • 3 kudos

Resolved! Delta, the specified key does not exist error

Hi, I'm having this error too frequently on a few tables. I check on S3 and the partition exists and the file is there in the partition.
error: Spectrum Scan Error: DeltaManifest
code: 15005
context: Error fetching Delta Lake manifest delta/product/sub_...

Latest Reply
alejandrofm
Valued Contributor
  • 3 kudos

@Hubert Dudek, I'll add that sometimes just running GENERATE symlink_format_manifest FOR TABLE schema.table solves it, but how can the symlink get broken? Thanks!

2 More Replies
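
Tying the thread together, a minimal sketch of the two manifest options mentioned: the one-off GENERATE command from the reply, and the documented table property that regenerates the manifest automatically after each write (assuming the table's readers can tolerate that). A common cause of the "specified key does not exist" symptom is a manifest that still points at data files a later write or VACUUM removed, which auto-regeneration helps avoid. schema.table is a placeholder:

# Runs in a Databricks notebook, where spark is predefined; schema.table is a placeholder.
# One-off fix: rebuild the symlink manifest that Redshift Spectrum reads.
spark.sql("GENERATE symlink_format_manifest FOR TABLE schema.table")

# Keep the manifest in sync automatically after every write to the table.
spark.sql("""
    ALTER TABLE schema.table
    SET TBLPROPERTIES (delta.compatibility.symlinkFormatManifest.enabled = true)
""")
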