Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

Sebastian
by Contributor
  • 7900 Views
  • 3 replies
  • 1 kudos

How to access a Databricks secret in a global ini file

How to access a Databricks secret in a global ini file? The {{secrets/scope/key}} syntax doesn't work. Do I have to put that inside quotes?

Latest Reply
jose_gonzalez
Databricks Employee

Hi @SEBIN THOMAS​, I would like to share the docs here. Are you getting any error messages? As @Hubert Dudek​ mentioned, please share more details and the error message in case you are getting any.

2 More Replies
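For reference, a quick sketch of the two usual ways to reference a secret, assuming a Databricks notebook context and placeholder scope/key names:

# Inside a Databricks notebook, fetch the secret value at runtime:
password = dbutils.secrets.get(scope="my-scope", key="my-key")

# In a cluster Spark config (or an init-script-managed ini entry),
# the reference is written unquoted on its own line, e.g.:
#   spark.my.password {{secrets/my-scope/my-key}}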
Mohit_m
by Valued Contributor II
  • 2298 Views
  • 5 replies
  • 2 kudos

Which REST API to use in order to list the groups that belong to a specific user

Which REST API should be used in order to list the groups that belong to a specific user?

Latest Reply
jose_gonzalez
Databricks Employee

@Mohit Miglani​, make sure to select the best answer so the post moves to the top and helps other users who have this question in the future.

4 More Replies
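For anyone landing here, a hedged sketch of one approach: the SCIM Users endpoint returns a groups attribute per user, so fetching the user by name exposes its group memberships (workspace URL, token, and user name below are placeholders):

import requests

host = "https://<databricks-instance>"
token = "<personal-access-token>"

# The returned SCIM user resource includes a "groups" list.
resp = requests.get(
    f"{host}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {token}"},
    params={"filter": 'userName eq "user@example.com"'},
)
for user in resp.json().get("Resources", []):
    print([g["display"] for g in user.get("groups", [])])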
Nosa
by New Contributor II
  • 1906 Views
  • 3 replies
  • 4 kudos

Resolved! Adding Databricks to my application

I am developing an application with Python and Godot, and I want to use Databricks in it. How can I integrate Databricks into my application?

Latest Reply
jose_gonzalez
Databricks Employee

Hi @Ensiyeh Shojaei​, which cloud provider are you using? Depending on the cloud provider, there is a list of tools that can help you connect to and interact with Databricks from your application.

2 More Replies
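One common integration path, sketched here with the databricks-sql-connector package (hostname, HTTP path, and token are placeholders, not values from the thread), is to query Databricks over SQL from the application code:

from databricks import sql  # pip install databricks-sql-connector

# Placeholder connection details for a SQL warehouse or cluster.
with sql.connect(
    server_hostname="<workspace-host>",
    http_path="<http-path>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchall())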
schmit89
by New Contributor
  • 3079 Views
  • 1 reply
  • 1 kudos

Resolved! Downstream duration timeout

I'm trying to upload a 0.5 GB file for a school lab, and when I drag the file to DBFS it uploads for about 30 seconds and then I receive a downstream duration timeout error. What can I do to solve this issue?

Latest Reply
jose_gonzalez
Databricks Employee

Hi @Jason Schmit​, your file might be too large to upload through the upload interface (see the docs). I recommend splitting it up into smaller files. You can also use the DBFS CLI or dbutils to upload your file.

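To illustrate the splitting suggestion, a minimal local Python sketch that cuts a large file into smaller parts before uploading each one (file name and chunk size are arbitrary choices, not from the thread):

# Split a large file into ~100 MB parts for separate upload.
chunk_size = 100 * 1024 * 1024
with open("large_file.csv", "rb") as src:
    part = 0
    while chunk := src.read(chunk_size):
        with open(f"large_file.part{part}", "wb") as dst:
            dst.write(chunk)
        part += 1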
Raghav1
by New Contributor II
  • 8676 Views
  • 7 replies
  • 3 kudos

How to prevent a Databricks secret scope from exposing the value of a key that resides in Azure Key Vault?

I have created a key in Azure Key Vault to store my secrets in it. In order to use it securely in Azure Databricks, I have created a secret scope and configured the Azure Key Vault properties. Out of curiosity, I just wanted to check whether my key is ...

databricks issue
Latest Reply
prasadvaze
Valued Contributor II

@Kaniz Fatma​, is any fix coming soon for this? This is a big security loophole. The docs say that "To ensure proper control of secrets you should use Workspace object access control (limiting permission to run commands)" --- if I prevent access to ru...

6 More Replies
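For context, notebook output redacts secret values, which is easy to verify with a sketch like this (scope and key names are placeholders); the loophole discussed above is that anyone allowed to run commands can still transform the string to reveal it:

# Run inside a Databricks notebook; the literal value prints as [REDACTED].
secret = dbutils.secrets.get(scope="my-scope", key="my-key")
print(secret)               # -> [REDACTED]

# Trivial transformations are not redacted, which is the loophole:
print([c for c in secret])  # prints the characters one by one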
tigger
by New Contributor III
  • 3113 Views
  • 3 replies
  • 2 kudos

Resolved! Is it possible to disable retryWrites using .option()?

Hello everyone, I'm trying to write to DocumentDB using org.mongodb.spark:mongo-spark-connector_2.12:3.0.1. The DocumentDB is version 4, which doesn't support Retryable Writes, so I disabled the feature by setting the option "retryWrites" to "false" (also tried wit...

Latest Reply
Anonymous
Not applicable

@Hugh Vo​ - If Sajesh's answer resolved the issue, would you be happy to mark their answer as best?

2 More Replies
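A hedged workaround when the writer option appears to be ignored is to set retryWrites=false directly in the connection URI handed to the connector (host, credentials, and namespace below are placeholders):

# mongo-spark-connector 3.0.x write with retryWrites disabled in the URI;
# df is an existing DataFrame.
(df.write
   .format("mongo")
   .option("spark.mongodb.output.uri",
           "mongodb://user:pass@docdb-host:27017/mydb.mycoll?retryWrites=false")
   .mode("append")
   .save())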
Anonymous
by Not applicable
  • 1986 Views
  • 2 replies
  • 2 kudos

Resolved! Are notebooks encrypted even if no CMK is provided?

This document (https://docs.databricks.com/security/keys/customer-managed-keys-managed-services-aws.html) describes how to use a customer-managed key to encrypt notebooks in the control plane. We would like to verify: if no CMK is provided, are...

Latest Reply
Filippo-DB
Databricks Employee

Hello @Nathan Buesgens​, from a high-level point of view, by default, notebook source code and metadata in the control plane are encrypted at rest in AWS RDS using AWS KMS with a Databricks-managed key. But there is other data related to notebooks ...

1 More Replies
kpendergast
by Contributor
  • 3103 Views
  • 3 replies
  • 3 kudos

Resolved! How do I create a job for a notebook not in the /Users/ directory?

I am setting up a job to load data from S3 into Delta using Auto Loader. I can do this fine in interactive mode, but when trying to create a job in the UI, I can select the notebook in the root directory I created for the project within the create jo...

Latest Reply
User16844513407
New Contributor III

Hi @Ken Pendergast​, you are supposed to be able to reference any notebook you have the right permissions on, so it looks like you are running into a bug. Can you please reach out to support or email me directly with your workspace ID? My email is jan...

2 More Replies
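As a workaround sketch while the UI bug is looked at, the Jobs API accepts an arbitrary notebook path directly (workspace URL, token, cluster ID, and paths below are placeholders):

import requests

host = "https://<databricks-instance>"
token = "<personal-access-token>"

# notebook_path may point anywhere you have permissions, not just /Users/.
payload = {
    "name": "autoloader-job",
    "existing_cluster_id": "<cluster-id>",
    "notebook_task": {"notebook_path": "/MyProject/load_from_s3"},
}
resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
print(resp.json())  # expected: {"job_id": ...}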
yadsmc
by New Contributor II
  • 1999 Views
  • 3 replies
  • 0 kudos

Resolved! SQL Issues with 10.0 runtime

I was testing my SQL queries with the new 10.0 runtime and found something interesting/weird: the same SQL with the explode function fails in some scenarios on 10.0! I could not figure out the reason yet.

Latest Reply
BilalAslamDbrx
Databricks Employee

@Yadhuram MC​, if the issue persists, please email me at bilal dot aslam at databricks dot com. I would like to get to the root of this issue. It...

2 More Replies
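Since the thread includes no error message, a minimal reproduction along these lines (hypothetical data and names) is usually the quickest way to narrow down a runtime regression:

# Minimal explode repro to compare across runtime versions.
df = spark.createDataFrame([(1, ["a", "b"]), (2, ["c"])], ["id", "tags"])
df.createOrReplaceTempView("t")
spark.sql("SELECT id, explode(tags) AS tag FROM t").show()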
Anonymous
by Not applicable
  • 2094 Views
  • 2 replies
  • 2 kudos

Resolved! OPTIMIZE

I have been testing OPTIMIZE on a huge set of data (about 775 million rows) and getting mixed results. When I tried it on a 'string' column, the query returned in 2.5 minutes; using the same column as 'integer', with the same query, it returned in 9.7 seconds. Pl...

Latest Reply
Anonymous
Not applicable

@Werner Stinckens​  Thanks for your explanation.

1 More Replies
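For reference, a typical OPTIMIZE call looks like the sketch below (table and column names are placeholders). Data-skipping statistics and Z-ordering behave differently per data type, which is one plausible source of the mixed timings:

# Run from a Databricks notebook: compact the Delta table and
# co-locate data on the column used in filters.
spark.sql("OPTIMIZE my_table ZORDER BY (my_column)")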
Atharva_Salunke
by Contributor
  • 13189 Views
  • 20 replies
  • 10 kudos

Resolved! Haven't received Badge for Apache Spark 3.0 Associate Dev certification

I took my exam on 8/10/2021 and passed it. Subsequently I received my certificate, but I haven't received the badge associated with it yet. It's been almost 2 weeks since I received the certificate, and I have raised 2 requests through the main c...

Latest Reply
Anonymous
Not applicable

@Atharva Salunke​ - If you're referring to @Maithreyi Pagar​, they're all set.

19 More Replies
Anonymous
by Not applicable
  • 1703 Views
  • 0 replies
  • 0 kudos

Is the "patch"/update method of the repos API synchronous?

The repos API has a patch method to update a repo in the workspace (to do a git pull). We would like to verify: is this method fully synchronous? Is it guaranteed to only return a 200 after the update is complete? Or would immediately referenc...

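For anyone wanting to test this empirically, a hedged sketch of the call in question (workspace URL, token, and repo ID are placeholders); reading the repo state immediately after the PATCH returns is one way to probe whether the update is synchronous:

import requests

host = "https://<databricks-instance>"
token = "<personal-access-token>"
repo_id = "<repo-id>"

# PATCH updates the repo checkout (effectively a pull to the given branch).
resp = requests.patch(
    f"{host}/api/2.0/repos/{repo_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"branch": "main"},
)
print(resp.status_code, resp.json())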
RantoB
by Valued Contributor
  • 6477 Views
  • 8 replies
  • 3 kudos

Resolved! How to export a Databricks repos in dbc format with databricks CLI

Hi, how can I export a Databricks repository in DBC format with the Databricks CLI? It is possible to run "databricks workspace export_dir path/to/dir ." but not "databricks repos export_dir path/to/dir .". Thanks for your answers.

Latest Reply
Prabakar
Databricks Employee

@Bertrand BURCKER​, is your requirement to do it only from the CLI, or just to export the repos? If it is to export the repos, you can export them in DBC format from the UI.

7 More Replies
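A hedged alternative to the CLI is the Workspace export API, which accepts DBC as a format and can be pointed at a /Repos path (workspace URL, token, and paths are placeholders):

import base64
import requests

host = "https://<databricks-instance>"
token = "<personal-access-token>"

# Export a workspace path (including /Repos/...) as a DBC archive.
resp = requests.get(
    f"{host}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {token}"},
    params={"path": "/Repos/me/my-repo", "format": "DBC"},
)
with open("my-repo.dbc", "wb") as f:
    f.write(base64.b64decode(resp.json()["content"]))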
JK2021
by New Contributor III
  • 8210 Views
  • 10 replies
  • 5 kudos

Resolved! An unidentified special character is added to the outbound file when transformed in Databricks. Please help with suggestions.

Data from an external source is copied to ADLS, which then gets picked up by Databricks, and this massaged data is put in the outbound file. A special character ? (a question mark in a black diamond) is seen in some fields of the outbound file, which may br...

Latest Reply
-werners-
Esteemed Contributor III

Are you sure it is Databricks that puts the special character in place? It could also have happened during the copy from the external system to ADLS. If you use Azure Data Factory, for example, you have to define the encoding (UTF-8 or UTF-16, ...).

9 More Replies
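To illustrate the encoding point, the source charset is usually pinned when the file is read; a PySpark sketch (the path and the chosen encoding are assumptions about the source data):

# A question-mark-in-a-diamond (the Unicode replacement character, U+FFFD)
# typically appears when bytes are decoded with the wrong charset, so
# declare the source encoding explicitly at read time.
df = (spark.read
      .option("header", "true")
      .option("encoding", "ISO-8859-1")  # or "UTF-8", "UTF-16", ...
      .csv("/mnt/adls/inbound/file.csv"))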
MartinB
by Contributor III
  • 11329 Views
  • 8 replies
  • 9 kudos

Resolved! Is there a way to create a non-temporary Spark View with PySpark?

Hi, when creating a Spark view using Spark SQL ("CREATE VIEW ... AS SELECT ..."), this view is by default non-temporary: the view definition survives the Spark session as well as the Spark cluster. In PySpark I can use DataFrame.createOrReplaceTempView...

Latest Reply
Hubert-Dudek
Esteemed Contributor III

Why not create a managed table?

dataframe.write.mode("overwrite").saveAsTable("<example-table>")

# later, when we need the data
resultDf = spark.read.table("<example-table>")

7 More Replies
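To complement the managed-table suggestion, a non-temporary view can also be created from PySpark by issuing the same SQL the question mentions (view and table names are placeholders):

# A permanent view, stored in the metastore, created from PySpark.
spark.sql("CREATE OR REPLACE VIEW example_view AS SELECT * FROM example_table")
resultDf = spark.read.table("example_view")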

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group