Data Engineering

Forum Posts

Anonymous
by Not applicable
  • 1009 Views
  • 2 replies
  • 0 kudos

Resolved! Is the "patch"/update method of the repos API synchronous?

The Repos API has a patch method to update a repo in the workspace (to do a git pull). We would like to verify: is this method fully synchronous? Is it guaranteed to only return a 200 after the update is complete? Or, would immediately referenc...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @nate_at_lovelytics, on success the PATCH HTTP method returns the 200 OK response code.
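For illustration, a minimal Scala sketch of that call (host, token, and repo ID below are hypothetical; it uses Java 11's HttpClient). The idea is to only proceed once the 200 OK comes back, since that is what signals the update finished:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Hypothetical values: replace with your workspace URL, a personal access token, and the repo ID.
val host = "https://adb-1234567890123456.7.azuredatabricks.net"
val token = sys.env("DATABRICKS_TOKEN")
val repoId = 123456789L

// PATCH /api/2.0/repos/{repo_id} checks out the given branch (i.e. performs the pull).
val request = HttpRequest.newBuilder()
  .uri(URI.create(s"$host/api/2.0/repos/$repoId"))
  .header("Authorization", s"Bearer $token")
  .header("Content-Type", "application/json")
  .method("PATCH", HttpRequest.BodyPublishers.ofString("""{"branch": "main"}"""))
  .build()

val response = HttpClient.newHttpClient()
  .send(request, HttpResponse.BodyHandlers.ofString())

// Treat anything other than 200 OK as a failed or incomplete update before touching the repo files.
require(response.statusCode() == 200, s"Repo update failed: ${response.body()}")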

1 More Replies
Anonymous
by Not applicable
  • 1137 Views
  • 2 replies
  • 0 kudos

Resolved! On-prem DNS - entries to be added

Hello, I have a question about the DNS addresses Databricks uses. In our network configuration, we are using custom VNet injection with no public IPs, and are required to use on-premises corporate DNS. Therefore we would like to add the necessary entries to on-...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Wlodek Bielski, please refer to the doc: https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/on-prem-network

1 More Replies
sriwin
by New Contributor
  • 1867 Views
  • 2 replies
  • 0 kudos

Resolved! Create a gpg file and save it to AWS S3 storage in Scala

Hi - could you please help me with how I can create a Scala notebook to perform the below tasks: encrypt a text file using gpg; upload the file to Amazon S3 storage; verify the file exists in Amazon S3; decrypt the encrypted file to verify no issues. Apprec...
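A rough sketch of one way to wire those steps together from a Scala notebook, shelling out to the gpg CLI and using the AWS SDK v1 S3 client. It assumes gpg is installed on the driver, the recipient's public key is already imported, and the cluster has S3 credentials; the bucket, key, recipient, and paths below are hypothetical:

import java.io.File
import scala.sys.process._
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.model.GetObjectRequest

val plainPath  = "/dbfs/tmp/data.txt"        // hypothetical input file
val cipherPath = plainPath + ".gpg"
val bucket     = "my-example-bucket"         // hypothetical bucket
val key        = "incoming/data.txt.gpg"

// 1) Encrypt with gpg (the recipient key must already be in the driver's keyring).
Seq("gpg", "--batch", "--yes", "--recipient", "ops@example.com", "--output", cipherPath, "--encrypt", plainPath).!

// 2) Upload the encrypted file to S3.
val s3 = AmazonS3ClientBuilder.standard().build()
s3.putObject(bucket, key, new File(cipherPath))

// 3) Verify the object exists in S3.
require(s3.doesObjectExist(bucket, key), s"s3://$bucket/$key not found")

// 4) Download and decrypt to confirm the round trip.
s3.getObject(new GetObjectRequest(bucket, key), new File("/tmp/check.gpg"))
Seq("gpg", "--batch", "--yes", "--output", "/tmp/check.txt", "--decrypt", "/tmp/check.gpg").!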

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @sriwin p, there is encrypt/decrypt file test-case code in a PR to the https://github.com/sbt/sbt-pgp repository. It provides an example of PGP file encryption/decryption: package com.jsuereth.pgp import org.specs2.mutable._ im...

1 More Replies
krishnakash
by New Contributor II
  • 2139 Views
  • 6 replies
  • 4 kudos

Resolved! Is there any way of determining the last stage of SparkSQL application execution?

I have created custom UDFs that generate logs. These logs can be flushed by calling another API which is exposed by an internal layer. However, I want to call this API just after the execution of the UDF comes to an end. Is there any way of d...

Latest Reply
User16763506586
Contributor
  • 4 kudos

@Krishna Kashiv Maybe ExecutorPlugin.java can help. It has all the methods you might require. Let me know if it works or not. You need to implement the interface org.apache.spark.api.plugin.SparkPlugin and expose it as spark.plugins = com.abc.Imp...
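As a rough illustration of that suggestion, a Scala sketch of a Spark 3.x plugin whose executor side flushes logs on shutdown. FlushApi is a hypothetical stand-in for the internal flushing API; the compiled class would be packaged in a jar on the cluster classpath and registered via spark.plugins:

import java.util.{Map => JMap}
import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, PluginContext, SparkPlugin}

// Hypothetical stand-in for the internal layer that flushes the UDF logs.
object FlushApi {
  def flush(): Unit = println("flushing UDF logs")
}

class LogFlushPlugin extends SparkPlugin {
  // No driver-side behaviour needed for this sketch.
  override def driverPlugin(): DriverPlugin = null

  override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {
    override def init(ctx: PluginContext, extraConf: JMap[String, String]): Unit = ()
    // Called when the executor shuts down, i.e. after all task/UDF execution has finished.
    override def shutdown(): Unit = FlushApi.flush()
  }
}

// Register in the cluster Spark config, e.g.: spark.plugins com.abc.LogFlushPlugin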

5 More Replies
krishnakash
by New Contributor II
  • 2267 Views
  • 2 replies
  • 1 kudos

Resolved! How to provide a custom class extending SparkPlugin/ExecutorPlugin in Databricks 7.3?

How do I properly configure the jar containing the class and the Spark plugin in Databricks? During DBR 7.3 cluster creation, I tried setting the spark.plugins, spark.driver.extraClassPath and spark.executor.extraClassPath Spark configs by copying the ja...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hello @Krishna Kashiv​ - I don't know if we've met yet. My name is Piper and I'm a community moderator here. Thank you for your new question. It looks thorough! Let's give it a while to see what our members have to say. Otherwise, we will circle back...

1 More Replies
bdc
by New Contributor III
  • 6357 Views
  • 6 replies
  • 5 kudos

Resolved! Is it possible to access a variable in markdown cells?

I saw a similar question in a discussion 5 years ago and back then this option was not available: https://community.databricks.com/s/question/0D53f00001HKHhNCAX/markup-in-databricks-notebook I wonder if this feature has been added. It is possible to d...
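One commonly suggested workaround (not %md itself) is to render the value from a code cell with displayHTML. A minimal sketch, assuming a Databricks Scala notebook where displayHTML is available; the variable name is hypothetical:

// Hypothetical variable to surface in the rendered output.
val rowCount = 42
// displayHTML renders the string as HTML, so the variable is interpolated at run time.
displayHTML(s"<h3>Rows processed: $rowCount</h3>")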

Latest Reply
bdc
New Contributor III
  • 5 kudos

When I click on the feedback link under Workspace/Help, it opens a "page not found" error.

5 More Replies
mdavidallen
by New Contributor II
  • 1638 Views
  • 4 replies
  • 2 kudos

Resolved! How to transfer ownership of a Databricks cloud standard account?

My email address is the owner of an account in a particular standard plan tenancy. I would like to transfer ownership to another user so they can change billing details, and take admin access going forward. How can this be accomplished?

Latest Reply
Prabakar
Esteemed Contributor III
  • 2 kudos

Hi @David Allen, to transfer account owner rights, contact your Databricks account representative. This applies to both legacy and E2 accounts. https://docs.databricks.com/administration-guide/account-settings/account-console.html#access-the-ac...

3 More Replies
Chris_Shehu
by Valued Contributor III
  • 1410 Views
  • 5 replies
  • 3 kudos
Latest Reply
Anonymous
Not applicable
  • 3 kudos

You may have noticed that the local SQL endpoint is not listed in the options for getting started with APEX. The local SQL endpoint is an extremely useful feature for getting ADO.NET web services started. I say check this uk-dissertation.com review f...

4 More Replies
Confused
by New Contributor III
  • 2194 Views
  • 6 replies
  • 1 kudos

Hi guys, is there any documentation on where the /databricks-datasets/ mount is actually served from? We are looking at locking down where our workspace...

Hi guys, is there any documentation on where the /databricks-datasets/ mount is actually served from? We are looking at locking down where our workspace can reach out to via the internet, and as it currently stands we are unable to reach this. I did look ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hello Mat, Thanks for letting us know. Would you be happy to mark your answer as best if that will solve the problem for others? That way, members will be able to find the solution more easily.

5 More Replies
MadelynM
by New Contributor III
  • 1826 Views
  • 2 replies
  • 1 kudos

2021-08-Best-Practices-for-Your-Data-Architecture-v3-OG-1200x628

Thanks to everyone who joined the Best Practices for Your Data Architecture session on Getting Workloads to Production using CI/CD. You can access the on-demand session recording here, and the code in the Databricks Labs CI/CD Templates Repo. Posted ...

Latest Reply
MadelynM
New Contributor III
  • 1 kudos

Here's the embedded links list!
Jobs scheduling and orchestration
Built-in job scheduling: https://docs.databricks.com/jobs.html#schedule-a-job (periodic scheduling of the jobs; execute notebook / jar / Python script / spark-submit)
Multitask Jobs: execute no...

1 More Replies
Kaniz
by Community Manager
  • 1000 Views
  • 1 replies
  • 0 kudos

Error message: RuntimeError: Unable to read the token from "[REDACTED]"

When trying to run: databricks configure --host https://adb-xxxxxx.10.azuredatabricks.net/ -f {dbutils.secrets.get("scopre", "secret")}
Error message: RuntimeError: Unable to read the token from "[REDACTED]"

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Is this an intermittent issue, or does it happen all the time? Do you get the same error when you try to access another host?

raymund
by New Contributor III
  • 2049 Views
  • 7 replies
  • 5 kudos

Resolved! Why does adding the package 'org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1' fail in runtime 9.1.x-scala2.12 but succeed in runtime 8.2.x-scala2.12?

Using Databricks spark submit job, setting new cluster:
1] "spark_version": "8.2.x-scala2.12" => OK, works fine
2] "spark_version": "9.1.x-scala2.12" => FAIL, with errors:
Exception in thread "main" java.lang.ExceptionInInitializerError at com.databricks...

Latest Reply
raymund
New Contributor III
  • 5 kudos

This has been resolved by adding the following spark_conf (not through --conf):
"spark.hadoop.fs.file.impl": "org.apache.hadoop.fs.LocalFileSystem"
Example:
------
"new_cluster": { "spark_version": "9.1.x-scala2.12", ... "spark_conf": { "spar...

6 More Replies
antoooks
by New Contributor III
  • 1532 Views
  • 3 replies
  • 5 kudos

Resolved! display() function always returns connection refused on tunneling despite successfully retrieving the schema

Hi everyone, I am using SSH tunnelling with SSHTunnelForwarder to reach a target AWS RDS PostgreSQL database. The connection got through; however, when I try to display the retrieved data frame it always throws a "connection refused" error. Please see ...

Latest Reply
jose_gonzalez
Moderator
  • 5 kudos

Hi @Kurnianto Trilaksono Sutjipto, this seems like a connectivity issue with the URL you are trying to connect to. It fails during the display() command because read is a lazy transformation and it will not be executed right away. On the other hand,...
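To illustrate the lazy-evaluation point, a Scala sketch (the thread itself uses SSHTunnelForwarder from Python; here the tunnel is simply assumed to be forwarding localhost:5433 to the RDS instance, and the connection details and secret names are hypothetical):

// Declaring the read is lazy: little or no data flows yet (at most a schema lookup on the driver).
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost:5433/mydb")  // hypothetical tunnelled endpoint
  .option("dbtable", "public.my_table")
  .option("user", "db_user")
  .option("password", dbutils.secrets.get("my-scope", "db-password"))  // hypothetical secret
  .load()

// The action is where connections are actually opened to pull rows, so the tunnel must be
// reachable at this point; this is where a "connection refused" error would surface.
df.count()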

2 More Replies
CAN
by New Contributor
  • 421 Views
  • 0 replies
  • 0 kudos

Security Threats in Databricks for File Upload

Dear community, we are using the Azure Databricks service and wondering if uploading a file to the DBFS (or to a storage accessed directly from a notebook in Databricks) could be a potential security threat. Imagine you upload some files with 'malici...

Leszek
by Contributor
  • 1882 Views
  • 5 replies
  • 11 kudos

Resolved! Runtime SQL Configuration - how to make it simple

Hi, I'm running a couple of notebooks in my pipeline and I would like to set a fixed value of 'spark.sql.shuffle.partitions', the same value for every notebook. Should I do that by adding spark.conf.set... code in each notebook (runtime SQL configurations ar...

Latest Reply
Leszek
Contributor
  • 11 kudos

Hi, thank you all for the tips. I tried to set this option in the Spark config before, but it didn't work for some reason. Today I tried again and it's working :).
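A small sketch of the two approaches discussed in the thread, assuming a notebook attached to the cluster (200 is just an example value):

// Per-notebook option: set it at runtime; it is a runtime SQL configuration,
// so it applies to the current Spark session.
spark.conf.set("spark.sql.shuffle.partitions", "200")

// Cluster-level alternative (the one that ended up working for the poster): a single
// line in the cluster's Spark config UI, not Scala code:
//   spark.sql.shuffle.partitions 200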

4 More Replies