Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Naeem_K
by New Contributor III
  • 3256 Views
  • 5 replies
  • 1 kudos

Resolved! Data Engineer Certificate and badge not received

I cleared the certification exam on 26th January 2023 with 91.11%, but still haven't received the certificate. I took the exam with a different mail ID, but I'm not receiving any emails from Databricks at that mail ID. I've also raised a tic...

Latest Reply
Nadia1
Databricks Employee
  • 1 kudos

Hello team, I have answered via Salesforce ticket, sent Naeem their sign-on link, and resent the badge/certificate email. Thanks!

4 More Replies
Rahul_Samant
by Contributor
  • 8352 Views
  • 8 replies
  • 1 kudos

Mounting File Share in init script of cluster

We have a flow where we have to process a chunk of files from a file share. Currently we move the files to a storage account first and then, after processing, move the files back to the file share again. This is adding to the execution time for moving files bac...

Latest Reply
Samirshaikh
New Contributor II
  • 1 kudos

Hi @Rahul Samant, is this issue solved? Please help, we are also facing the same issue.

7 More Replies
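One way to avoid the copy step discussed above is to mount the Azure File Share over SMB on cluster start via an init script, so files can be processed in place. Below is a hedged sketch that generates such a script from a notebook; the storage account, share name, mount point, and the environment variable carrying the key are all hypothetical placeholders, and installing the script requires `dbutils`, which exists only on Databricks:

```python
def build_init_script(account: str, share: str, key_env: str = "FILE_SHARE_KEY") -> str:
    """Return the shell body of an init script that cifs-mounts an Azure File Share.

    `account`, `share`, and `key_env` are hypothetical placeholders; the
    share key is expected in an environment variable (e.g. set via a
    cluster spark_env_vars entry backed by a secret) rather than hardcoded.
    """
    return "\n".join([
        "#!/bin/bash",
        "mkdir -p /mnt/fileshare",
        (f"mount -t cifs //{account}.file.core.windows.net/{share} /mnt/fileshare "
         f"-o vers=3.0,username={account},password=${key_env},serverino"),
    ])

def install_init_script(dbutils, account: str, share: str) -> None:
    # Runs only in a Databricks notebook, where `dbutils` is predefined;
    # the script path is a hypothetical choice.
    dbutils.fs.put("dbfs:/init/mount_fileshare.sh",
                   build_init_script(account, share), overwrite=True)
```

The mount then happens on every node before Spark starts, so notebook code can read `/mnt/fileshare` directly instead of round-tripping files through a storage account.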
GC-James
by Contributor II
  • 6431 Views
  • 6 replies
  • 10 kudos

Disable dbutils suggestion

I would like to turn off or suppress this message, which is returned from the dbutils library: %r files <- dbutils.fs.ls("/dbfs/tmp/") For prettier results from dbutils.fs.ls(<dir>), please use `%fs ls <dir>` How can I do this?

Latest Reply
Vidula
Honored Contributor
  • 10 kudos

Hi @James Smith, hope all is well! Just wanted to check in if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...

5 More Replies
desert_safari
by New Contributor II
  • 3584 Views
  • 2 replies
  • 0 kudos

Bugs with text query parameters?

Hi all, I have a query with a handful of text query parameters that I need to use to insert data into a table from a dashboard, e.g. INSERT INTO user_data (first_name, middle_name, last_name, city, country, zip_code) VALUES ('{{first_name}}', '{{middle_name}}...

Latest Reply
desert_safari
New Contributor II
  • 0 kudos

Sorry, my client wants all 3 fields; the middle name has to be there. There are a few cases where people have just one name.

1 More Replies
KVNARK
by Honored Contributor II
  • 4285 Views
  • 1 replies
  • 6 kudos

Resolved! Grant access permissions for a specific container, and for a specific folder within a container, in Azure Blob Storage

Hi, regarding permissions for Azure Storage: we have created the storage account (Blob Storage), and within the account we are going to create many containers, each container holding multiple folders and files. We want to grant permis...

Latest Reply
Ajay-Pandey
Databricks MVP
  • 6 kudos

Hi @KVNARK, you can use a service principal in Azure Active Directory to grant specific access to that app, and then use the app's credentials to create a new mount point. That will let you give users access to specific storage only.

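The reply's suggestion can be sketched as follows: register an app (service principal) in Azure Active Directory, grant it a role on the target container, and mount with its credentials. This is a hedged sketch, not a definitive recipe; all IDs, names, and the secret handling are hypothetical placeholders, and `dbutils` exists only in Databricks notebooks:

```python
def oauth_configs(client_id: str, client_secret: str, tenant_id: str) -> dict:
    """Build the OAuth extra_configs for an ABFSS mount using a service principal.

    The three arguments come from the AAD app registration; in practice the
    secret would be fetched via dbutils.secrets.get rather than passed inline.
    """
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": client_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

def mount_container(dbutils, container: str, account: str, configs: dict) -> None:
    # Runs only on Databricks; the mount point naming is a hypothetical choice.
    dbutils.fs.mount(
        source=f"abfss://{container}@{account}.dfs.core.windows.net/",
        mount_point=f"/mnt/{container}",
        extra_configs=configs,
    )
```

Because the mount carries the service principal's identity, the data it exposes is limited to whatever role assignment (e.g. on one container) that principal was granted.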
Databricks_-Dat
by New Contributor II
  • 3454 Views
  • 2 replies
  • 4 kudos

What is the supported mssql connector for Databricks Runtime 11.3 LTS, Scala 2.12, Spark 3.3.0?

We were using the mssql connector com.microsoft.azure:spark-mssql-connector_2.12_3.0:1.0.0-alpha with the 10.3 LTS DBR. We need to upgrade to a higher DBR version to make use of new functions like unpivot/melt in the notebooks. -com.microsoft.azure:spark...

Latest Reply
ranged_coop
Valued Contributor II
  • 4 kudos

Is the Spark 3.3 series even supported by the connector yet? As per the [github link](https://github.com/microsoft/sql-spark-connector#current-releases) - assuming this is the library you are trying to use - the latest Spark 2.4.x compatible connector...

1 More Replies
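For reference, the connector in question is selected by passing its format name to spark.read rather than by a special API, so upgrading mostly means swapping the installed JAR. A hedged sketch, with hypothetical server, database, and table names; whether a given connector release supports Spark 3.3 / DBR 11.3 LTS should be verified against the GitHub releases page linked in the reply:

```python
def mssql_url(host: str, database: str) -> str:
    """Build a standard SQL Server JDBC URL (default port 1433)."""
    return f"jdbc:sqlserver://{host}:1433;databaseName={database}"

def read_mssql(spark, host: str, database: str, table: str,
               user: str, password: str):
    # Runs only on a cluster with the sql-spark-connector JAR installed;
    # the format name is how the connector registers itself with Spark.
    return (spark.read
            .format("com.microsoft.sqlserver.jdbc.spark")
            .option("url", mssql_url(host, database))
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .load())
```

If the connector lags behind the Spark version you need, the plain built-in `jdbc` format with Microsoft's JDBC driver is the usual fallback, at the cost of the connector's bulk-insert optimizations.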
chanansh
by Contributor
  • 2034 Views
  • 1 replies
  • 0 kudos

Running stateful Spark streaming example fails: https://www.databricks.com/blog/2022/10/18/python-arbitrary-stateful-processing-structured-streaming.html

ERROR:py4j.clientserver:There was an exception while executing the Python Proxy on the Python Side. Traceback (most recent call last): File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 617, in _call_proxy retu...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, the error looks like the failure comes from the Python side of the py4j bridge. Could you please provide the whole snippet of the error?

StevenW
by New Contributor III
  • 5101 Views
  • 4 replies
  • 4 kudos

Resolved! Manipulating Data - using Notebooks

I need to read/query table A, manipulate/modify the data, and insert the new data into table A again. I considered using: Cur_Actual = spark.sql("Select * from Table A") currAct_Rows = Cur_Actual.rdd.collect() for row in currAct_Rows: do_somthing(row)...

Latest Reply
Manoj12421
Valued Contributor II
  • 4 kudos

You can use withColumn() for the transformations and then write the data; the write mode can be append, overwrite, or merge.

3 More Replies
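The withColumn-then-write pattern from the reply can be sketched as below, replacing the rdd.collect() loop (which pulls every row to the driver) with a column expression that runs distributed. The table and column names (`table_a`, `amount`) and the transformation itself are hypothetical, and the function assumes a SparkSession `spark` such as the one predefined in Databricks notebooks:

```python
def update_table(spark, table: str = "table_a") -> None:
    """Read a table, transform a column, and write the result back."""
    # Import deferred so this sketch loads even without pyspark installed;
    # on a Databricks cluster pyspark is always available.
    from pyspark.sql import functions as F

    df = spark.table(table)
    # Express the per-row change as a column expression instead of
    # collecting rows and looping in Python (hypothetical: +10%).
    df = df.withColumn("amount", F.col("amount") * 1.1)
    # Write back; mode can also be "append", or use MERGE for upserts.
    # With Delta tables, overwriting the table just read works because
    # the read is pinned to a snapshot version.
    df.write.mode("overwrite").saveAsTable(table)
```

Keeping the logic in column expressions lets Spark plan and parallelize the update, which the collect-and-loop approach in the question prevents.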
Anonymous
by Not applicable
  • 2648 Views
  • 2 replies
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi Debayan, thank you for answering my question. We were able to successfully attach the 2nd workspace (on a different AWS account) to the existing UC metastore on another AWS account. Unfortunately, we couldn't figure out what we did differently. It se...

1 More Replies
Shanthala
by New Contributor III
  • 4256 Views
  • 4 replies
  • 2 kudos

Resolved! Can employees of a partner sign up directly to the Databricks Academy?

Can employees of a partner sign up directly to the Databricks Academy without signing up for the Partner Portal - https://partners.databricks.com/ ?

Latest Reply
Shanthala
New Contributor III
  • 2 kudos

Thank you all! Got it. We started a cohort last week to study the Databricks Engineering path by registering directly in Partner Academy!

3 More Replies
Anonymous
by Not applicable
  • 3233 Views
  • 2 replies
  • 0 kudos

Safari errors in loading

Safari shows a white page; the javascript console reports some errors: SyntaxError: Invalid regular expression: invalid group specifier name. Unhandled Promise Rejection: ChunkLoadError: Loading chunk 6956 failed. (missing: https://dd-databricks-live.cloud....

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please make sure Safari is updated? Also, please try to clear the caches and history and restart the browser. Also, what were you trying to open when this error surfaced?

1 More Replies
Joao_DE
by New Contributor III
  • 3512 Views
  • 4 replies
  • 0 kudos

JDBC connection

Hi everyone! I have a question. For a project I need to establish a JDBC connection using spark.read. My question is: when is the connection closed? I will be reading multiple tables from that database, so if I could just create a conn...

Latest Reply
Joao_DE
New Contributor III
  • 0 kudos

Hi Vidula! I haven't figured out a solution yet, so any help would be appreciated. Thank you!

3 More Replies
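On the connection-lifetime question: spark.read with the jdbc format never hands you a connection object to manage; each action opens (and closes) JDBC connections on the executors as needed. What you can share across tables is the option set. A hedged sketch, with a hypothetical Postgres host, database, and credentials:

```python
def jdbc_options(host: str, database: str, user: str, password: str) -> dict:
    """Build a reusable JDBC option dict (hypothetical Postgres example)."""
    return {
        "url": f"jdbc:postgresql://{host}:5432/{database}",
        "user": user,
        "password": password,
        "driver": "org.postgresql.Driver",
    }

def read_table(spark, options: dict, table: str):
    # Each .load() defines a new read; the actual connections are opened
    # by Spark when an action runs and released afterwards, so "reusing a
    # connection" across tables is not something the caller controls.
    return spark.read.format("jdbc").options(**options, dbtable=table).load()
```

Usage would be one `opts = jdbc_options(...)` followed by `read_table(spark, opts, t)` per table; the per-read connection churn is normally negligible next to the query itself.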
