Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

marcus1
by New Contributor III
  • 2438 Views
  • 2 replies
  • 1 kudos

Running jobs as a non-job owner

We have enabled Cluster, Pool and Job access control, and non-job-owners cannot run a job even though they are administrators. This also prevents those users from creating cluster resources. When a non-owner of a job attempts to run it, they get a permission-denied error. My un...
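One common way to let a non-owner trigger runs is to grant them the `CAN_MANAGE_RUN` level on the job via the Databricks Permissions API. The sketch below is a minimal, hedged illustration: the workspace host, token, job id, and user email are all hypothetical placeholders, and the request-sending helper is defined but not executed here.

```python
import json
import urllib.request


def build_run_grant(user_name: str, level: str = "CAN_MANAGE_RUN") -> dict:
    """Build a Permissions API payload granting run rights on a job."""
    return {
        "access_control_list": [
            {"user_name": user_name, "permission_level": level}
        ]
    }


def grant_job_run(host: str, token: str, job_id: int, user_name: str) -> None:
    """PATCH the job's ACL so a non-owner can trigger runs.

    Not called in this sketch -- it needs a live workspace and a valid token.
    """
    payload = json.dumps(build_run_grant(user_name)).encode()
    req = urllib.request.Request(
        f"{host}/api/2.0/permissions/jobs/{job_id}",
        data=payload,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)


# Build (but don't send) a grant for a hypothetical colleague.
acl = build_run_grant("colleague@example.com")
```

A `PATCH` merges the new entry into the existing ACL rather than replacing it, which is usually what you want when adding one user.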

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Marcus Simonsen​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

1 More Replies
ConSpooky
by New Contributor II
  • 2570 Views
  • 3 replies
  • 4 kudos

Best practice for creating queries for data transformation?

My apologies in advance for sounding like a newbie. This is really just a curiosity question I have as an outsider observing my team clash with our client. Please ask any questions you have, and I will try my best to answer them. Currently, we are stori...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Nick Connors​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...

2 More Replies
Gerhard
by New Contributor III
  • 1916 Views
  • 0 replies
  • 1 kudos

Read proprietary files and transform contents to a table - error resilient process needed

We have data stored in HDF5 files in a "proprietary" way. This data needs to be read, converted, and transformed before it can be inserted into a Delta table. All of this transformation is done in a custom Python function that takes the HDF5 file an...
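An error-resilient pattern for this kind of ingestion is to wrap the custom parser in a try/except per file, so one corrupt file is quarantined instead of aborting the whole run. The sketch below is a plain-Python illustration of that idea; `parse_file` stands in for the poster's custom HDF5 function (the actual h5py reading is not shown), and in a real job the good rows would be appended to the Delta table while the failure records go to a quarantine table.

```python
from typing import Any, Callable, Dict, Iterable, List, Tuple


def ingest_resiliently(
    paths: Iterable[str],
    parse_file: Callable[[str], List[Dict[str, Any]]],
) -> Tuple[List[Dict[str, Any]], List[Dict[str, str]]]:
    """Run the custom parser file by file; collect failures instead of aborting."""
    rows: List[Dict[str, Any]] = []
    failures: List[Dict[str, str]] = []
    for path in paths:
        try:
            rows.extend(parse_file(path))
        except Exception as exc:  # quarantine the bad file, keep going
            failures.append({"path": path, "error": repr(exc)})
    return rows, failures


def fake_parser(path: str) -> List[Dict[str, Any]]:
    """Hypothetical stand-in for the real HDF5-parsing function."""
    if "bad" in path:
        raise ValueError("corrupt HDF5 layout")
    return [{"source": path, "value": 42}]


rows, failures = ingest_resiliently(["a.h5", "bad.h5", "c.h5"], fake_parser)
```

Recording `repr(exc)` alongside the path makes it easy to reprocess or inspect the quarantined files later.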

tassiodahora
by New Contributor III
  • 65423 Views
  • 2 replies
  • 7 kudos

Resolved! Failed to merge incompatible data types LongType and StringType

Guys, good morning! I am writing the results of a JSON into a Delta table, but the JSON structure is not always the same; if a field is absent from the JSON, it generates a type incompatibility when I append (dfbrzagend.write .format("delta") .mode("ap...

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @Tássio Santos​ The Delta table performs schema validation of every column, and the source dataframe column data types must match the column data types in the target table. If they don't match, an exception is raised. For reference: https://docs.dat...
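The usual fix is to coerce incoming JSON fields to the target table's types before appending (in PySpark you would cast each column with `df.withColumn(c, col(c).cast(target_type))`). The plain-Python sketch below shows the same idea on a single record; the schema and column names are hypothetical, not from the thread.

```python
import json

# Hypothetical target Delta schema: column name -> Python type.
TARGET_SCHEMA = {"agent_id": int, "agent_name": str, "score": float}


def coerce_record(record: dict, schema: dict) -> dict:
    """Cast incoming JSON fields to the target types; absent fields become None.

    Mirrors what a per-column cast against the target table schema does in
    Spark: a string "123" lands in a LongType column as the integer 123, and
    a missing field appends as null instead of raising a type mismatch.
    """
    out = {}
    for column, target_type in schema.items():
        value = record.get(column)
        out[column] = None if value is None else target_type(value)
    return out


# "agent_id" arrives as a string and "score" is missing entirely.
raw = json.loads('{"agent_id": "123", "agent_name": "ana"}')
clean = coerce_record(raw, TARGET_SCHEMA)
```

Casting toward the table's schema (rather than letting inference pick a type per batch) keeps every append compatible regardless of which fields a given JSON payload happens to include.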

1 More Replies
Geeta1
by Valued Contributor
  • 2362 Views
  • 1 replies
  • 8 kudos

Activate gift certificate in Databricks Community Reward store

Hello all, can anyone tell me about the "Activate Gift Certificate" option in the Databricks Community Reward store? What is its purpose, and how can we use it?

Latest Reply
yogu
Honored Contributor III
  • 8 kudos

You earn points through forum interaction. Those points can be exchanged for 'credits'. With those credits you can buy Databricks swag. Your lifetime points (the cumulative amount of points) are not affected by this.

Manjusha
by New Contributor II
  • 2831 Views
  • 1 replies
  • 1 kudos

SocketTimeout exception when running a display command on spark dataframe

I am using runtime 9.1 LTS. I have an R notebook that reads a CSV into an R dataframe, does some transformations, and finally converts it to a Spark dataframe using the createDataFrame function. After that, when I call the display function on this Spark da...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Manjusha Unnikrishnan​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Or else bricksters will get back to you soon. Thanks.

MBV3
by Contributor
  • 2318 Views
  • 1 replies
  • 2 kudos

Delete a file from GCS folder

What is the best way to delete files from a GCP bucket inside a Spark job?

Latest Reply
Unforgiven
Valued Contributor III
  • 2 kudos

@M Baig​ Yes, you just need to create a service account for Databricks and then assign the Storage Admin role on the bucket. After that you can mount GCS the standard way: bucket_name = "<bucket-name>" mount_name = "<mount-name>" dbutils.fs.mount("gs://%s" % bucket_na...
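Putting the reply's mount snippet together with the original question (deleting a file), a minimal sketch could look like the following. It assumes the service account with the Storage Admin role is already in place; the bucket, mount, and file names are hypothetical, and the function touching `dbutils` is defined but not called, since `dbutils` only exists on a Databricks cluster.

```python
def gcs_source(bucket_name: str) -> str:
    """Build the gs:// URI that dbutils.fs.mount expects."""
    return "gs://%s" % bucket_name


def mount_and_clean(bucket_name: str, mount_name: str, stale_file: str) -> None:
    """Mount the bucket, then delete one file from it.

    Runs only inside a Databricks notebook/job where `dbutils` is defined.
    """
    dbutils.fs.mount(gcs_source(bucket_name), "/mnt/%s" % mount_name)
    dbutils.fs.rm("/mnt/%s/%s" % (mount_name, stale_file))


# Pure-Python part, safe to evaluate anywhere:
uri = gcs_source("my-bucket")
```

Alternatively, once the bucket is accessible you can skip the mount and call `dbutils.fs.rm("gs://<bucket-name>/<path>")` directly against the `gs://` path.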

User16844487905
by New Contributor III
  • 5366 Views
  • 4 replies
  • 5 kudos

AWS quickstart - Cloudformation failure

When deploying your workspace with the recommended AWS quickstart method, a CloudFormation template will be launched in your AWS account. If you experience a failure with an error message along the lines of ROL...

Latest Reply
yalun
New Contributor III
  • 5 kudos

How do I launch the "Quickstart" again? Where is it in the console?

3 More Replies
SM
by New Contributor III
  • 6168 Views
  • 3 replies
  • 10 kudos

How to use Azure Data lake as a storage location to store the Delta Live Tables?

I am trying to write data into Azure Data Lake. I am reading files from Azure Blob Storage, but when I try to create the Delta Live Table in Azure Data Lake I get the following error: shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contrac...

Latest Reply
RThornton
New Contributor III
  • 10 kudos

@Kaniz Fatma​ I don't think you quite understand the question. I'm running into the same problem. When creating a Delta Live Table pipeline to write to Azure Data Lake Storage (abfss://etc...) as the Storage Location, the pipeline fails with the erro...

2 More Replies
AmarJT
by New Contributor II
  • 3257 Views
  • 2 replies
  • 6 kudos

Lakehouse Fundamentals Accreditation badge not received

Hi Team, I have successfully passed the test after completing the course, but I have not received any badge, only a certificate (Certificate ID: E-E04YDV). As mentioned in the web portals, I tried accessing "http...

Latest Reply
Geeta1
Valued Contributor
  • 6 kudos

Hi @Amarjeet Kumar​, you will receive the badge a day after completion. Even I received it a day after I cleared the exam. If you still don't receive it by the next day, you can raise a ticket at https://help.databricks.com/s/contact-us?ReqType...

1 More Replies
yogu
by Honored Contributor III
  • 2406 Views
  • 2 replies
  • 18 kudos

Resolved! Explain about "Activate Gift Certificate" section in Databricks Community Rewards.

Hello everyone, can anyone explain the "Activate Gift Certificate" section in Databricks Community Rewards, and how to use it?

Latest Reply
-werners-
Esteemed Contributor III
  • 18 kudos

It boils down to this: you earn points through forum interaction. Those points can be exchanged for 'credits'. With those credits you can buy Databricks swag. Your lifetime points (the cumulative amount of points) are not affected by this.

1 More Replies
-werners-
by Esteemed Contributor III
  • 2320 Views
  • 3 replies
  • 22 kudos

Resolved! Package cells (scala), who uses them?

So I was wondering, who uses package cells in Scala? We have this library (jar) which has some useful functions we use all over the place. But that's about it. So I think we can do the same thing without a jar but with package cells. But I never hear ...

Latest Reply
Anonymous
Not applicable
  • 22 kudos

Hi @Werner Stinckens​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...

2 More Replies
Constantino
by New Contributor III
  • 1594 Views
  • 1 replies
  • 2 kudos

Is there any way to prevent non-admin users from creating new jobs?

This is specific to creating new jobs; I understand that various permissions can be set on existing jobs using job access control. The docs seem to suggest no, and I can't find anything else in the Databricks documentation either.

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Nope. Looked for that too, but it does not seem to be possible. Perhaps with Unity Catalog, as there you have more permission controls. But adopting Unity is not an overnight decision.

Anonymous
by Not applicable
  • 3093 Views
  • 3 replies
  • 28 kudos

Resolved! Refresh Dashboard also make all related queried refresh?

Hi all, I have a quick question. I know both the query and dashboard pages in Databricks SQL have a refresh button. One question, though: when I'm on the Dashboard page and click the refresh button, does this also force every related quer...

Latest Reply
Anonymous
Not applicable
  • 28 kudos

Thanks for all your support. It's totally clear to me now!

2 More Replies
