Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

Tripalink
by New Contributor III
  • 4845 Views
  • 4 replies
  • 1 kudos

Error. Another git operation is in progress.

I am getting an error every time I try to view another branch or create a branch. This has happened occasionally in the past, but it usually fixed itself after about 10-30 minutes. This time the error has persisted for over 12 hours, so I am now concerned...

Latest Reply
Hakuna_Madata
New Contributor II
  • 1 kudos

I had the same problem and was able to resolve it by recreating the repo with a trailing ".git" on the Git repository URL. For example, use this: https://gitlab.mycompany.com/my-project/my-repo.git, not this: https://gitlab.mycompany.com/my-project/my-repo
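For anyone recreating the repo programmatically, a minimal sketch of the same fix via the Repos REST API, with the trailing ".git" included; the workspace URL, token, and /Repos path below are placeholders:

import requests

# Hypothetical workspace URL and personal access token; substitute your own.
HOST = "https://my-workspace.cloud.databricks.com"
TOKEN = "dapi..."

# Recreate the repo with a trailing ".git" on the URL, as suggested above.
resp = requests.post(
    f"{HOST}/api/2.0/repos",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "url": "https://gitlab.mycompany.com/my-project/my-repo.git",
        "provider": "gitLab",
        "path": "/Repos/me@mycompany.com/my-repo",
    },
)
resp.raise_for_status()
print(resp.json())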

3 More Replies
jfarmer
by New Contributor II
  • 5097 Views
  • 3 replies
  • 1 kudos

PermissionError / Operation not Permitted with Files-in-Repos

I've been running a notebook using Files in Repos. Previously this worked fine. I'm unsure what's changed (I was testing integration with DCS on older runtimes, but I don't think I made any persistent changes), but now it's throwing an error (always...
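Since the thread has no resolution yet, here is a minimal sketch of the usual Files in Repos access pattern for context; the relative path and file name are hypothetical, and the open() call is where the reported error would surface:

import os

# In a Repos notebook the current working directory is inside the repo,
# so files committed to the repo can be opened with relative paths.
print(os.getcwd())

# Hypothetical data file checked into the repo; this is the kind of call
# that raises PermissionError / "Operation not permitted" when it fails.
with open("data/sample.csv") as f:
    print(f.readline())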

Latest Reply
_carleto_
New Contributor II
  • 1 kudos

Hi @jfarmer, did you solve this issue? I'm having exactly the same challenge. Thanks!

2 More Replies
shelly
by New Contributor
  • 2229 Views
  • 3 replies
  • 0 kudos

take() operation throwing index out of range error

x = [1, 2, 3, 4, 5, 6, 7]
rdd = sc.parallelize(x)
print(rdd.take(2))

Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/serializers.py", line 458, in dumps
    return cloudpickle.dumps(obj, pickle_protocol)
           ^^^^^^^^^^^^^^^^^^...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Shelly Bhardwaj​ Hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
shelly
by New Contributor
  • 1185 Views
  • 2 replies
  • 0 kudos

take() operation is throwing an error

Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/serializers.py", line 458, in dumps
    return cloudpickle.dumps(obj, pickle_protocol)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/spark/python/pyspa...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Shelly Bhardwaj​: The error message you provided seems to be incomplete, as it only shows the traceback of a serialization error. Can you provide the full error message or describe the issue in more detail? Regarding the code you provided, it looks c...
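For reference, the posted snippet is valid PySpark; when cloudpickle.dumps fails inside the serializer like this, the cause is usually environmental (for example, a Python version mismatch between driver and executors) rather than the code itself. A minimal sketch assuming a plain local session:

from pyspark.sql import SparkSession

# Start a local session; on Databricks, spark and sc already exist.
spark = SparkSession.builder.master("local[*]").getOrCreate()
sc = spark.sparkContext

x = [1, 2, 3, 4, 5, 6, 7]
rdd = sc.parallelize(x)
print(rdd.take(2))  # [1, 2] on a healthy environment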

1 More Replies
Joao_DE
by New Contributor III
  • 3736 Views
  • 2 replies
  • 0 kudos

GRANT PRIVILEGES or REVOKE not working in databricks: Operation not allowed

Hi everyone! I am having a problem! I can grant or revoke privileges for users using the UI on Databricks, but when I try to do that using SQL commands such as GRANT SELECT ON SCHEMA [... ] TO [USER]; I get an error stating Operation not allowed. I am...

Latest Reply
Joao_DE
New Contributor III
  • 0 kudos

It's solved; the problem was that I was using '' instead of ``.
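A minimal sketch of the distinction, run from a notebook (the schema and user names are placeholders):

# Backticks quote identifiers; single quotes create string literals,
# which the GRANT statement rejects with "Operation not allowed".
spark.sql("GRANT SELECT ON SCHEMA `my_schema` TO `user@example.com`")

# Fails: 'my_schema' is a string literal, not an identifier.
# spark.sql("GRANT SELECT ON SCHEMA 'my_schema' TO 'user@example.com'")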

1 More Replies
gauthamchettiar
by New Contributor II
  • 1604 Views
  • 0 replies
  • 1 kudos

Spark always performing broadcasts irrespective of spark.sql.autoBroadcastJoinThreshold during a streaming merge operation with DeltaTable.

I am trying to do a streaming merge between Delta tables using this guide - https://docs.delta.io/latest/delta-update.html#upsert-from-streaming-queries-using-foreachbatch. Our code sample (Java): Dataset<Row> sourceDf = sparkSession ...
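For context, a minimal Python sketch of the foreachBatch merge pattern from the linked guide, including the broadcast threshold setting the post says is being ignored; the table names and join key are hypothetical:

from delta.tables import DeltaTable

# The setting the post says is being ignored during the merge.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")

target = DeltaTable.forName(spark, "target_table")  # hypothetical target

def upsert_to_delta(micro_batch_df, batch_id):
    # Standard upsert from the Delta docs: merge each micro-batch into the target.
    (target.alias("t")
           .merge(micro_batch_df.alias("s"), "t.id = s.id")  # hypothetical key
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

(spark.readStream.table("source_table")  # hypothetical streaming source
      .writeStream
      .foreachBatch(upsert_to_delta)
      .outputMode("update")
      .start())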

al_joe
by Contributor
  • 735 Views
  • 0 replies
  • 3 kudos

Why is this simple numerical operation not precise?

I was experimenting with a beginner tutorial and saw this strange output... Why is this so? And why is the behavior not consistent for ALL rows updated by the same statement? 8.8 - 1 = 7.800000000000001. See screenshot...
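The short answer is IEEE 754 binary floating point: 8.8 has no exact binary representation, so DOUBLE arithmetic exposes the tiny representation error; rows whose values happen to round-trip exactly display cleanly, which is why the behavior looks inconsistent. A quick sketch, with DECIMAL as the usual fix for exact results:

# 8.8 is not exactly representable as a binary double, so the
# representation error surfaces after arithmetic.
print(8.8 - 1)  # 7.800000000000001

from decimal import Decimal
print(Decimal("8.8") - 1)  # 7.8, exact decimal arithmetic

# The Spark SQL equivalent is a DECIMAL cast:
spark.sql("SELECT CAST(8.8 AS DECIMAL(10, 2)) - 1 AS result").show()  # 7.80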

SailajaB
by Valued Contributor III
  • 2877 Views
  • 2 replies
  • 5 kudos

An error occurred while calling o303.mount: Operation failed: "This request is not authorized to perform this operation

Hi Team, we are unable to mount a storage container in the below scenario: we created an ADLS Gen2 account using a VNet and added firewall restrictions (i.e., allow trusted sources), and deployed the Databricks workspace without VNet injection. Is it possible to add Databricks pub...
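For reference, the failing call is the standard ADLS Gen2 OAuth mount; when the storage firewall admits only trusted or VNet traffic, a workspace deployed without VNet injection is refused with the error in the title. A sketch with placeholder names and a hypothetical secret scope:

# Placeholder service principal credentials in a hypothetical secret scope.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="my-scope", key="sp-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Fails with "This request is not authorized to perform this operation"
# when the storage firewall blocks the workspace's traffic.
dbutils.fs.mount(
    source="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/",
    mount_point="/mnt/mydata",
    extra_configs=configs,
)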

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hey @Sailaja B​ Hope everything is great! Does Hubert's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Thanks!

1 More Replies
matt_t
by New Contributor
  • 3003 Views
  • 3 replies
  • 1 kudos

Resolved! S3 sync from bucket to a mounted bucket causing a "[Errno 95] Operation not supported" error for some but not all files

Trying to sync one folder from an external S3 bucket to a folder on a mounted S3 bucket, running some simple code on Databricks to accomplish this. The data is a bunch of CSVs and PSVs. The only problem is that some of the files are giving this error that t...
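A sketch of the kind of copy being described, using dbutils.fs instead of local file APIs (bucket and mount names are placeholders); going through the DBFS layer sidesteps POSIX-only calls that object-store mounts often reject:

# Copy a folder from an external bucket into a mounted bucket.
# dbutils.fs works at the DBFS level, avoiding local-filesystem
# operations (e.g. permission changes) that mounts may not support.
dbutils.fs.cp(
    "s3a://external-bucket/source-folder/",  # hypothetical source
    "/mnt/target-bucket/dest-folder/",       # hypothetical destination
    recurse=True,
)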

Latest Reply
Atanu
Esteemed Contributor
  • 1 kudos

@Matthew Tribby​ does the above suggestion work? Please let us know if you need further help on this. Thanks.

2 More Replies