I cannot create a workspace; please help.
The buttons are greyed out and I cannot click them, and when I hover my cursor over them no information appears. What should I do?
So I was wondering, who uses package cells in Scala? We have a library (jar) with some useful functions we use all over the place, but that's about it. So I think we could do the same thing without a jar, using package cells. But I never hear ...
Hi @Werner Stinckens, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...
This is specific to creating new jobs; I understand that various permissions can be set on existing jobs using job access control. This seems to suggest no, and I can't find anything in the Databricks docs either.
Nope. I looked for that too, but it does not seem to be possible. Perhaps with Unity Catalog, as there you have more permission controls. But adopting Unity Catalog is not an overnight decision.
Hi all, I have a quick question. I know both the query and dashboard pages in Databricks SQL have a refresh button. But one question: when I'm on the dashboard page and click the refresh button, does this also force every related quer...
Thanks for all your support. It's totally clear to me now!
The date field is getting changed while reading data from a source .xls file into the dataframe. In the source file all columns are strings, but I am not sure why the date column alone behaves differently. In the source file the date is 1/24/1947. In the PySpark datafram...
How about using inferSchema a single time to create a correct DF, then creating a schema from the DF's schema? Something like this, for example:

from pyspark.sql.types import StructType

# Save schema from the original DataFrame into json:
schema_json = df.s...
I have a Multi-Task Job that runs a number of PySpark notebooks, and about 30-60% of the time my jobs fail with the following error. I haven't seen any consistency with this error; I've had as many as all of the tasks in the job giving this error...
Hi. Did you ever get a resolution to this problem other than rolling back to 10.4? I have recently moved some workloads over to runtime 11.3 and am experiencing intermittent "Repl did not start in 30 seconds" errors. I have increased the repl timeout...
How do I convert the query below to Spark SQL, especially the ISNULL replacement?

SELECT ID, ISNULL(NAME, "N/A") AS NAME, COMPANY
FROM TEST
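For reference, T-SQL's ISNULL has direct equivalents in Spark SQL; a sketch of the rewritten query:

```sql
-- ISNULL(expr, replacement) maps to COALESCE (also IFNULL or NVL)
-- in Spark SQL; string literals use single quotes.
SELECT ID,
       COALESCE(NAME, 'N/A') AS NAME,
       COMPANY
FROM TEST
```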
In a Databricks cluster with Scala 2.1.1 I am trying to read a file into a Spark data frame using the following code:

val df = spark.read
  .format("com.springml.spark.sftp")
  .option("host", "*")
  .option("username", "*")
  .option("password", "*")
  ...
Hi @Andreas P, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
Can we use Databricks, or write code in Databricks, without learning PySpark in depth for ETL and data engineering purposes? Can someone shed some light on this? I am currently learning PySpark (basics of Python for handling data) a...
Thanks All for your valuable suggestions!
Hello, I have just bootstrapped a new Databricks EC2 deployment on an AWS account with Terraform. The prior dependencies seem OK on my side (network, root storage, credentials configuration). I'm referring mainly to this guide and of course the pages related to each Databrick...
I have a Databricks notebook which reads a stream from Azure Event Hub. My code does the following:
1. Configure the path for Event Hubs
2. Read the stream

df_read_stream = (spark.readStream
    .format("eventhubs")
    .options(**conf)...
I am also facing the same issue, using cluster 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12) and library com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.21. Please help me with the same.

conf = {}
conf["eventhubs.connectionString"] = "Endpoint=sb://xxxx.ser...
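One thing worth checking with the azure-eventhubs-spark connector: recent 2.3.x releases expect the connection string to be passed through EventHubsUtils.encrypt rather than in plain text, and passing the raw string is a common cause of failures here. A sketch only, assuming a Databricks notebook where `sc` and `spark` are predefined and the connector jar is attached (the namespace is a placeholder):

```python
# Assumption: running inside a Databricks notebook with the
# com.microsoft.azure:azure-eventhubs-spark_2.12 connector attached,
# so `sc` (SparkContext) and `spark` (SparkSession) already exist.
connection_string = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."  # placeholder

conf = {
    # Encrypt the connection string before handing it to the connector.
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string),
}

df_read_stream = (spark.readStream
    .format("eventhubs")
    .options(**conf)
    .load())
```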
How can we connect Databricks to MongoDB? Any code reference would help.
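A minimal PySpark sketch, assuming the MongoDB Spark connector (v10 or later) is installed on the cluster; the URI, database, and collection names are placeholders:

```python
from pyspark.sql import SparkSession

# Placeholder connection details -- substitute your own deployment.
MONGO_URI = "mongodb+srv://<user>:<password>@<host>/<db>"

spark = SparkSession.builder.getOrCreate()

# With connector v10+ the short format name is "mongodb"; older 3.x
# releases of the connector register the source as "mongo" instead.
df = (spark.read
    .format("mongodb")
    .option("spark.mongodb.read.connection.uri", MONGO_URI)
    .option("database", "my_db")        # hypothetical names
    .option("collection", "my_coll")
    .load())
```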