Data Engineering
Forum Posts

Muskan
by New Contributor III
  • 2653 Views
  • 9 replies
  • 0 kudos

Unable to launch notebook

I have created a 12.2 LTS cluster and am trying to launch a notebook attached to it. The notebook does not launch; there is no error, it just keeps showing the same home page.

Latest Reply
Anonymous
Not applicable

Hi @Muskan Bansal, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

8 More Replies
kunaldeb
by New Contributor III
  • 2190 Views
  • 3 replies
  • 1 kudos

Databricks cluster creation error

Hi Databricks Community, my Databricks workspace was created on an Azure pay-as-you-go subscription. I am facing two challenges. First: I am not able to create a Delta Live Tables pipeline or any other all-purpose multi-node cluster, as it is throwing the below e...

Latest Reply
kunaldeb
New Contributor III

Hi all, thanks for your replies. Just to update you: I am now able to create a DLT pipeline as well as an all-purpose multi-node cluster with minimum resources. The issue was due to a quota limit, which I was able to increase. One observation, though: if I try to us...

2 More Replies
Kit
by New Contributor III
  • 1397 Views
  • 6 replies
  • 1 kudos

Resolved! How can I get the list of downstream dashboards of a query

I want to deprecate a query from our workspace. However, I don't know whether any downstream dashboard needs it. If there is a downstream dependency, I can't just delete the query. I tried searching by query name, but it only returns the ...

Latest Reply
Kit
New Contributor III

Databricks support suggested that I check the query history to find out any history of the query, and then check which dashboard is using it. Not an ideal solution, but it works for me.

5 More Replies
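
For reference, a rough sketch of the workaround described in the reply above: scan recent query history for executions of the query before deleting it. The REST endpoint and response fields are assumptions based on the Databricks SQL Query History API, and the host, token, and query snippet are placeholders; verify them against current documentation.

    import requests

    HOST = "https://<workspace-url>"                 # placeholder
    TOKEN = "<personal-access-token>"                # placeholder
    TARGET_SNIPPET = "FROM analytics.daily_orders"   # hypothetical text that identifies the query

    # List recent query executions (assumed endpoint: SQL Query History API).
    resp = requests.get(
        f"{HOST}/api/2.0/sql/history/queries",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"max_results": 100},
    )
    resp.raise_for_status()

    # If the query text still shows up in history, something downstream is running it.
    for q in resp.json().get("res", []):
        if TARGET_SNIPPET in q.get("query_text", ""):
            print(q.get("user_name"), q.get("status"))
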
Nick_Hughes
by New Contributor III
  • 2052 Views
  • 3 replies
  • 2 kudos

Authorised views in Databricks?

In GCP you can give a user access to a view, and give the view itself access to the underlying object, meaning you don't have to give end users access to the tables themselves. Is there a similar way of managing these permissions in Databricks? The vi...

Latest Reply
Kaniz
Community Manager

Hi @Nick Hughes, yes, in Databricks you can also manage permissions at the database and table/view level to grant or revoke access to users or groups. You can create a view in Databricks using the CREATE VIEW command and then grant appropriate permi...

2 More Replies
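
For reference, a minimal sketch of the view-based permission pattern described in the reply above, assuming a workspace with table access control or Unity Catalog and a notebook where spark is predefined; the schema, view, and group names are placeholders, and the exact GRANT securable keyword (TABLE vs VIEW) depends on which permission model you use.

    # Create a view over the restricted table; end users only ever query the view.
    spark.sql("""
        CREATE VIEW IF NOT EXISTS reporting.customer_summary AS
        SELECT customer_id, region, total_spend
        FROM raw.customers    -- underlying table stays locked down
    """)

    # Grant SELECT on the view only; users never receive access to raw.customers.
    spark.sql("GRANT SELECT ON VIEW reporting.customer_summary TO `data-analysts`")
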
Antoine1
by New Contributor III
  • 2561 Views
  • 9 replies
  • 5 kudos

Seem to have an outdated Account Console

Hello, we have been testing with Databricks for a long time and are now going to run it in production. Our tests were done on Databricks for AWS using the Standard plan, and we have since upgraded to the Premium plan. One of the aims of upgrading plans w...

(Attached screenshots: databricks_actual, Databricks_should_look_like)
Latest Reply
Antoine1
New Contributor III

Hello, does anyone have a proper way of contacting support? As explained in some answers on this thread, we aren't able to create a support ticket in the help centre. We contacted our account executive 10 days ago to try to understand why we c...

8 More Replies
brian_0305
by New Contributor II
  • 1887 Views
  • 3 replies
  • 2 kudos

Using JDBC to connect to a Databricks default cluster and read a table into a PySpark DataFrame: all columns come back containing the column name as the value

I used code like the one below to connect over JDBC to a Databricks default cluster and read a table into a PySpark DataFrame: url = 'jdbc:databricks://[workspace domain]:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=[path];AuthMech=3;UID=token;PWD=[your_ac...

Latest Reply
Anonymous
Not applicable

@yu zhang: It looks like the issue with the first code snippet you provided is that it does not specify the correct query to retrieve the data from your database. When using the load() method with the jdbc data source, you need to provide a SQL quer...

2 More Replies
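
For reference, a rough sketch of the kind of JDBC read the reply above points toward, passing an explicit query to the jdbc source. It assumes a Databricks notebook where spark is predefined; the workspace domain, HTTP path, token, table name, and driver class are placeholders rather than values from the thread.

    # JDBC URL; fill in your own workspace domain, HTTP path, and access token.
    url = (
        "jdbc:databricks://<workspace-domain>:443/default;"
        "transportMode=http;ssl=1;AuthMech=3;httpPath=<http-path>;"
        "UID=token;PWD=<personal-access-token>"
    )

    df = (
        spark.read.format("jdbc")
        .option("url", url)
        .option("driver", "com.databricks.client.jdbc.Driver")     # assumes the Databricks JDBC driver is on the cluster
        .option("query", "SELECT id, name FROM default.my_table")  # explicit query instead of a bare table-name string
        .load()
    )
    df.show(5)
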
Erik_L
by Contributor II
  • 1194 Views
  • 3 replies
  • 1 kudos

Resolved! How to keep data in time-based localized clusters after joining?

I have a bunch of data frames from different data sources. They are all time series ordered by a column timestamp, which is an int32 Unix timestamp. I can join them together on this and another column, join_idx, which is basically an integer inde...

Latest Reply
Anonymous
Not applicable

@Erik Louie: If the data frames have different time zones, you can use Databricks' timezone conversion functions to convert them to a common time zone. You can use the from_utc_timestamp or to_utc_timestamp function to convert the timestamp column to ...

2 More Replies
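
For reference, a short sketch of the timezone conversion mentioned in the reply above; the column names and time zone are illustrative, and the int Unix timestamp is first cast to a UTC timestamp before being shifted into a common zone.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Toy frame standing in for one of the source DataFrames (int Unix seconds).
    df = spark.createDataFrame([(1680000000, 1), (1680000060, 2)], ["timestamp", "join_idx"])

    df_common = (
        df.withColumn("ts_utc", F.col("timestamp").cast("timestamp"))              # seconds since epoch -> UTC timestamp
          .withColumn("ts_local", F.from_utc_timestamp("ts_utc", "Europe/Berlin")) # shift into one agreed-upon zone
    )
    df_common.show(truncate=False)
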
shaunangcx
by New Contributor II
  • 1727 Views
  • 3 replies
  • 0 kudos

Resolved! Command output disappearing (Not sure what's the root cause)

I have a workflow that runs every month and creates a new notebook containing the outputs from the main notebook. However, after some time, the outputs from the created notebook disappear. Is there any way I can retain the outputs?

Latest Reply
Anonymous
Not applicable

@Shaun Ang: There are a few possible reasons why the outputs from the created notebook might be disappearing. Notebook permissions: it's possible that the user or service account running the workflow does not have permission to write to the destinati...

2 More Replies
rbelidrv
by New Contributor II
  • 3949 Views
  • 3 replies
  • 1 kudos

How to apply a UDF to a property in an array of structs

I have a column that contains an array of structs as follows: "column": [ { "struct_field1": "struct_value", "struct_field2": "struct_value" }, { "struct_field1": "struct_value", "struct_field2": "struct_value" } ]. I want to apply a UDF to each f...

Latest Reply
Kaniz
Community Manager

Hi @Richard Belihomji, it looks like you are trying to apply a UDF to each field of the structs in an array column in a Spark DataFrame. However, it seems you are encountering an issue with the UDF not receiving the context. To nest a UDF inside a tr...

2 More Replies
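
For reference, a hedged sketch of one way this pattern is commonly handled (not necessarily the approach the truncated reply goes on to describe): the UDF receives the whole array of structs and returns a rebuilt array with one field transformed. Field and column names mirror the question.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import ArrayType, StringType, StructField, StructType

    spark = SparkSession.builder.getOrCreate()

    element = StructType([
        StructField("struct_field1", StringType()),
        StructField("struct_field2", StringType()),
    ])
    df = spark.createDataFrame(
        [([("a", "b"), ("c", "d")],)],
        StructType([StructField("column", ArrayType(element))]),
    )

    @F.udf(returnType=ArrayType(element))
    def upper_field1(structs):
        # Each element arrives as a Row; rebuild it as a dict with one field transformed.
        return [
            {"struct_field1": s["struct_field1"].upper(), "struct_field2": s["struct_field2"]}
            for s in structs
        ]

    df.withColumn("column", upper_field1("column")).show(truncate=False)
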
sintsan
by New Contributor II
  • 1119 Views
  • 3 replies
  • 0 kudos

Azure Databricks DBFS Root, Storage Account Networking

For an Azure Databricks workspace with VNet injection, we would like to change the networking on the default managed Azure Databricks storage account (dbstorage) from "Enabled from all networks" to "Enabled from selected virtual networks and IP addresses". Can this...

Latest Reply
karthik_p
Esteemed Contributor

@Sander Sintjorissen, the root storage account usually has the directories described in this article: https://learn.microsoft.com/en-us/azure/databricks/dbfs/root-locations. To store logs related to auditing, you can create another storage account and add that. Hope this ...

2 More Replies
usman_wains
by New Contributor
  • 358 Views
  • 1 reply
  • 0 kudos

Request to unlock workspace

Please unlock my workspace so that I can easily log in to it; I have been waiting for a few days.

Latest Reply
jose_gonzalez
Moderator

Adding @Vidula Khanna​ and @Kaniz Fatma​ for visibility to help you with your request

RayelightOP
by New Contributor II
  • 815 Views
  • 1 reply
  • 2 kudos

Azure Blob Storage SAS keys expired for Apache Spark tutorial

The "Apache Spark Programming with Databricks" tutorial uses Parquet files in Azure Blob Storage. To access those files, a SAS key is used in the configuration files. Those keys were generated 5 years ago; however, they expired at the beginning of this mont...

Latest Reply
jose_gonzalez
Moderator

Adding @Vidula Khanna​ and @Kaniz Fatma​ for visibility to help with your request

kumarPerry
by New Contributor II
  • 1294 Views
  • 3 replies
  • 0 kudos

Notebook connectivity issue with AWS S3 bucket using mounting

When connecting to an AWS S3 bucket using DBFS, the application throws an error like: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 7864387.0 failed 4 times, most recent failure: Lost task 0.3 in stage 7864387.0 (TID 1709732...

Latest Reply
Anonymous
Not applicable

Hi @Amrendra Kumar, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us s...

2 More Replies
SDas1
by New Contributor
  • 2436 Views
  • 1 reply
  • 0 kudos

The identity column value of a Databricks Delta table does not start at 0 and increase by 1. It always starts at something like 1 or 2 and increases by 2. Below is the sample code; any logical input here is appreciated.

spark.sql("""
    CREATE TABLE integrated.TrailingWeeks (
        ID bigint GENERATED BY DEFAULT AS IDENTITY (START WITH 0 INCREMENT BY 1),
        Week_ID int NOT NULL
    )
    USING delta
    OPTIONS (path 'dbfs:/<Path in Azure datalake>/delta')
""")

Latest Reply
Kaniz
Community Manager

Hi @Shubhendu Das, thank you for contacting us about your concern about the identity column values in your Databricks Delta table. I understand the values are not starting at 0 or incrementing by 1 as expected. Databricks Delta Lake does not guarant...

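
For reference, a small sketch continuing the CREATE TABLE above (assuming a Databricks notebook where spark is predefined): identity values are unique and increasing but not necessarily consecutive, because writer tasks can each reserve their own block of IDs, so starting points above 0 and steps larger than 1 are expected.

    # Insert a few rows without supplying the identity column.
    spark.sql("INSERT INTO integrated.TrailingWeeks (Week_ID) VALUES (1), (2), (3)")

    # The generated IDs may start above 0 and step by more than 1; that is expected.
    spark.sql("SELECT ID, Week_ID FROM integrated.TrailingWeeks ORDER BY ID").show()
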