Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

llvu
by New Contributor III
  • 4583 Views
  • 3 replies
  • 1 kudos

getArgument works fine in interactive cluster 10.4 LTS, raises error when run as a job on an instance pool

Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook on an interactive cluster, but gives an error when executed via a job run on an instance pool. Query: OPTIMIZE <table> WHERE date = replace(re...

Latest Reply
llvu
New Contributor III
  • 1 kudos

Hi @Retired_mod, would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.

2 More Replies
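For reference, a common workaround when getArgument() fails inside spark.sql on a job cluster is to read the parameter in Python and build the SQL text there. The sketch below is an assumption, not taken from the thread; the parameter name and table name are hypothetical.

    # Hedged sketch: read the job parameter with dbutils.widgets.get() in Python
    # and interpolate it into the SQL string, instead of calling getArgument()
    # inside the query itself. "run_date" and the table name are hypothetical.
    run_date = dbutils.widgets.get("run_date")  # supplied as a job parameter
    spark.sql(f"OPTIMIZE my_schema.my_table WHERE date = '{run_date}'")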
AdamStra2
by New Contributor III
  • 28511 Views
  • 0 replies
  • 3 kudos

Schema owned by Service Principal shows error in PBI

Background info: 1. We have Unity Catalog enabled. 2. All of our jobs are run by a Service Principal that has all the access it needs. Issue: One of the jobs checks existing schemas against the ones it is supposed to create in that given run and if ...

(Attachment: pic.png)
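As an illustration of the kind of schema check the post describes, a minimal sketch; the catalog name, schema list, and comparison logic are assumptions, not taken from the job in question.

    # Compare the schemas that exist in a catalog against the set the run is
    # expected to create, and create any that are missing. All names are hypothetical.
    expected = {"bronze", "silver", "gold"}
    existing = {row[0] for row in spark.sql("SHOW SCHEMAS IN main").collect()}
    for schema in sorted(expected - existing):
        spark.sql(f"CREATE SCHEMA IF NOT EXISTS main.{schema}")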
AH
by New Contributor III
  • 2898 Views
  • 1 reply
  • 0 kudos

AWS Databricks VS AWS EMR

Hi, which service should I use for a data lake implementation? Is there any cost comparison between Databricks and AWS EMR? Which one is best to choose?

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@AH That depends on the use case. If your implementation involves data lake, ML, and data engineering tasks, it is better to go with Databricks, as it has a good UI, good governance for your data lake via Unity Catalog, and good consumer tool su...

elgeo
by Valued Contributor II
  • 3179 Views
  • 1 reply
  • 1 kudos

Resolved! System billing usage table - Usage column

Hello experts, could someone please explain what exactly is contained in the usage column of the system.billing.usage table? We ran specific queries on a cluster trying to calculate the cost, and we observe that the DBUs shown in the system table are ...

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@elgeo Both should be the same, unless somehow the proper plan DBU price was not picked. The usage column has complete information related to SKU name, DBU units, etc. If you use the Azure Databricks calculator and compare, you should see similar results.

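For anyone making that comparison, a minimal sketch of querying the system table, assuming the documented sku_name, usage_date, usage_unit and usage_quantity columns are available in your workspace:

    # Aggregate DBU usage per day and SKU from the system billing table.
    display(spark.sql("""
        SELECT usage_date, sku_name, usage_unit, SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE usage_date >= current_date() - INTERVAL 7 DAYS
        GROUP BY usage_date, sku_name, usage_unit
        ORDER BY usage_date, sku_name
    """))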
HHol
by New Contributor
  • 7937 Views
  • 0 replies
  • 0 kudos

How to retrieve a Job Name from the SparkContext

We are currently starting to build certain data pipelines using Databricks. For this we use Jobs, and the steps in these Jobs are implemented in Python wheels. We are able to retrieve the Job ID, Job Run ID and Task Run ID in our Python wheels from the ...

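One possible approach, since the job name is not exposed through the SparkContext itself, is to look it up from the Jobs API using the Job ID the wheel already has. The sketch below assumes DATABRICKS_HOST and DATABRICKS_TOKEN environment variables and uses the standard Jobs 2.1 get endpoint; treat it as a sketch rather than the thread's accepted answer.

    import os
    import requests

    def get_job_name(job_id: str) -> str:
        """Resolve a job's display name from its ID via the Jobs 2.1 API."""
        resp = requests.get(
            f"{os.environ['DATABRICKS_HOST']}/api/2.1/jobs/get",
            headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
            params={"job_id": job_id},
        )
        resp.raise_for_status()
        return resp.json()["settings"]["name"]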
RyanHager
by Contributor
  • 4639 Views
  • 4 replies
  • 2 kudos

Roadmap on export menu option for SQL Query and Dashboard Types in Workspace

Are there plans for an export option for SQL Query and SQL Dashboard objects in the Workspace explorer screen, similar to notebooks? Background: we need a way to export and back up any queries and dashboards to save design work and move from staging environments ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

The best option would be to have them just under a Git repo (especially dashboards).

3 More Replies
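Until a built-in export exists, one workaround is to pull query definitions over the REST API and store them as files. The sketch below assumes the legacy preview endpoint /api/2.0/preview/sql/queries and the same host/token environment variables as above; check the current REST API reference before relying on it.

    import json
    import os
    import requests

    # List saved SQL queries and dump each definition (name, SQL text, metadata)
    # to a local JSON file that can be versioned or re-imported later.
    resp = requests.get(
        f"{os.environ['DATABRICKS_HOST']}/api/2.0/preview/sql/queries",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        params={"page_size": 100},
    )
    resp.raise_for_status()
    for q in resp.json().get("results", []):
        with open(f"query_{q['id']}.json", "w") as f:
            json.dump(q, f, indent=2)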
cmilligan
by Contributor II
  • 7008 Views
  • 6 replies
  • 1 kudos

Long run time with %run command

My team has started to see long run times on cells when using the %run command to run another notebook. The notebook that we are calling with %run only contains variable assignments, function definitions, and library imports. In some cases I have seen in ...

hexoffender
by New Contributor
  • 1266 Views
  • 0 replies
  • 0 kudos

Case statements return the same value

I have these 4 case statements: count(*) as Total_claim_reciepts, count(case when claim_id like '%M%' and receipt_flag = 1 and is_firstpassclaim = 1 then 0 else claim_id end) as Total_claim_reciepts, count(case when claim_status = 'DENIED' and claim_repa...

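The counts likely come out identical because COUNT() counts every non-NULL value, and both branches of each CASE (the 0 and the claim_id) are non-NULL, so every row is counted each time. A minimal sketch of the usual fix, returning a value only in the matching branch so the ELSE defaults to NULL; column names are taken from the excerpt, the table name is hypothetical.

    display(spark.sql("""
        SELECT
          COUNT(*) AS total_rows,
          COUNT(CASE WHEN claim_id LIKE '%M%' AND receipt_flag = 1
                      AND is_firstpassclaim = 1 THEN claim_id END) AS firstpass_receipts,
          COUNT(CASE WHEN claim_status = 'DENIED' THEN claim_id END) AS denied_claims
        FROM claims
    """))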
Ajbi
by New Contributor II
  • 9161 Views
  • 2 replies
  • 0 kudos

NATIVE_XML_DATA_SOURCE_NOT_ENABLED

I'm trying to read an XML file and I'm receiving the following error. I've installed the spark-xml Maven library on the cluster, but I'm still receiving the error. Is there anything I'm missing? Error: AnalysisException: [NATIVE_XML_DATA_SOURCE_NOT_ENABLED] N...

Latest Reply
Ajbi
New Contributor II
  • 0 kudos

I've already tried spark.read.format('com.databricks.spark.xml'); it gives the same error.

1 More Reply
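For reference, a minimal sketch of the usual spark-xml read, assuming the com.databricks:spark-xml_2.12 Maven library is attached to the cluster; the path and rowTag value are hypothetical, and this is not presented as the resolution of the error above.

    df = (spark.read
          .format("com.databricks.spark.xml")
          .option("rowTag", "record")            # XML element that maps to one row
          .load("dbfs:/FileStore/sample.xml"))
    df.printSchema()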
kll
by New Contributor III
  • 1751 Views
  • 1 reply
  • 0 kudos

how to save variables in one notebook to be imported into another?

Say, I have a list of values, dictionaries, variable names in `notebook1.ipynb` that I'd like to re-use / import in another `notebook2.ipynb`. For example, in `notebook1.ipynb`, I have the following:   var1 = "dallas" var_lst = [100, 200, 300, 400, ...

Latest Reply
Krishnamatta
Contributor
  • 0 kudos

You can use %run ./notebook2 after defining the variables in notebook1, so notebook2 will use the variables defined in notebook1.

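A minimal sketch of the %run pattern, assuming both notebooks sit in the same folder; %run must be the only command in its cell, and magic commands are shown here as comments.

    # notebook1 defines the values to share:
    var1 = "dallas"
    var_lst = [100, 200, 300, 400]

    # notebook2, first cell (pulls notebook1 into the same Python session):
    # %run ./notebook1

    # notebook2, a later cell, can then use the shared names directly:
    print(var1, var_lst)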
manupmanoos
by New Contributor III
  • 3566 Views
  • 1 reply
  • 0 kudos

How to save the best model checkpoint through the epochs of a deep learning network using callbacks?

I have created a neural network and I am training the model with the code below. The code fails to write to the Databricks file storage. Is there any other way to write the checkpoint to Databricks storage or to an S3 bucket directly? custom_early_...

Latest Reply
manupmanoos
New Contributor III
  • 0 kudos

Hi @Retired_mod, I am not able to save it to local storage in Databricks DBFS either. It shows an invalid operation when I try to save to the Databricks file storage. Additionally, I have valid AWS credentials with which I am able to save a model t...

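One common workaround, sketched here under the assumption of a Keras model and hypothetical paths: write the checkpoint to the driver's local disk (some checkpoint formats need random writes that the /dbfs FUSE mount does not support), then copy the best file to DBFS or S3 after training.

    from tensorflow.keras.callbacks import ModelCheckpoint

    local_path = "/tmp/best_model.h5"
    checkpoint = ModelCheckpoint(local_path, monitor="val_loss", save_best_only=True)
    # model.fit(x_train, y_train, validation_data=(x_val, y_val),
    #           callbacks=[checkpoint, custom_early_stopping])

    # After training, copy the best checkpoint off the local disk:
    dbutils.fs.cp(f"file:{local_path}", "dbfs:/ml/checkpoints/best_model.h5")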
eric2
by New Contributor II
  • 2653 Views
  • 3 replies
  • 0 kudos

Databricks Delta table Insert Data Error

When trying to insert data into a Delta table in Databricks, an error occurs as shown below: [TASK_WRITE_FAILED] Task failed while writing rows to abfss://cont-01@dlsgolfzon001.dfs.core.windows.net/dir-db999_test/D_RGN_INFO_TMP. In SQL, the results ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Seems OK to me. Have you tried to display the data from table A and also the B/C join?

2 More Replies
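Following the suggestion above, a minimal sketch of inspecting the source side before the INSERT; the table and join key names are hypothetical placeholders for the tables in the original query.

    display(spark.table("table_a").limit(100))
    display(spark.sql("""
        SELECT b.*, c.*
        FROM table_b b
        JOIN table_c c ON b.key = c.key
        LIMIT 100
    """))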
