Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

by AdamStra2 (New Contributor III)
  • 1920 Views
  • 2 replies
  • 1 kudos

Web terminal and clusters

Hi, I have come across this piece of documentation: "Databricks does not support running Spark jobs from the web terminal. In addition, Databricks web terminal is not available in the following cluster types: job clusters, clusters launched with the DISAB...

Latest Reply
AdamStra2 (New Contributor III)
  • 1 kudos

Hi @Retired_mod, any update on my question? Thanks.

1 More Replies
by sm1274 (New Contributor)
  • 3114 Views
  • 0 replies
  • 0 kudos

Creating java UDF for Spark SQL

Hello, I have created a sample Java UDF which masks a few characters of a string. However, I am facing a couple of issues when uploading and using it. First, I could only import it, which for now is OK. But when I do the following: create function udf_mask as 'ba...

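
For readers with the same question, a minimal sketch of the usual registration path, assuming the JAR is attached to the cluster; the class name com.example.MaskUDF and the JAR path are placeholders, not the poster's actual names:

```python
# Sketch: registering a permanent Java UDF from a JAR via Spark SQL.
# 'com.example.MaskUDF' and the JAR path are hypothetical placeholders.
spark.sql("""
  CREATE OR REPLACE FUNCTION udf_mask AS 'com.example.MaskUDF'
  USING JAR 'dbfs:/FileStore/jars/mask_udf.jar'
""")

# The UDF class typically needs to extend the Hive UDF base class
# (org.apache.hadoop.hive.ql.exec.UDF) for this Hive-style registration.
spark.sql("SELECT udf_mask('1234-5678-9012') AS masked").show()
```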
by thuovi (New Contributor II)
  • 2540 Views
  • 0 replies
  • 2 kudos

dbutils.fs.ls MAX_LIST_SIZE_EXCEEDED

Hi! I'm experiencing different behaviours between two DBX Workspaces when trying to list file contents from an abfss: location. In workspace A, running len(dbutils.fs.ls('abfss://~~@~~~~.dfs.core.windows.net/~~/')) results in "Out[1]: 1551", while runni...

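
One hedged workaround sketch when a directory has too many entries for dbutils.fs.ls: list it through the JVM Hadoop FileSystem API instead. The abfss path below is a placeholder:

```python
# Sketch: listing a large directory via the Hadoop FileSystem API, which
# sidesteps the dbutils.fs.ls result-size limit. The path is a placeholder.
path = "abfss://container@account.dfs.core.windows.net/big_dir/"

hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
jpath = spark.sparkContext._jvm.org.apache.hadoop.fs.Path(path)
fs = jpath.getFileSystem(hadoop_conf)

statuses = fs.listStatus(jpath)  # returns a Java FileStatus[]
print(len(statuses))
print([statuses[i].getPath().getName() for i in range(min(5, len(statuses)))])
```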
by s_park (Databricks Employee)
  • 22412 Views
  • 3 replies
  • 4 kudos

Training @ Data & AI World Tour 2023

Join your peers at the Data + AI World Tour 2023! Explore the latest advancements, hear real-world case studies and discover best practices that deliver data and AI transformation. From the Databricks Lakehouse Platform to open source technologies in...

Labels: Get Started Discussions, DAIWT, DAIWT_2023, Training, User_Group
Latest Reply
VjGian15 (New Contributor II)
  • 4 kudos

Introducing Mini Flush: Your Ticket to Ultimate Casino Thrills! Are you ready to embark on an electrifying journey into the world of online gambling? If so, look no further than Vijaybet Online Casino! Our state-of-the-art platform is your gateway to ...

2 More Replies
by sg-vtc (New Contributor III)
  • 2395 Views
  • 1 reply
  • 1 kudos

Resolved! Problem creating external delta table on non-AWS s3 bucket

I am testing Databricks with non-AWS S3 object storage. I can access the non-AWS S3 bucket by setting these parameters: sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "XXXXXXXXXXXXXXXXXXXX") sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key...

Labels: Get Started Discussions, external delta table
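
For readers hitting the same setup, a hedged sketch of pointing Spark at an S3-compatible endpoint via s3a and creating an external Delta table there; the keys, endpoint, bucket, and table names are placeholders:

```python
# Sketch: S3-compatible storage over s3a, then an external Delta table on it.
# All credentials and names below are placeholders.
sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "<ACCESS_KEY>")
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "<SECRET_KEY>")
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "https://s3.vendor.example.com")
sc._jsc.hadoopConfiguration().set("fs.s3a.path.style.access", "true")

spark.sql("""
  CREATE TABLE IF NOT EXISTS my_schema.events
  USING DELTA
  LOCATION 's3a://my-bucket/events/'
""")
```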
Latest Reply
sg-vtc (New Contributor III)
  • 1 kudos

Found the solution to disable it.  Can close this question.

by llvu (New Contributor III)
  • 4785 Views
  • 3 replies
  • 1 kudos

getArgument works fine in interactive cluster 10.4 LTS, raises error in job run on instance pool (10.4 LTS)

Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook via an interactive cluster, but gives an error when executed via a job run in an instance pool. Query: OPTIMIZE <table> where date = replace(re...

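
A hedged workaround sketch that avoids calling getArgument() inside the SQL text by resolving the widget value in Python first; the widget and table names are placeholders:

```python
# Sketch: read the parameter with dbutils.widgets (getArgument is its legacy
# alias) and interpolate it into the SQL string before calling spark.sql.
dbutils.widgets.text("run_date", "2023-01-01")   # placeholder widget
run_date = dbutils.widgets.get("run_date")

spark.sql(f"OPTIMIZE my_table WHERE date = '{run_date}'")  # placeholder table
```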
Latest Reply
llvu (New Contributor III)
  • 1 kudos

Hi @Retired_mod, would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.

2 More Replies
by AdamStra2 (New Contributor III)
  • 28558 Views
  • 0 replies
  • 3 kudos

Schema owned by Service Principal shows error in PBI

Background info: 1. We have Unity Catalog enabled. 2. All of our jobs are run by a Service Principal that has all the necessary access it needs. Issue: One of the jobs checks existing schemas against the ones it is supposed to create in that given run and if ...

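
Without the full error text it is hard to be definitive, but a hedged sketch of the usual ownership checks for Service-Principal-owned schemas in Unity Catalog; the catalog, schema, and principal names are placeholders:

```python
# Sketch: inspect and, if needed, reassign schema ownership in Unity Catalog.
# 'my_catalog.my_schema' and the group name are hypothetical placeholders.
spark.sql("DESCRIBE SCHEMA EXTENDED my_catalog.my_schema").show(truncate=False)

# Reassigning the owner to a group is one common workaround when consumer
# tools such as Power BI error on objects owned by a Service Principal.
spark.sql("ALTER SCHEMA my_catalog.my_schema OWNER TO `data-engineering-group`")
```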
by AH (New Contributor III)
  • 3031 Views
  • 1 reply
  • 0 kudos

AWS Databricks VS AWS EMR

Hi, which services should I use for a data lake implementation? Is there any cost comparison between Databricks and AWS EMR? Which one is best to choose?

Latest Reply
karthik_p (Esteemed Contributor)
  • 0 kudos

@AH that depends on the use case. If your implementation involves data lake, ML, and data engineering tasks, it is better to go with Databricks, as it has a good UI, good governance for your data lake using Unity Catalog, and good consumer tool su...

by elgeo (Valued Contributor II)
  • 3342 Views
  • 1 reply
  • 1 kudos

Resolved! System billing usage table - Usage column

Hello experts, could someone please explain what exactly is contained in the usage column of the system.billing.usage table? We ran specific queries in a cluster trying to calculate the cost, and we observe that the DBUs shown in the system table are ...

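
A hedged sketch of reconciling DBUs from the system table; the column names follow the documented system.billing.usage schema, but verify them in your workspace:

```python
# Sketch: total DBUs per SKU from the billing system table.
df = spark.sql("""
  SELECT sku_name,
         usage_unit,
         SUM(usage_quantity) AS total_usage
  FROM system.billing.usage
  WHERE usage_date >= '2023-10-01'
  GROUP BY sku_name, usage_unit
  ORDER BY total_usage DESC
""")
df.show(truncate=False)
```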
Latest Reply
karthik_p (Esteemed Contributor)
  • 1 kudos

@elgeo both should be the same, unless we somehow missed picking the proper plan DBU price. The usage column will have complete information such as the SKU name and DBU units. If you use the Azure Databricks calculator and compare, we should see similar results.

by HHol (New Contributor)
  • 8043 Views
  • 0 replies
  • 0 kudos

How to retrieve a Job Name from the SparkContext

We are currently starting to build certain data pipelines using Databricks. For this we use Jobs, and the steps in these Jobs are implemented in Python wheels. We are able to retrieve the Job ID, Job Run ID and Task Run ID in our Python wheels from the ...

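
The job name itself does not appear to be exposed on the SparkContext; one hedged sketch is to read the job ID from the notebook context tags and look the name up via the Jobs 2.1 API. The workspace URL and token are placeholders, and the tag key should be verified in your environment:

```python
# Sketch: job id from context tags, then job name via the Jobs API.
import json
import requests

ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
tags = json.loads(ctx.toJson()).get("tags", {})
job_id = tags.get("jobId")  # tag key commonly seen in job runs; verify it

resp = requests.get(
    "https://<workspace-url>/api/2.1/jobs/get",   # placeholder host
    headers={"Authorization": "Bearer <PAT>"},    # placeholder token
    params={"job_id": job_id},
)
job_name = resp.json()["settings"]["name"]
print(job_name)
```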
by RyanHager (Contributor)
  • 4832 Views
  • 4 replies
  • 2 kudos

Roadmap on export menu option for SQL Query and Dashboard Types in Workspace

Are there plans for an export option for SQL Query and SQL Dashboard in the Workspace explorer screen, similar to notebooks? Background: we need a way to export and back up any queries and dashboards to save design work and move from staging environments ...

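
Until a native export lands, a hedged sketch of pulling query definitions through the legacy SQL Queries REST API and writing them to files; the host and token are placeholders, and since this is a preview endpoint its availability should be verified:

```python
# Sketch: back up SQL query definitions via the REST API.
import requests

host = "https://<workspace-url>"   # placeholder
token = "<PAT>"                    # placeholder

resp = requests.get(
    f"{host}/api/2.0/preview/sql/queries",
    headers={"Authorization": f"Bearer {token}"},
)
for q in resp.json().get("results", []):
    with open(f"{q['name']}.sql", "w") as f:
        f.write(q.get("query") or "")
```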
Latest Reply
Hubert-Dudek (Esteemed Contributor III)
  • 2 kudos

The best option would be to have them just under a Git repo (especially dashboards).

3 More Replies
by cmilligan (Contributor II)
  • 7324 Views
  • 6 replies
  • 1 kudos

Long run time with %run command

My team has started to see long run times on cells when using the %run command to run another notebook. The notebook that we are calling with %run only contains variable settings, function definitions, and library imports. In some cases I have seen in ...

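
One hedged alternative sketch while the slowness is investigated: move pure setup code (variables, functions, imports) into a workspace file and import it as a module instead of re-running a whole notebook. The module and names here are hypothetical:

```python
# Sketch: shared_setup.py lives next to the notebook (or in a Repo) and holds
# the variables/functions the %run notebook used to define.
from shared_setup import build_paths, DEFAULT_CATALOG  # hypothetical module

paths = build_paths(env="prod")
spark.sql(f"USE CATALOG {DEFAULT_CATALOG}")
```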
by hexoffender (New Contributor)
  • 1324 Views
  • 0 replies
  • 0 kudos

case statements return same value

I have these 4 case statements: count(*) as Total_claim_reciepts, count(case when claim_id like '%M%' and receipt_flag = 1 and is_firstpassclaim = 1 then 0 else claim_id end) as Total_claim_reciepts, count(case when claim_status = 'DENIED' and claim_repa...

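
The likely cause, reading the snippet: COUNT() counts every non-NULL value, and both CASE branches return a non-NULL value (0 or claim_id), so each expression counts all rows. A hedged sketch of the fix, letting the ELSE branch fall through to NULL; column names follow the post and the table name is a guess:

```python
# Sketch: a CASE with no ELSE returns NULL for non-matching rows, so COUNT
# only counts the matches. 'claims' is a placeholder table name.
spark.sql("""
  SELECT
    COUNT(*) AS total_rows,
    COUNT(CASE WHEN claim_id LIKE '%M%'
                AND receipt_flag = 1
                AND is_firstpassclaim = 1
               THEN claim_id END) AS total_claim_reciepts,
    COUNT(CASE WHEN claim_status = 'DENIED' THEN claim_id END) AS denied_claims
  FROM claims
""").show()
```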
by Ajbi (New Contributor II)
  • 9269 Views
  • 2 replies
  • 0 kudos

NATIVE_XML_DATA_SOURCE_NOT_ENABLED

I'm trying to read an XML file and receiving the following error. I've installed the Maven library spark-xml on the cluster; however, I'm still receiving the error. Is there anything I'm missing? Error: AnalysisException: [NATIVE_XML_DATA_SOURCE_NOT_ENABLED] N...

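
A hedged sketch of the spark-xml read path, assuming the Maven coordinate com.databricks:spark-xml_2.12 is installed on the cluster; the file path and the rowTag value are placeholders for the file's actual location and row element:

```python
# Sketch: reading XML with the spark-xml library. The fully qualified format
# name plus an explicit rowTag is the usual combination to try when the
# short 'xml' name routes to the disabled native reader.
df = (spark.read
        .format("com.databricks.spark.xml")
        .option("rowTag", "record")          # placeholder row element
        .load("dbfs:/path/to/file.xml"))     # placeholder path
df.printSchema()
```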
Latest Reply
Ajbi (New Contributor II)
  • 0 kudos

I've already tried spark.read.format('com.databricks.spark.xml'). It receives the same error.

1 More Replies
