Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Forum Posts

HowardLJ
by New Contributor
  • 2430 Views
  • 1 reply
  • 0 kudos

Delta Live Tables Slowly Changing Dimensions Type 2 with Joins

Hi, I may be missing something really obvious here. The organisation I work for has recently started using Delta Live Tables in Databricks for data modelling. One of the dimensions I am trying to model takes data from 3 existing tables in our data la...

Get Started Discussions
Delta Live Tables
SCD
Latest Reply
the_real_merca
New Contributor II
  • 0 kudos

Can it be because the default join is `inner`, which means the row must exist in both tables?
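A quick way to see the difference the reply is pointing at, as a plain-Python sketch of join semantics (not Spark or the DLT API; the table contents are made up for illustration):

```python
# Two hypothetical source tables keyed by customer_id.
customers = {1: "Alice", 2: "Bob", 3: "Carol"}
orders = {1: 250, 3: 90}  # customer 2 has no order yet

# Inner join: a row survives only if the key exists in BOTH tables,
# so customer 2 silently disappears from the result.
inner = {k: (customers[k], orders[k]) for k in customers if k in orders}

# Left join: every customer is kept; missing order data becomes None.
left = {k: (customers[k], orders.get(k)) for k in customers}

print(len(inner))  # 2 rows
print(len(left))   # 3 rows
```

In PySpark the equivalent choice is `df_a.join(df_b, "customer_id", "left")` versus the default `"inner"`; for an SCD Type 2 dimension built from several tables, an unintended inner join can drop rows whose keys are absent from one of the sources.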

AFox
by Contributor
  • 1216 Views
  • 1 reply
  • 0 kudos

Databricks Community Post Editor Issues

Anyone else constantly having errors with this editor when using any of the 'features' like code sample? Can we please have a Markdown editor, or at least the ability to edit the HTML this tool creates, to fix all the bugs it makes?

Latest Reply
AFox
Contributor
  • 0 kudos

Here is a fun one: "The message body contains h d, which is not permitted in this community. Please remove this content before sending your post." Had to add the space between h and d to be able to post it. This means code samples can't contain `ch d...

Stewie
by New Contributor II
  • 2166 Views
  • 0 replies
  • 0 kudos

Databricks Advanced Data Engineering Course Factually Incorrect and Misleading

On Video 4 of the Advanced Data Engineering with Databricks course, at 3:08, the presenter says 'No one else can do what we can with a single solution'. This is far from the truth; Palantir Foundry is miles ahead of Databricks in data governance, ease of...

Phani1
by Valued Contributor II
  • 6595 Views
  • 0 replies
  • 0 kudos

Error handling best practices

Hi Team, could you please share the best practices for error handling in Databricks for the following: 1. Notebook level, 2. Job level, 3. Code level (Python), 4. Streaming, 5. DLT & Autoloader. Kindly suggest details around error handling...

DavidKxx
by Contributor
  • 4732 Views
  • 2 replies
  • 1 kudos

Resolved! Rendering markdown images hard coded as data image png base64 in notebook

Hi all, for training purposes I have cloned a repo from John Snow Labs into my Databricks account and am working in the notebook that you can review at https://github.com/JohnSnowLabs/spark-nlp-workshop/blob/master/open-source-nlp/03.0.SparkNLP_Pretr...

Get Started Discussions
image
rendering
Latest Reply
NateAnth
Databricks Employee
  • 1 kudos

Try changing the magic command for that cell from %md to %md-sandbox to see if that helps the image to render appropriately.
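For reference, a `%md-sandbox` cell looks like the sketch below (the base64 payload is a truncated placeholder, not real image data). The sandboxed renderer permits some inline HTML, such as `<img>` tags with data URIs, that the plain `%md` renderer may not display:

```
%md-sandbox
<img src="data:image/png;base64,iVBORw0KG..." />
```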

1 more reply
Scott_in_Zurich
by New Contributor III
  • 3046 Views
  • 1 reply
  • 1 kudos

Tutorial "Query Data from a Notebook": access errors

I am trying to work through Tutorial: Query data from a Notebook. Access errors are defeating my attempts. Steps to reproduce: sign up for a free trial through the Databricks website. The path skipped the subscription-selection step and defaulted the trial t...

Latest Reply
Scott_in_Zurich
New Contributor III
  • 1 kudos

Done, thanks!

aayusha3
by New Contributor II
  • 1823 Views
  • 1 reply
  • 0 kudos

Internal error: Attach your notebook to a different compute or restart the current compute.

I am currently using a personal compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few command lines without an issue, I end up getting this error: Internal error. Attach your notebook t...

radupopa
by New Contributor II
  • 1611 Views
  • 1 reply
  • 1 kudos

Resubscribing / Deleting and recreating an account

Hello, at some point I tested Databricks for a potential customer and, after the test, I cancelled the subscription. I read that it is not possible to resubscribe with the same e-mail address. Therefore, my idea would be to delete the account I created...

Latest Reply
rajd
New Contributor II
  • 1 kudos

I have a similar issue. I subscribed to Databricks using my AWS account email. I cancelled it later. Now I want to start using Databricks on AWS again, with the same email ID and a pay-as-you-go plan. But there is no way to re-subscribe. If this can...

hillel1
by New Contributor II
  • 1332 Views
  • 0 replies
  • 0 kudos

Benchmark TPC-DS from external Parquet Hive structure in S3

Hi, I am just getting started in Databricks and would appreciate some help here. I have 10 TB of TPC-DS data in S3 in a Hive partition structure. My goal is to benchmark a Databricks cluster on this data. After setting all IAM credentials according to this https://doc...

Get Started Discussions
External Table
S3
Shree23
by New Contributor III
  • 1373 Views
  • 1 reply
  • 0 kudos

Azure DevOps load sequence

Hi Expert, how can we set up multiple notebooks to run in sequence in a flow? For example, one pipeline has Notebook1 (sequence 1) and Notebook2 (sequence 2), within that one pipeline only.

Latest Reply
LeTurtleboy
New Contributor II
  • 0 kudos

Not sure how to approach your challenge, but something you can do is use the Databricks Job Scheduler; or, if you want an external solution in Azure, you can call several notebooks from Data Factory.
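One way to express that ordering with the Databricks Jobs multi-task format is a `depends_on` chain. Below is a sketch of a request body for the Jobs 2.1 API; the job name and notebook paths are placeholders, not values from the thread:

```python
import json

# Hypothetical job definition: notebook2 runs only after notebook1 succeeds.
job = {
    "name": "sequential-notebooks",
    "tasks": [
        {
            "task_key": "notebook1",
            "notebook_task": {"notebook_path": "/Repos/project/Notebook1"},
        },
        {
            "task_key": "notebook2",
            # The dependency is what enforces the sequence 1 -> 2 ordering.
            "depends_on": [{"task_key": "notebook1"}],
            "notebook_task": {"notebook_path": "/Repos/project/Notebook2"},
        },
    ],
}

print(json.dumps(job, indent=2))
```

A payload like this would go to the `POST /api/2.1/jobs/create` endpoint; the same dependency chain can also be set up in the Workflows UI without writing any JSON.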

