Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

vrajesh123
by New Contributor II
  • 6262 Views
  • 1 replies
  • 0 kudos

Webassessor Secure Browser will not Launch during exam.

Hello - I registered for the Databricks Data Engineering Associate Certification exam. I hit an issue: their secure browser would not launch, it just crashed. The only thing I could see in a flash was "bad request" and poof, it's gone. Spent over 2 h...

Get Started Discussions
Certification Exam
Issues
Secure Browser
A1459
by New Contributor
  • 1374 Views
  • 0 replies
  • 0 kudos

Execute delete query from notebook on azure synapse

Hello Everyone, Is there a way we can execute a delete query from an Azure notebook on an Azure Synapse database? I tried using the read API method with option "query" but got an error: the JDBC connector is not able to handle the code. Can anyone suggest how we can de...

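The "query" option on a Spark JDBC read only issues SELECT statements, which is consistent with the error above; DML such as DELETE has to go through a raw JDBC connection. A minimal sketch, assuming a Databricks driver where Spark's JVM gateway is available (the table, column, and credential names are placeholders, not the poster's actual values):

```python
def build_delete(table: str, column: str) -> str:
    """Build a parameterized DELETE statement; the value is bound later."""
    return f"DELETE FROM {table} WHERE {column} = ?"

def delete_from_synapse(spark, jdbc_url, user, password, table, column, value):
    """Run a DELETE against Synapse over a raw JDBC connection.

    spark.read with option("query", ...) only runs SELECTs, so DML is sent
    through java.sql.DriverManager, reachable via Spark's JVM gateway.
    Only callable on a live Spark driver with the Synapse JDBC driver loaded.
    """
    jvm = spark._sc._gateway.jvm
    conn = jvm.java.sql.DriverManager.getConnection(jdbc_url, user, password)
    try:
        stmt = conn.prepareStatement(build_delete(table, column))
        stmt.setString(1, value)
        return stmt.executeUpdate()  # number of rows deleted
    finally:
        conn.close()
```

The statement-building part is plain Python; only `delete_from_synapse` needs a running cluster.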
sudhanshu1
by New Contributor III
  • 9716 Views
  • 2 replies
  • 2 kudos

Python file testing using pytest

Hi All, I have a requirement in my project where we will be writing some Python code inside Databricks. Please note we will not be using PySpark; it will be plain Python with polars. I am looking into how to create test files for the main file. Below is sim...

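For plain-Python code the usual layout is a `main.py` with the functions under test and a sibling `test_main.py` that pytest discovers by name. A minimal sketch with a stand-in transform (a polars version would follow the same pattern; the function names here are hypothetical):

```python
# main.py would hold the function under test; shown inline for brevity.
def filter_positive(values):
    """Keep only strictly positive numbers (stand-in for a polars transform)."""
    return [v for v in values if v > 0]

# test_main.py -- pytest collects any function named test_*
def test_filter_positive():
    assert filter_positive([-1, 0, 2, 5]) == [2, 5]

def test_filter_positive_empty():
    assert filter_positive([]) == []
```

Running `pytest` from the repo root (or a Databricks Repos folder) picks up both tests automatically.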
Sujitha
by Databricks Employee
  • 11625 Views
  • 1 replies
  • 5 kudos

New how-to guide to data warehousing with the Data Intelligence Platform

Just launched: The Big Book of Data Warehousing and BI, a new hands-on guide focused on real-world use cases from governance, transformation, analytics and AI. As the demand for data becomes insatiable in every company, the data infrastructure has bec...

pablobd
by Contributor II
  • 3340 Views
  • 1 replies
  • 0 kudos

Asset bundle build and deploy python wheel with versions

Hi all, I was able to deploy a wheel to the /Shared/ folder from a repository in GitLab with asset bundles. The databricks.yml looks something like this: artifacts: default: type: whl build: poetry build path: . targets: workspace: h...

Latest Reply
pablobd
Contributor II
  • 0 kudos

Finally I decided to use AWS CodeArtifact and mirror PyPI, which I think is a bit cleaner. But your solution looks good too. Thanks!

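For reference, the flattened config in the question maps onto a bundle file shaped roughly like the following. This is a minimal sketch, not the poster's actual file; the bundle name and host are placeholders:

```yaml
# Minimal databricks.yml sketch: build a wheel with poetry and deploy it.
bundle:
  name: my_bundle

artifacts:
  default:
    type: whl
    build: poetry build
    path: .

targets:
  workspace:
    host: https://<your-workspace>.cloud.databricks.com
```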
invalidargument
by New Contributor III
  • 1786 Views
  • 1 replies
  • 2 kudos

Create new workbooks with code

Is it possible to create new notebooks from a notebook in Databricks? I have tried this code, but all of them are generic files, not notebooks. notebook_str = """# Databricks notebook source import pyspark.sql.functions as F import numpy as np # CO...

Latest Reply
invalidargument
New Contributor III
  • 2 kudos

Unfortunately %run does not help me since I can't %run a .py file. I still need my code in notebooks. I am transpiling proprietary code to Python using Jinja templates. I would like to have the output as notebooks since those are most convenient to edit...

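One route that fits this case: source uploaded through the Workspace API with `format: SOURCE` and a first line of `# Databricks notebook source` is imported as a notebook rather than a plain file. A minimal sketch using only the standard library (the path and token are placeholders, and the cell code is hypothetical):

```python
import base64
import json
import urllib.request

NOTEBOOK_HEADER = "# Databricks notebook source"

def notebook_payload(path: str, code: str) -> dict:
    """Build the Workspace import payload for a Python notebook."""
    source = f"{NOTEBOOK_HEADER}\n{code}"
    return {
        "path": path,
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        "content": base64.b64encode(source.encode()).decode(),
    }

def import_notebook(host: str, token: str, path: str, code: str) -> None:
    """POST to /api/2.0/workspace/import; needs network access and a real token."""
    req = urllib.request.Request(
        f"{host}/api/2.0/workspace/import",
        data=json.dumps(notebook_payload(path, code)).encode(),
        headers={"Authorization": f"Bearer {token}"},
    )
    urllib.request.urlopen(req)
```

The same payload shape works with the Databricks SDK's `WorkspaceClient` if you prefer that over raw HTTP.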
SamGreene
by Contributor II
  • 2657 Views
  • 1 replies
  • 0 kudos

Resolved! DLT Pipeline Graph is not detecting dependencies

Hi, This is my first Databricks project. I am loading data from a UC external volume in ADLS into tables and then splitting one of the tables into two tables based on a column. When I create a pipeline, the tables don't have any dependencies and this is...

Latest Reply
SamGreene
Contributor II
  • 0 kudos

While re-implementing my pipeline to publish to dev/test/prod instead of bronze/silver/gold, I think I found the answer.  The downstream tables need to use the LIVE schema. 

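The point in the accepted answer is that the pipeline graph only draws an edge when a downstream table reads its upstream through the LIVE schema. A minimal sketch of what that looks like in a Python pipeline file (the table names and path are hypothetical; `dlt` itself is only importable inside a running pipeline, so it is passed in here):

```python
def define_pipeline(dlt, spark):
    """Sketch of a two-table DLT pipeline with an explicit dependency."""

    @dlt.table
    def bronze():
        # Load raw data from a UC external volume (placeholder path).
        return spark.read.load("/Volumes/my_catalog/my_schema/raw")

    @dlt.table
    def silver_a():
        # Reading LIVE.bronze (not a direct path) is what makes the
        # pipeline graph detect the bronze -> silver_a dependency.
        return spark.table("LIVE.bronze").where("record_type = 'A'")
```

In SQL pipelines the equivalent is selecting `FROM LIVE.bronze` in the downstream table definition.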
SamGreene
by Contributor II
  • 2371 Views
  • 1 replies
  • 1 kudos

Unpivoting data in live tables

I am loading data from CSV into live tables. I have a live delta table with data like this: WaterMeterID, ReadingDateTime1, ReadingValue1, ReadingDateTime2, ReadingValue2. It needs to be unpivoted into this: WaterMeterID, ReadingDateTime1, ReadingValue1...

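The transformation being asked for is a classic wide-to-long unpivot: each (datetime, value) column pair becomes its own row keyed by the meter ID. A minimal plain-Python sketch of the row logic (column names taken from the post; in a live table you would express the same thing in SQL, e.g. with `stack()` or `UNPIVOT`):

```python
def unpivot(row: dict, id_col: str, pairs: list) -> list:
    """Turn one wide meter row into one narrow row per (datetime, value) pair."""
    out = []
    for dt_col, val_col in pairs:
        out.append({
            id_col: row[id_col],
            "ReadingDateTime": row[dt_col],
            "ReadingValue": row[val_col],
        })
    return out

wide = {
    "WaterMeterID": 42,
    "ReadingDateTime1": "2024-01-01T00:00",
    "ReadingValue1": 1.5,
    "ReadingDateTime2": "2024-01-02T00:00",
    "ReadingValue2": 2.5,
}
narrow = unpivot(
    wide,
    "WaterMeterID",
    [("ReadingDateTime1", "ReadingValue1"), ("ReadingDateTime2", "ReadingValue2")],
)
# narrow now holds two rows, one per reading
```

The Spark SQL equivalent would select `stack(2, ReadingDateTime1, ReadingValue1, ReadingDateTime2, ReadingValue2)` alongside `WaterMeterID`.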
leelee3000
by Databricks Employee
  • 2469 Views
  • 2 replies
  • 0 kudos

time travel with DLT

Needed some help with Time Travel with Delta Live Tables. We were trying to figure out if we can go in and alter the history on this table, and what would happen to data that we mass upload? By this we mean we have data from the past that we would ...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Delta Live Tables leverages Delta Lake, i.e. Delta tables. Delta tables, through transactions (e.g. insert, update, delete, merge, optimize), create versions of the table. Once a version is created it cannot be altered; it is immutable. Yo...

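So rather than rewriting history, the practical pattern is to inspect the version log and read an older snapshot. A minimal sketch, assuming a cluster with Delta available (`read_version` needs a live Spark session and is not invoked here):

```python
def history_sql(table: str) -> str:
    """SQL that lists a Delta table's version history (who/what/when)."""
    return f"DESCRIBE HISTORY {table}"

def read_version(spark, table: str, version: int):
    """Read an older immutable snapshot of a Delta table via time travel."""
    return spark.read.format("delta").option("versionAsOf", version).table(table)
```

Back-loaded historical data simply lands as new versions; time travel lets you compare the table before and after the mass upload.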
1 More Replies
Phani1
by Databricks MVP
  • 2262 Views
  • 1 replies
  • 0 kudos

Upgrade Spark version 3.2 to 3.4+

Hi Team, We would like to upgrade from Spark version 3.2 to 3.4+ (Databricks Runtime 10.4 to 12.2/13.3). We would like to understand how complex this upgrade is and what challenges we might face. What are the technical steps and precautions we need to ...

Phani1
by Databricks MVP
  • 1973 Views
  • 0 replies
  • 0 kudos

Customer Managed Keys in Databricks (AWS)

Hi Databricks Team, Could you please provide the detailed steps on how to enable customer managed keys in a Databricks (AWS) account? If there is a video on it, that would be a great help. Regards, Phanindra

RamanP9404
by New Contributor
  • 3947 Views
  • 0 replies
  • 0 kudos

Spark Streaming Issues while performing left join

Hi team, I'm stuck on a Spark Structured Streaming use case. Requirement: read two streaming data frames, perform a left join on them, and display the results. Issue: while performing a left join, the resultant data frame contains only rows where ther...

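Seeing only matched rows is the expected symptom when a stream-stream left join lacks watermarks and a time-range condition: Spark can only emit an unmatched left row once the watermark guarantees no future match can arrive. A sketch under those assumptions (the `id`/`ts` column names and the 10-minute bounds are hypothetical; it needs a live streaming session and is not invoked here):

```python
def joined_stream(left, right):
    """Stream-stream left join that can actually emit unmatched left rows.

    Both inputs are streaming DataFrames with `id` and event-time `ts`
    columns. Without the watermarks and the BETWEEN bound, Spark has no
    way to finalize non-matches, so only matched rows appear.
    """
    from pyspark.sql import functions as F  # lazy import; needs a cluster

    l = left.withWatermark("ts", "10 minutes").alias("l")
    r = right.withWatermark("ts", "10 minutes").alias("r")
    cond = F.expr(
        "l.id = r.id AND r.ts BETWEEN l.ts AND l.ts + interval 10 minutes"
    )
    return l.join(r, cond, "leftOuter")
```

Unmatched left rows then surface with NULL right-side columns, but only after the watermark passes the join window, so expect output latency of roughly the watermark delay.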
stackoftuts
by New Contributor
  • 935 Views
  • 0 replies
  • 0 kudos

AI uses

Delve into the transformative realm of AI applications, where innovation merges seamlessly with technology's limitless possibilities.Explore the multifaceted landscape of AI uses and its dynamic impact on diverse industries at StackOfTuts. 

dZegpi
by New Contributor II
  • 1636 Views
  • 1 replies
  • 0 kudos

Load GCP data to Databricks using R

I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into a R notebook in Databricks. I currently can see the datasets in BigQuery. The problem is that using the sparklyr package, I'm not ab...

