Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Forum Posts

invalidargument
by New Contributor III
  • 1614 Views
  • 1 reply
  • 2 kudos

Create new workbooks with code

Is it possible to create new notebooks from a notebook in Databricks? I have tried this code, but all of them are generic files, not notebooks. notebook_str = """# Databricks notebook source import pyspark.sql.functions as F import numpy as np # CO...

Latest Reply
invalidargument
New Contributor III
  • 2 kudos

Unfortunately %run does not help me, since I can't %run a .py file. I still need my code in notebooks. I am transpiling proprietary code to Python using Jinja templates. I would like to have the output as notebooks, since those are the most convenient to edit...

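A sketch of one way to do what the thread asks: rather than writing a plain file, push the generated source through the Databricks Workspace Import API, which registers it as a notebook when `format` is `SOURCE` and a language is given. The host, token, and target path below are placeholders, not values from the thread.

```python
# Sketch: create a real notebook (not a generic file) from inside another
# notebook by POSTing the generated source to the Workspace Import API.
# Host, token, and target path are placeholders for illustration.
import base64
import json
import urllib.request

def build_import_payload(path: str, source: str, language: str = "PYTHON") -> dict:
    """Build the JSON body for POST /api/2.0/workspace/import."""
    return {
        "path": path,
        "format": "SOURCE",  # SOURCE + a language imports the file as a notebook
        "language": language,
        "overwrite": True,
        "content": base64.b64encode(source.encode("utf-8")).decode("ascii"),
    }

def import_notebook(host: str, token: str, payload: dict) -> None:
    """Send the import request to the workspace (needs real credentials)."""
    req = urllib.request.Request(
        f"{host}/api/2.0/workspace/import",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)

notebook_src = "# Databricks notebook source\nimport pyspark.sql.functions as F\n"
payload = build_import_payload("/Users/someone@example.com/generated_nb", notebook_src)
# import_notebook("https://<workspace-host>", "<token>", payload)  # requires a real workspace
```

Keeping the `# Databricks notebook source` marker as the first line of the generated source is what lets the platform treat the import as notebook source rather than an arbitrary file.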
SamGreene
by Contributor II
  • 2442 Views
  • 1 reply
  • 0 kudos

Resolved! DLT Pipeline Graph is not detecting dependencies

Hi, this is my first Databricks project. I am loading data from a UC external volume in ADLS into tables and then splitting one of the tables into two tables based on a column. When I create a pipeline, the tables don't have any dependencies, and this is...

Latest Reply
SamGreene
Contributor II
  • 0 kudos

While re-implementing my pipeline to publish to dev/test/prod instead of bronze/silver/gold, I think I found the answer.  The downstream tables need to use the LIVE schema. 

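For readers hitting the same issue: the fix described in the reply, reading upstream tables through the LIVE schema, looks roughly like this in DLT SQL. The table, volume path, and column names here are invented for illustration.

```sql
CREATE OR REFRESH LIVE TABLE meters_raw
AS SELECT * FROM cloud_files('/Volumes/my_catalog/my_schema/landing', 'csv');

-- Referencing LIVE.meters_raw (rather than a fully qualified table name)
-- is what lets the pipeline graph detect the dependency between the tables.
CREATE OR REFRESH LIVE TABLE meters_type_a
AS SELECT * FROM LIVE.meters_raw WHERE meter_type = 'A';

CREATE OR REFRESH LIVE TABLE meters_type_b
AS SELECT * FROM LIVE.meters_raw WHERE meter_type = 'B';
```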
SamGreene
by Contributor II
  • 2211 Views
  • 1 reply
  • 1 kudos

Unpivoting data in live tables

I am loading data from CSV into live tables. I have a live Delta table with data like this: WaterMeterID, ReadingDateTime1, ReadingValue1, ReadingDateTime2, ReadingValue2. It needs to be unpivoted into this: WaterMeterID, ReadingDateTime1, ReadingValue1...

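One way to express this kind of unpivot in a DLT SQL definition is Spark's stack() generator. The column names below follow the question; the source table name is assumed.

```sql
CREATE OR REFRESH LIVE TABLE readings_unpivoted AS
SELECT WaterMeterID, ReadingDateTime, ReadingValue
FROM (
  SELECT WaterMeterID,
         -- stack(n, ...) turns n (datetime, value) pairs into n rows
         stack(2,
               ReadingDateTime1, ReadingValue1,
               ReadingDateTime2, ReadingValue2) AS (ReadingDateTime, ReadingValue)
  FROM LIVE.readings_raw
);
```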
leelee3000
by Databricks Employee
  • 2243 Views
  • 2 replies
  • 0 kudos

time travel with DLT

Needed some help with time travel with Delta Live Tables. We were trying to figure out whether we can go in and alter the history on this table, and what would happen to data that we mass upload. By this we mean we have data from the past that we would ...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Delta Live Tables are built on Delta Lake, i.e. Delta tables. Delta tables, through transactions (e.g. insert, update, delete, merge, optimize), create versions of the table. Once a version is created it cannot be altered; it is immutable. Yo...

1 More Reply
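As the reply notes, versions are immutable; what Delta does offer is reading and restoring earlier versions. A short SQL sketch (the table name is a placeholder):

```sql
-- Inspect the version history of a Delta table
DESCRIBE HISTORY my_catalog.my_schema.my_table;

-- Query the table as of an earlier version (read-only; history is unchanged)
SELECT * FROM my_catalog.my_schema.my_table VERSION AS OF 12;

-- Roll the current state back to an earlier version; this appends a new
-- version rather than rewriting or deleting the old ones
RESTORE TABLE my_catalog.my_schema.my_table TO VERSION AS OF 12;
```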
Phani1
by Valued Contributor II
  • 2078 Views
  • 1 reply
  • 0 kudos

Upgrade Spark version 3.2 to 3.4+

Hi team, we would like to upgrade from Spark 3.2 to 3.4+ (Databricks Runtime 10.4 to 12.2/13.3). We would like to understand how complex this upgrade is and what challenges we might face. What are the technical steps and precautions we need to ...

Phani1
by Valued Contributor II
  • 1917 Views
  • 0 replies
  • 0 kudos

Customer Managed Keys in Databricks (AWS)

Hi Databricks team, could you please provide detailed steps on how to enable customer-managed keys in a Databricks (AWS) account? If there is a video on it, that would be very helpful. Regards, Phanindra

RamanP9404
by New Contributor
  • 3809 Views
  • 0 replies
  • 0 kudos

Spark Streaming Issues while performing left join

Hi team, I'm stuck on a Spark Structured Streaming use case. Requirement: read two streaming data frames, perform a left join on them, and display the results. Issue: while performing a left join, the resultant data frame contains only rows where ther...

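A common cause of this symptom: a stream-stream left outer join only emits unmatched left rows once watermarks on both inputs, plus a time-range join condition, let the engine decide that a match can no longer arrive; without them the output degenerates to inner-join behavior. A PySpark sketch, with invented table and column names, that assumes a running Spark/Databricks session:

```python
# Sketch (assumed schemas/column names): watermark both sides and bound the
# join in event time so unmatched left rows can be emitted with NULLs once
# the watermark passes, instead of being held back forever.
from pyspark.sql import functions as F

impressions = (
    spark.readStream.table("impressions")
    .withWatermark("impressionTime", "10 minutes")
)
clicks = (
    spark.readStream.table("clicks")
    .withWatermark("clickTime", "20 minutes")
)

joined = impressions.join(
    clicks,
    F.expr("""
        adId = clickAdId AND
        clickTime >= impressionTime AND
        clickTime <= impressionTime + interval 1 hour
    """),
    "leftOuter",  # unmatched impressions emit after the watermark passes
)
```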
stackoftuts
by New Contributor
  • 853 Views
  • 0 replies
  • 0 kudos

AI uses

Delve into the transformative realm of AI applications, where innovation merges seamlessly with technology's limitless possibilities. Explore the multifaceted landscape of AI uses and its dynamic impact on diverse industries at StackOfTuts.

dZegpi
by New Contributor II
  • 1518 Views
  • 1 reply
  • 0 kudos

Load GCP data to Databricks using R

I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into an R notebook in Databricks. I can currently see the datasets in BigQuery. The problem is that using the sparklyr package, I'm not ab...

vish93
by New Contributor II
  • 2332 Views
  • 0 replies
  • 1 kudos

Best AI Art Generator

An AI art generator uses artificial intelligence to create captivating artworks, redefining the boundaries of traditional creativity and enabling endless artistic possibilities. AI photo restoration is a groundbreaking technology that employs artificial ...

Phani1
by Valued Contributor II
  • 6245 Views
  • 0 replies
  • 0 kudos

Alter table

Hi team, could you please suggest: do we have an alternative approach to altering the table, instead of creating a new table and copying the data, as part of the deployment?

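Delta tables support many schema changes in place via ALTER TABLE, so a drop-and-recopy is often unnecessary. A sketch with placeholder table and column names; note that renaming or dropping columns requires column mapping to be enabled first:

```sql
-- Add a column in place; no data copy needed
ALTER TABLE my_schema.events ADD COLUMNS (ingest_source STRING);

-- Enable column mapping so rename/drop operations become possible
ALTER TABLE my_schema.events SET TBLPROPERTIES (
  'delta.minReaderVersion' = '2',
  'delta.minWriterVersion' = '5',
  'delta.columnMapping.mode' = 'name'
);

ALTER TABLE my_schema.events RENAME COLUMN ingest_source TO source_system;
```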
patojo94
by New Contributor II
  • 3939 Views
  • 1 reply
  • 2 kudos

Stream failure JsonParseException

Hi all! I am having the following issue with a couple of PySpark streams. I have some notebooks, each running an independent file-based structured stream using a Delta bronze table (gzip parquet files) dumped from Kinesis to S3 in a previous job....

Labels: Get Started Discussions, Photon, streaming aggregations
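Without the full stack trace this is only a guess, but one common pattern for keeping a stream alive past malformed JSON: parse with from_json(), which yields NULL for rows it cannot parse, and route those rows to a quarantine table instead of letting a JsonParseException fail the query. The table, column names, and schema below are invented, and the sketch assumes a running Spark session:

```python
# Sketch (assumed names/schema): split parseable and unparseable JSON rows
# so bad payloads are quarantined rather than crashing the stream.
from pyspark.sql import functions as F

schema = "id STRING, ts TIMESTAMP, amount DOUBLE"

bronze = spark.readStream.table("bronze_events")
parsed = bronze.withColumn("payload", F.from_json(F.col("raw_json"), schema))

good = parsed.where(F.col("payload").isNotNull()).select("payload.*")
bad = parsed.where(F.col("payload").isNull()).select("raw_json")  # quarantine
```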
