Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

pablobd
by Contributor II
  • 2415 Views
  • 1 reply
  • 0 kudos

Asset bundle build and deploy python wheel with versions

Hi all, I was able to deploy a wheel to the /Shared/ folder from a repository in GitLab with asset bundles. The databricks.yml looks something like this: artifacts:  default:    type: whl    build: poetry build    path: .  targets:    workspace:      h...

Latest Reply
pablobd
Contributor II
  • 0 kudos

Finally I decided to use AWS CodeArtifact and mirror PyPI, which I think is a bit cleaner. But your solution looks good too. Thanks!

leelee3000
by Databricks Employee
  • 1266 Views
  • 2 replies
  • 0 kudos

time travel with DLT

Needed some help with Time Travel with Delta Live Tables. We were trying to figure out if we can go in and alter the history on this table, and what would happen to data that we mass upload? By this we mean we have data from the past that we would ...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Delta Live Tables leverage Delta Lake, or Delta tables. Delta tables, through transactions (e.g. insert, update, delete, merge, optimize), create versions of said Delta table. Once a version is created it cannot be altered; it is immutable. Yo...

1 More Replies
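For readers landing on this thread, here is a minimal sketch of what Delta time travel looks like in practice. The table name live_db.events is made up, and spark is the session object a Databricks notebook provides; tables fully managed by a DLT pipeline are rewritten by the pipeline itself, so treat this purely as an illustration of the versioning behaviour described above.

```python
# Every committed transaction (insert, update, delete, merge, optimize)
# creates a new, immutable version of the Delta table.
history = spark.sql("DESCRIBE HISTORY live_db.events")
history.select("version", "timestamp", "operation").show(truncate=False)

# Query the table as it looked at an earlier version or point in time.
v3 = spark.sql("SELECT * FROM live_db.events VERSION AS OF 3")
jan = spark.sql("SELECT * FROM live_db.events TIMESTAMP AS OF '2024-01-01'")

# Roll the current state back to an earlier version; prior history is kept.
spark.sql("RESTORE TABLE live_db.events TO VERSION AS OF 3")
```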
Phani1
by Valued Contributor II
  • 1331 Views
  • 1 reply
  • 0 kudos

Upgrade Spark version 3.2 to 3.4+

Hi Team, we would like to upgrade from Spark version 3.2 to 3.4+ (Databricks Runtime 10.4 to 12.2/13.3). We would like to understand how complex this upgrade is and what challenges we might face. What are the technical steps and precautions we need to ...

Phani1
by Valued Contributor II
  • 1688 Views
  • 0 replies
  • 0 kudos

Customer Managed Keys in Databricks (AWS)

Hi Databricks Team, could you please provide the detailed steps on how to enable customer-managed keys in a Databricks (AWS) account? If there is a video on it, that would be very helpful. Regards, Phanindra

RamanP9404
by New Contributor
  • 1885 Views
  • 0 replies
  • 0 kudos

Spark Streaming Issues while performing left join

Hi team, I'm stuck on a Spark Structured Streaming use case. Requirement: to read two streaming data frames, perform a left join on them and display the results. Issue: while performing a left join, the resultant data frame contains only rows where ther...

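A common cause of this behaviour is missing watermarks and a missing time-range join condition: a stream-stream left outer join can only emit the unmatched left-side rows once the watermark guarantees that no matching right-side row can still arrive. A minimal sketch, with made-up table and column names, is below.

```python
from pyspark.sql.functions import expr

# Both sides need a watermark, and the join needs a time-range constraint;
# unmatched left rows are only emitted after the watermark passes the upper
# bound of the window, which is why a quick display shows matched rows only.
impressions = (
    spark.readStream.table("events.impressions")
    .withWatermark("impression_time", "10 minutes")
)
clicks = (
    spark.readStream.table("events.clicks")
    .withWatermark("click_time", "20 minutes")
)

joined = impressions.join(
    clicks,
    expr("""
        click_ad_id = impression_ad_id AND
        click_time BETWEEN impression_time AND impression_time + interval 1 hour
    """),
    "leftOuter",
)

(
    joined.writeStream
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/left_join_demo")
    .toTable("events.impressions_with_clicks")
)
```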
dZegpi
by New Contributor II
  • 1105 Views
  • 1 reply
  • 0 kudos

Load GCP data to Databricks using R

I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into an R notebook in Databricks. I currently can see the datasets in BigQuery. The problem is that using the sparklyr package, I'm not ab...

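As a point of reference, the BigQuery data source bundled with Databricks on GCP can be exercised directly from a notebook. A minimal Python sketch is below; the project, dataset and table names are placeholders.

```python
# Minimal sketch: read a BigQuery table through the bigquery data source
# available on Databricks GCP clusters. Names below are placeholders.
df = (
    spark.read.format("bigquery")
    .option("table", "my-gcp-project.analytics.events")
    .load()
)
df.printSchema()
df.limit(10).show()
```

From R, the same data source can usually be reached with sparklyr::spark_read_source(sc, name = "events", source = "bigquery", options = list(table = "my-gcp-project.analytics.events")), though the exact option names depend on the connector version, so treat this as a starting point rather than a verified recipe.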
vish93
by New Contributor II
  • 778 Views
  • 0 replies
  • 1 kudos

Best AI Art Generator

AI art generator uses artificial intelligence to create captivating artworks, redefining the boundaries of traditional creativity and enabling endless artistic possibilities.AI photo restoration is a groundbreaking technology that employs artificial ...

Phani1
by Valued Contributor II
  • 5374 Views
  • 0 replies
  • 0 kudos

Alter table

Hi Team, could you please suggest: do we have an alternate approach to alter the table, instead of creating a new table and copying the data, as part of the deployment?

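For what it's worth, Delta tables support a fair amount of in-place schema evolution through ALTER TABLE, which avoids the create-new-and-copy pattern for many deployment changes. A minimal sketch with placeholder table and column names:

```python
# Add and annotate columns in place; existing data files are not rewritten.
spark.sql("ALTER TABLE main.sales.orders ADD COLUMNS (discount_pct DOUBLE)")
spark.sql("ALTER TABLE main.sales.orders ALTER COLUMN discount_pct COMMENT 'percentage discount'")

# Renaming or dropping columns first requires Delta column mapping.
spark.sql("""
    ALTER TABLE main.sales.orders SET TBLPROPERTIES (
        'delta.minReaderVersion' = '2',
        'delta.minWriterVersion' = '5',
        'delta.columnMapping.mode' = 'name'
    )
""")
spark.sql("ALTER TABLE main.sales.orders RENAME COLUMN discount_pct TO discount_percent")
```

Incompatible type changes still generally require rewriting the data (for example via CREATE OR REPLACE TABLE ... AS SELECT), so whether ALTER TABLE alone is enough depends on the specific change.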
Xyrion
by New Contributor II
  • 1070 Views
  • 1 reply
  • 0 kudos

Constraint options usage

I am trying to use the constraint options NOT ENFORCED, DEFERRABLE, INITIALLY DEFERRED, and NORELY. However, it seems I am not able to use them successfully. When I try to use them with PRIMARY KEYs (not sure if it is possible), I am not able to enforce any key....

Latest Reply
Xyrion
New Contributor II
  • 0 kudos

BTW, the forum is bugged; I can't paste code.

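For context, PRIMARY KEY and FOREIGN KEY constraints on Databricks are informational: they document relationships but are not enforced at write time, and the NOT ENFORCED / DEFERRABLE / INITIALLY DEFERRED / NORELY keywords describe that informational behaviour rather than switching enforcement on. Only NOT NULL and CHECK constraints are actually enforced. A minimal sketch with placeholder names (the key constraint assumes a Unity Catalog table):

```python
# The key column must be NOT NULL before it can carry a primary key.
spark.sql("ALTER TABLE main.demo.users ALTER COLUMN user_id SET NOT NULL")

# Informational only: duplicate keys are NOT rejected by this constraint.
spark.sql("ALTER TABLE main.demo.users ADD CONSTRAINT users_pk PRIMARY KEY (user_id)")

# Enforced on every write: rows violating the predicate are rejected.
spark.sql("ALTER TABLE main.demo.users ADD CONSTRAINT valid_id CHECK (user_id > 0)")
```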
Borkadd
by New Contributor II
  • 1544 Views
  • 1 reply
  • 0 kudos

Multi Task Job creation through Pulumi

I am trying to create a multi-task Databricks Job in Azure Cloud with its own cluster. Although I was able to create a single-task job without any issues, the code to deploy the multi-task job fails due to the following cluster validation error: error:...

Latest Reply
Borkadd
New Contributor II
  • 0 kudos

Hello @Retired_mod, thanks for your answer, but the problem remains the same. I had already tested different cluster configurations, single-node and multi-node, including those cluster configurations which worked with single-task jobs, but the err...

Aria
by New Contributor III
  • 7892 Views
  • 2 replies
  • 2 kudos

Databricks Asset bundle

Hi, I am new to Databricks. We are trying to use Databricks Asset Bundles for code deployment. I have spent a lot of time, but still so many things are not clear to me. Can we change the target path of the notebooks deployed from /shared/.bundle/* to so...

Latest Reply
Еmil
New Contributor III
  • 2 kudos

Hi @Retired_mod, thank you for your post. I thought it would solve my issues too; however, after reading your suggestion, it was nothing new for me, because I have already done exactly that. Here is what I have done so you or anyone can replicate it: 1. ...

1 More Replies
Phani1
by Valued Contributor II
  • 479 Views
  • 0 replies
  • 0 kudos

Auto Loader notebook for multiple tables

Hi Team, my requirement is: I have File A from source A which needs to be written into multiple Delta tables, i.e. DeltaTableA, DeltaTableB, DeltaTableC. Is it possible to have a single instance of an Auto Loader script (multiple write streams)? Could you p...

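For reference, this fan-out pattern is often handled with a single Auto Loader read stream and foreachBatch, so the source files are scanned once and each micro-batch is written to several Delta tables. A minimal sketch; the paths, filter column and table names are placeholders.

```python
# One Auto Loader source ...
source = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/main/raw/_schemas/file_a")
    .load("/Volumes/main/raw/landing/file_a/")
)

# ... written to several Delta tables per micro-batch. foreachBatch gives
# at-least-once semantics, so make these writes idempotent if that matters.
def write_to_targets(batch_df, batch_id):
    batch_df.write.mode("append").saveAsTable("main.bronze.delta_table_a")
    batch_df.filter("record_type = 'B'").write.mode("append").saveAsTable("main.bronze.delta_table_b")
    batch_df.filter("record_type = 'C'").write.mode("append").saveAsTable("main.bronze.delta_table_c")

(
    source.writeStream
    .foreachBatch(write_to_targets)
    .option("checkpointLocation", "/Volumes/main/raw/_checkpoints/file_a_fanout")
    .start()
)
```

The alternative is to start a separate writeStream (each with its own checkpoint) per target table, which keeps the streams independent at the cost of reading the source files once per stream.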
Phani1
by Valued Contributor II
  • 1976 Views
  • 0 replies
  • 0 kudos

Unity catalog Migration

Hi Team, could you please help me understand: 1) Why do we need to migrate to Unity Catalog? If we are not migrating, what benefits will we miss out on? 2) How do we migrate to Unity Catalog (what objects need to be migrated, and is there any tool)? Regards, Phanindra


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group