Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

dZegpi
by New Contributor II
  • 775 Views
  • 3 replies
  • 0 kudos

Load GCP data to Databricks using R

I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into an R notebook in Databricks. I can currently see the datasets in BigQuery. The problem is that using the sparklyr package, I'm not ab...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Our End-of-Year Community Survey is here! Please take a few moments to complete the survey. Your feedback matters!

2 More Replies
vish93
by New Contributor
  • 399 Views
  • 0 replies
  • 0 kudos

Best AI Art Generator

An AI art generator uses artificial intelligence to create captivating artworks, redefining the boundaries of traditional creativity and enabling endless artistic possibilities. AI photo restoration is a groundbreaking technology that employs artificial ...

Phani1
by Valued Contributor
  • 2840 Views
  • 1 reply
  • 0 kudos

Alter table

Hi Team, could you please suggest: do we have an alternate approach to alter a table as part of the deployment, instead of creating a new table and copying the data?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Phani1, When deploying changes to a table in Azure Databricks, you can use an alternate approach to alter the table without creating a new one and copying data. Here are some options: ALTER TABLE Statement: The ALTER TABLE statement allows you to...
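For illustration, a minimal sketch of the in-place approach run from a notebook cell; the table, column, and property names here are hypothetical, not from the original thread:

# Evolve an existing Delta table in place instead of recreating it and copying data.
spark.sql("ALTER TABLE sales.orders ADD COLUMNS (discount_pct DOUBLE COMMENT 'added in this release')")
spark.sql("ALTER TABLE sales.orders ALTER COLUMN discount_pct COMMENT 'discount applied at checkout'")
spark.sql("ALTER TABLE sales.orders SET TBLPROPERTIES ('delta.autoOptimize.optimizeWrite' = 'true')")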

Xyrion
by New Contributor II
  • 793 Views
  • 1 reply
  • 0 kudos

Constraint options usage

I am trying to use the constraint options NOT ENFORCED, DEFERRABLE, INITIALLY DEFERRED, and NORELY. However, it seems I am not able to use them successfully. When I try to use them with PRIMARY KEYs (not sure if it is possible), I am not able to enforce any key....
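For context, a small sketch of the constraint syntax Delta tables in Unity Catalog accept; the catalog, schema, and table names are hypothetical, and the exact options accepted can vary by runtime version. Primary key constraints on Databricks are informational only: they carry the NOT ENFORCED / NORELY style options but are not enforced on write, unlike CHECK and NOT NULL constraints.

# Primary keys are declared but informational; CHECK constraints are enforced on write.
spark.sql("CREATE TABLE main.demo.customers (id BIGINT NOT NULL, name STRING)")
spark.sql("""
    ALTER TABLE main.demo.customers
    ADD CONSTRAINT customers_pk PRIMARY KEY (id) NOT ENFORCED DEFERRABLE INITIALLY DEFERRED NORELY
""")
spark.sql("ALTER TABLE main.demo.customers ADD CONSTRAINT id_positive CHECK (id > 0)")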

Latest Reply
Xyrion
New Contributor II
  • 0 kudos

BTW, the forum is bugged; I can't paste code...

Borkadd
by New Contributor II
  • 1030 Views
  • 2 replies
  • 0 kudos

Multi Task Job creation through Pulumi

I am trying to create a multi-task Databricks Job in Azure Cloud with its own cluster. Although I was able to create a single-task job without any issues, the code to deploy the multi-task job fails due to the following cluster validation error: error:...
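For reference, a minimal sketch of how a multi-task job sharing one job cluster might be declared with the pulumi-databricks Python provider; the Spark version, node type, and notebook paths below are placeholders, and the argument names follow that provider's SDK rather than anything from the original post:

import pulumi_databricks as databricks

# One job cluster shared by both tasks; each task references it by key.
job = databricks.Job(
    "multi-task-job",
    name="multi-task-job",
    job_clusters=[databricks.JobJobClusterArgs(
        job_cluster_key="shared-cluster",
        new_cluster=databricks.JobJobClusterNewClusterArgs(
            spark_version="13.3.x-scala2.12",
            node_type_id="Standard_DS3_v2",
            num_workers=2,
        ),
    )],
    tasks=[
        databricks.JobTaskArgs(
            task_key="ingest",
            job_cluster_key="shared-cluster",
            notebook_task=databricks.JobTaskNotebookTaskArgs(notebook_path="/Repos/project/ingest"),
        ),
        databricks.JobTaskArgs(
            task_key="transform",
            job_cluster_key="shared-cluster",
            notebook_task=databricks.JobTaskNotebookTaskArgs(notebook_path="/Repos/project/transform"),
        ),
    ],
)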

Latest Reply
Borkadd
New Contributor II
  • 0 kudos

Hello @Kaniz_Fatma, thanks for your answer, but the problem stays the same. I had already tested with different cluster configurations, single-node and multi-node, including the cluster configurations that worked with single-task jobs, but the err...

1 More Replies
Aria
by New Contributor III
  • 4229 Views
  • 3 replies
  • 3 kudos

Databricks Asset bundle

Hi, I am new to Databricks. We are trying to use Databricks Asset Bundles for code deployment. I have spent a lot of time, but still so many things are not clear to me. Can we change the target path of the notebooks deployed from /shared/.bundle/* to so...

Latest Reply
Еmil
New Contributor III
  • 3 kudos

Hi @Kaniz_Fatma, thank you for your post. I thought it would solve my issues too; however, after reading your suggestion, it was nothing new for me, because I have already done exactly that. Here is what I have done so you or anyone can replicate it: 1. ...

2 More Replies
MFrandsen
by New Contributor
  • 660 Views
  • 1 reply
  • 0 kudos

Question for exam project

For my exam I have to do a small project for the company I'm interning at. I am creating a data warehouse where I will have to transfer data from another database and then transform it into a star schema. Would Databricks be good for this, or is it t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @MFrandsen, Let’s delve into the world of data warehousing and star schemas. Databricks is indeed a powerful platform, and it can be a great choice for your project. Here’s why: Star Schema: A star schema is a multi-dimensional data model used...

Phani1
by Valued Contributor
  • 340 Views
  • 0 replies
  • 0 kudos

Auto Loader notebook for multiple tables

Hi Team, my requirement is: I have File A from source A which needs to be written into multiple Delta tables, i.e. DeltaTableA, DeltaTableB, DeltaTableC. Is it possible to have a single instance of an Auto Loader script (multiple write streams)? Could you p...
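As a starting point, a minimal sketch of one Auto Loader stream fanning out to several Delta tables via foreachBatch, assuming the spark session a Databricks notebook provides; the source path, schema location, checkpoint location, and table names are placeholders:

# Single cloudFiles stream; each micro-batch is appended to every target table.
def write_to_targets(batch_df, batch_id):
    batch_df.persist()  # avoid recomputing the micro-batch for every target
    for table in ["DeltaTableA", "DeltaTableB", "DeltaTableC"]:
        batch_df.write.format("delta").mode("append").saveAsTable(table)
    batch_df.unpersist()

(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/file_a")
    .load("/mnt/source_a/")
    .writeStream
    .foreachBatch(write_to_targets)
    .option("checkpointLocation", "/tmp/checkpoints/file_a")
    .start())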

Phani1
by Valued Contributor
  • 1559 Views
  • 1 reply
  • 1 kudos

Unity catalog Migration

Hi Team, could you please help me to understand: 1) Why do we need to migrate to Unity Catalog? If we are not migrating, what benefits will we miss? 2) How do we migrate to Unity Catalog (what objects need to be migrated, and is there any tool)? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Phani1, Good Question! Why Migrate to Unity Catalog? Unity Catalog is a unified governance solution for Databricks workspaces. Without it, each Databricks workspace connects to a Hive metastore and maintains a separate service for Table Acces...

Sreekanth_N
by New Contributor II
  • 1545 Views
  • 3 replies
  • 0 kudos

'NotebookHandler' object has no attribute 'setContext' in pyspark streaming in AWS

I am facing an issue while calling dbutils.notebook.run() inside PySpark streaming with a concurrent executor. At first the error is "pyspark.sql.utils.IllegalArgumentException: Context not valid. If you are calling this outside the main thread, you mus...

Latest Reply
Kevin3
New Contributor III
  • 0 kudos

The error message you're encountering in PySpark when using dbutils.notebook.run() suggests that the context in which you are attempting to call the run() method is not valid. PySpark notebooks in Databricks have certain requirements when it comes to...
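One workaround often shared in the community (not an official API; the internal helper calls below go through the Py4J bridge exposed by dbutils and may change between runtime versions): capture the notebook context on the driver's main thread and re-attach it inside each worker thread before calling dbutils.notebook.run(). The child notebook paths here are hypothetical, and dbutils is assumed to be the object available in a Databricks notebook.

from concurrent.futures import ThreadPoolExecutor

# Capture the notebook context on the main thread.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

def run_child(path):
    # Re-attach the captured context in this worker thread, then run the child notebook.
    dbutils.notebook.entry_point.getDbutils().notebook().setContext(ctx)
    return dbutils.notebook.run(path, 600)

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_child, ["./child_a", "./child_b"]))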

2 More Replies
anandreddy23
by New Contributor II
  • 2236 Views
  • 2 replies
  • 1 kudos

unpersist doesn't clear

from pyspark.sql import SparkSession
from pyspark import SparkContext, SparkConf
from pyspark.storagelevel import StorageLevel

spark = SparkSession.builder.appName('TEST').config('spark.ui.port','4098').enableHiveSupport().getOrCreate()
df4 = spark.sql('...
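For comparison, a minimal persist/unpersist sketch (the DataFrame contents are placeholders): unpersist(blocking=True) waits until the cached blocks are actually dropped, and spark.catalog.clearCache() removes everything cached in the session.

from pyspark.sql import SparkSession
from pyspark.storagelevel import StorageLevel

spark = SparkSession.builder.appName('TEST').getOrCreate()

df = spark.range(1_000_000)
df.persist(StorageLevel.MEMORY_AND_DISK)
df.count()                    # materialize the cache
df.unpersist(blocking=True)   # synchronously release the cached blocks
spark.catalog.clearCache()    # or drop every cached table/DataFrame in the session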

Latest Reply
anandreddy23
New Contributor II
  • 1 kudos

Thank you so much for taking the time and explaining the concepts.

1 More Replies
doublesteakhous
by New Contributor
  • 607 Views
  • 1 reply
  • 0 kudos

SQL UDFs not visible in notebooks

We are using a serverless SQL warehouse and managed tables in Unity Catalog in Azure Databricks. When using the designated catalog tab, I can see and filter for functions, but when I am developing in a notebook, there are only tables and views visibl...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @doublesteakhous, When working with Azure Databricks Notebooks, you might notice that the smaller catalog view to the side only displays tables and views, but not functions. However, there is a way to access and explore functions within your noteb...
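For instance, a quick sketch of listing and inspecting SQL UDFs from a notebook cell when the side catalog browser only shows tables and views; the catalog, schema, and function names are hypothetical:

# List user-defined functions in a schema and show one function's definition.
spark.sql("SHOW USER FUNCTIONS IN my_catalog.my_schema").show(truncate=False)
spark.sql("DESCRIBE FUNCTION EXTENDED my_catalog.my_schema.my_udf").show(truncate=False)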

valjas
by New Contributor III
  • 2927 Views
  • 3 replies
  • 1 kudos

Clusters are really slow

We have two environments for our Azure Databricks: Dev and Prod. Clusters were created and tested in the Dev environment, then exported to the Prod environment through APIs. The clusters in Dev are performing as expected, whereas the cluster...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @valjas, Workspace Creation and Cluster Performance: Actions taken during the creation of a workspace can indeed impact cluster performance. When setting up a workspace, consider the following factors: Configuration Settings: Ensure that the wor...

2 More Replies