Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Pratikmsbsvm
by Contributor
  • 1313 Views
  • 2 replies
  • 0 kudos

Migration of Power BI reports from Synapse to Databricks SQL (DBSQL)

We have 250 Power BI reports built on top of Azure Synapse, and we are now migrating from Azure Synapse to Databricks SQL (DBSQL). How should we plan the cutover, and what should the Power BI strategy be? I am just seeking high-level points we have to take care of in planning. Any techie ...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

While your account Solution Architect (SA) will be able to guide you, if you still want to check what peers did, see here https://community.databricks.com/t5/warehousing-analytics/migrate-azure-synapse-analytics-data-to-databricks/td-p/90663 and here http...

1 More Replies
NIK251
by New Contributor III
  • 2320 Views
  • 3 replies
  • 1 kudos

Resolved! Delta Live Table Pipeline

I get the following error message when trying to create a Delta Live Tables pipeline. My error is: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster 1207-112912-8e84v9h5: Encountered Quota Exhaustion issue in ...

Latest Reply
NIK251
New Contributor III
  • 1 kudos

Thanks sir, I solved it.

2 More Replies
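For the quota-exhaustion error above, the usual mitigations are raising the cloud vCPU quota or shrinking the pipeline's cluster footprint. A minimal sketch of the second option via the Pipelines REST API; the workspace URL, token, pipeline ID, and node type below are placeholders to adapt, not values from the thread:

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapi..."                                             # placeholder PAT
PIPELINE_ID = "<pipeline-id>"                                 # placeholder

# Fetch the current pipeline spec, then cap the cluster size so it fits the
# remaining vCPU quota (smaller VM family, tighter autoscale range).
spec = requests.get(
    f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
).json()["spec"]

spec["clusters"] = [{
    "label": "default",
    "node_type_id": "Standard_DS3_v2",                 # example smaller node type (assumption)
    "autoscale": {"min_workers": 1, "max_workers": 2},
}]

requests.put(
    f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=spec,
).raise_for_status()
```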
DebIT2011
by New Contributor III
  • 9443 Views
  • 4 replies
  • 9 kudos

Choosing between Azure Data Factory (ADF) and Databricks PySpark notebooks

I’m working on a project where I need to pull large datasets from Cosmos DB into Databricks for further processing, and I’m trying to decide whether to use Azure Data Factory (ADF) or Databricks PySpark notebooks for the extraction and processing tas...

Latest Reply
Johns404
New Contributor II
  • 9 kudos

Hi @DebIT2011, you're facing a classic architectural decision between orchestration with ADF versus direct transformation using Databricks PySpark notebooks. Both tools are powerful but serve different purposes depending on your project needs. Below i...

3 More Replies
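If the extraction side of the question above lands in Databricks, reading Cosmos DB directly from PySpark is straightforward with the Azure Cosmos DB Spark connector. A minimal sketch, assuming the connector is installed on the cluster; the account endpoint, secret scope, database, container, and target table names are placeholders:

```python
# Read a Cosmos DB container into a Spark DataFrame via the Azure Cosmos DB
# Spark connector ("cosmos.oltp" source). All connection values are placeholders.
cosmos_config = {
    "spark.cosmos.accountEndpoint": "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": dbutils.secrets.get("my-scope", "cosmos-key"),  # hypothetical secret scope
    "spark.cosmos.database": "<database>",
    "spark.cosmos.container": "<container>",
}

df = (
    spark.read.format("cosmos.oltp")
    .options(**cosmos_config)
    .load()
)

# Land the raw extract as a Delta table for downstream processing.
df.write.mode("overwrite").saveAsTable("bronze.cosmos_extract")
```

ADF can still own scheduling and orchestration while a notebook like this does the heavy extraction and transformation work.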
makerandcoder12
by New Contributor
  • 766 Views
  • 1 reply
  • 0 kudos

How can I leverage Databricks for building end-to-end machine learning pipelines?

I’ve been following practical tutorials on makerandcoder, which often showcase hands-on machine learning projects using Python, scikit-learn, and Spark. I’m looking to scale my projects using the Databricks platform for better collaboration, data han...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Databricks enables the creation of scalable, end-to-end machine learning (ML) pipelines by providing a comprehensive and collaborative platform that integrates key components for data handling, experimentation, and model deployment. Here’s how Databr...

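As a concrete starting point for the pipeline discussed above, here is a minimal sketch of tracking a scikit-learn model with MLflow on Databricks; the table name, feature columns, and run name are made up for illustration:

```python
import mlflow
import mlflow.sklearn
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Load training data from a Delta table (hypothetical table and "label" column).
pdf = spark.table("ml.training_data").toPandas()
X_train, X_test, y_train, y_test = train_test_split(
    pdf.drop("label", axis=1), pdf["label"], test_size=0.2, random_state=42
)

with mlflow.start_run(run_name="rf_baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    # Log parameters, metrics, and the model artifact so the run can later be
    # registered in the Model Registry and deployed for serving.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, artifact_path="model")
```

From there, Jobs or Delta Live Tables can orchestrate the data-preparation and training steps end to end.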
rafal_walisko
by New Contributor II
  • 2352 Views
  • 1 reply
  • 0 kudos

Optimal Strategies for downloading large query results with Databricks API

Hi everyone, I'm currently facing an issue with handling a large amount of data using the Databricks API. Specifically, I have a query that returns a significant volume of data, sometimes resulting in over 200 chunks. My initial approach was to retriev...

Latest Reply
Datagyan
New Contributor II
  • 0 kudos

I am also facing the same issue. One approach I will try tomorrow: create a job that uses a serverless job cluster. Then, whenever a user clicks the download button in the UI, it triggers the job, and the job will read the table as dat...

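One workable pattern for the chunked download described above is the SQL Statement Execution API with the EXTERNAL_LINKS disposition, then walking the chunk index. A simplified sketch with placeholder host, token, warehouse ID, and table name; polling and retry handling are deliberately minimal:

```python
import time
import requests

HOST = "https://<workspace>.cloud.databricks.com"   # placeholder
HEADERS = {"Authorization": "Bearer <token>"}        # placeholder

# Submit the statement; EXTERNAL_LINKS returns presigned URLs per result chunk.
resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers=HEADERS,
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "SELECT * FROM my_catalog.my_schema.big_table",  # hypothetical table
        "disposition": "EXTERNAL_LINKS",
        "format": "CSV",
        "wait_timeout": "0s",
    },
).json()
statement_id = resp["statement_id"]

# Poll until the statement finishes (add backoff and error handling in real use).
while resp["status"]["state"] in ("PENDING", "RUNNING"):
    time.sleep(5)
    resp = requests.get(f"{HOST}/api/2.0/sql/statements/{statement_id}", headers=HEADERS).json()

# Walk the chunks and stream each presigned link to a local file.
chunk_index = 0
while chunk_index is not None:
    chunk = requests.get(
        f"{HOST}/api/2.0/sql/statements/{statement_id}/result/chunks/{chunk_index}",
        headers=HEADERS,
    ).json()
    for link in chunk["external_links"]:
        data = requests.get(link["external_link"]).content  # presigned URL, no auth header needed
        with open(f"chunk_{chunk_index}.csv", "wb") as f:
            f.write(data)
    chunk_index = chunk["external_links"][0].get("next_chunk_index")
```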
arnas
by New Contributor II
  • 852 Views
  • 3 replies
  • 0 kudos

S3 limited bucket permissions

Hi, can I run Databricks on a limited/restricted S3 bucket folder, with no access to the bucket root level, since access is restricted per project folder in IAM? i.e. s3://mybucket/myproject_abc/. I have configured all permissions as per the documentation https://docs.databricks....

Latest Reply
arnas
New Contributor II
  • 0 kudos

Thanks, but no thanks, spam resides in JUNK folder

2 More Replies
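A prefix-restricted instance profile generally works, provided s3:ListBucket is allowed with a prefix condition and the object actions are scoped to that prefix. A quick sanity check from a notebook, using the bucket and prefix from the post (the table path is hypothetical):

```python
# Verify the cluster's instance profile can see the project prefix.
# Listing needs s3:ListBucket (with a prefix condition on the bucket);
# reading objects needs s3:GetObject on mybucket/myproject_abc/*.
display(dbutils.fs.ls("s3://mybucket/myproject_abc/"))

# Hypothetical Delta table under the allowed prefix.
df = spark.read.format("delta").load("s3://mybucket/myproject_abc/some_table")
df.limit(5).show()
```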
tts
by New Contributor III
  • 3217 Views
  • 8 replies
  • 0 kudos

Resolved! Programmatic selection of serverless compute for notebooks environment version

Hello, I have a case where I am executing notebooks from an external system using the Databricks API /api/2.2/jobs/runs/submit. This has always been unproblematic with job compute, but due to the quite recent serverless for notebooks support being i...

Latest Reply
GerardW
New Contributor II
  • 0 kudos

@tts, did you manage to get this resolved? Battling the same issue here and cannot find a way to make it work. Even though the default API is 2 now, it still seems to randomly assign an API when deploying via DAB. @JakubSkibicki, tried this too but hitt...

7 More Replies
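For pinning the serverless environment version from an external submit call, the Jobs API exposes an environments list that serverless tasks reference by environment_key, with the "client" field selecting the environment version. A hedged sketch of such a payload, assuming runs/submit accepts the same environments block as jobs/create (verify against the current Jobs API reference before relying on it); host, token, and notebook path are placeholders:

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
HEADERS = {"Authorization": "Bearer <token>"}       # placeholder

payload = {
    "run_name": "serverless-notebook-run",
    # Environment definition for serverless compute; "client" is the
    # environment version (e.g. "2"), "dependencies" are optional pip packages.
    "environments": [
        {
            "environment_key": "default",
            "spec": {"client": "2", "dependencies": ["pandas==2.2.2"]},
        }
    ],
    "tasks": [
        {
            "task_key": "run_notebook",
            "notebook_task": {"notebook_path": "/Workspace/Users/me@example.com/my_notebook"},
            # No cluster spec -> serverless; bind the task to the environment above.
            "environment_key": "default",
        }
    ],
}

run = requests.post(f"{HOST}/api/2.2/jobs/runs/submit", headers=HEADERS, json=payload).json()
print(run["run_id"])
```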
MOUNIKASIMHADRI
by New Contributor
  • 17977 Views
  • 6 replies
  • 1 kudos

Insufficient Permissions Issue on Databricks

I have encountered a technical issue on Databricks. While executing commands in both Spark and SQL within the Databricks environment, I've run into permission-related errors when selecting files from DBFS: "org.apache.spark.SparkSecurityException: [IN...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Please refer to some of the other community articles on the "no module" error, e.g. https://community.databricks.com/t5/data-engineering/udf-importing-from-other-modules/td-p/58988

5 More Replies
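The INSUFFICIENT_PERMISSIONS error on direct DBFS file reads often shows up on clusters that enforce table access control or Unity Catalog shared access mode, where file access needs an explicit privilege. A minimal sketch of the legacy table-ACL grant an admin can issue, assuming that privilege model applies to your workspace; the user name is a placeholder:

```python
# Run as a workspace admin on a cluster with table access control. Grants the
# legacy "ANY FILE" privilege so the user can read files (e.g. from DBFS) directly.
spark.sql("GRANT SELECT ON ANY FILE TO `user@example.com`")

# On Unity Catalog, prefer governed locations instead: put the data in a Volume
# (e.g. /Volumes/<catalog>/<schema>/<volume>/file.csv) and grant READ VOLUME.
```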
Fz1
by New Contributor III
  • 4638 Views
  • 4 replies
  • 1 kudos

DLT Pipeline unable to find custom Libraries/Wheel packages

We have our DLT pipeline, and we need to import our custom libraries packaged as wheel files. We are on Azure Databricks and are using Azure DevOps CI/CD to build and deploy the wheel packages to our Databricks environment. At the top of our DLT notebook we are impo...

Labels: dbfs, dlt, Libraries, python, wheel
Latest Reply
Laurence_Fishbu
New Contributor II
  • 1 kudos

You might want to verify the file path and permissions within your CI/CD process—sometimes the context in which the pipeline runs lacks proper DBFS mount visibility. We've encountered similar visibility inconsistencies while working on data aggregati...

3 More Replies
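A pattern that usually works for the wheel problem above is installing the package with %pip at the top of the DLT notebook, from a path the pipeline's run-as identity can actually read (a Unity Catalog Volume or workspace file rather than a user-scoped mount). A minimal sketch; the wheel path, module name, and source table are hypothetical:

```python
# First cell of the DLT notebook: install the CI/CD-deployed wheel before any
# dlt imports. The Volume path is hypothetical and must be readable by the
# identity the pipeline runs as.
%pip install /Volumes/main/shared/artifacts/my_lib-1.0.0-py3-none-any.whl

# Subsequent cells can then import both dlt and the custom package.
import dlt
from my_lib import transforms   # hypothetical module from the wheel

@dlt.table(name="clean_events")
def clean_events():
    return transforms.clean(spark.read.table("raw.events"))  # hypothetical source table
```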
Sudheer2
by New Contributor III
  • 517 Views
  • 1 reply
  • 0 kudos

How to Migrate Legacy Dashboards from hive_metastore to Unity Catalog using Python

Hi all, after updating the legacy dashboard APIs, I'm looking to migrate legacy dashboards from the hive_metastore to Unity Catalog in Databricks. Specifically, I want to programmatically: migrate the SQL queries used in dashboards; retain or recreate the as...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

For your consideration: To migrate legacy dashboards from the Hive Metastore to Unity Catalog in Databricks programmatically, while retaining SQL queries and data visualizations and ensuring compatibility with Unity Catalog schemas and tables, using Py...

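For the query-migration half of this, one rough sketch uses the legacy SQL Queries REST endpoint to rewrite table references from hive_metastore to a Unity Catalog catalog. This is an assumption-laden illustration, not a full migration tool: the naive string rewrite and the placeholder host, token, and catalog name all need adapting, and the legacy endpoint should be checked against current docs:

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
HEADERS = {"Authorization": "Bearer <token>"}       # placeholder
OLD, NEW = "hive_metastore.", "my_uc_catalog."       # naive catalog rename (assumption)

# Page through legacy SQL queries and rewrite their SQL text in place.
page = 1
while True:
    resp = requests.get(
        f"{HOST}/api/2.0/preview/sql/queries",
        headers=HEADERS,
        params={"page": page, "page_size": 50},
    ).json()
    for q in resp.get("results", []):
        if OLD in (q.get("query") or ""):
            requests.post(
                f"{HOST}/api/2.0/preview/sql/queries/{q['id']}",
                headers=HEADERS,
                json={"query": q["query"].replace(OLD, NEW)},
            ).raise_for_status()
    if page * 50 >= resp.get("count", 0):
        break
    page += 1
```

Dashboards and visualizations attached to those queries would still need to be re-pointed or recreated separately.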
dbsuersu
by New Contributor II
  • 5737 Views
  • 3 replies
  • 4 kudos

ArcGIS Connection

Hi, I am trying to connect to an ArcGIS instance using Databricks. Is this possible? After connecting, I am trying to read the data into a DataFrame. Please help me with this request. If it's not possible to connect, please provide an alternative. Than...

Latest Reply
GISWhammy
New Contributor II
  • 4 kudos

I am trying to set up an ODBC or JDBC direct connection from ArcGIS Pro and ArcGIS Enterprise Server; has anyone done this successfully? I was able to make a successful DSN connection, but no tables are being delivered; I did not use a connection stri...

2 More Replies
Kuchnhi
by New Contributor III
  • 3260 Views
  • 11 replies
  • 7 kudos

Facing issues while upgrading DBR version from 9.1 LTS to 15.4 LTS

Dear all, I am upgrading the DBR version from 9.1 LTS to 15.4 LTS in Azure Databricks. For that, I have created a new cluster with DBR 15.4 and attached an init script for installing application dependencies. The cluster has started successfully, but it takes 30 min. ...

Latest Reply
SmithPoll
New Contributor III
  • 7 kudos

Just to add, you might also want to check the cluster logs (driver and init script logs) for any hidden errors or timeouts during startup. Sometimes dependencies silently fail to install, even if the cluster appears to be running. If possible, try brea...

10 More Replies
Sujitha
by Databricks Employee
  • 2341 Views
  • 3 replies
  • 2 kudos

Calling all Chennai residents! Join the Chennai User Group on Community!

Are you passionate about our vibrant city of Chennai? Do you love connecting with like-minded individuals, expanding your knowledge, and contributing to a thriving Community...

Latest Reply
rmohanrangan
New Contributor III
  • 2 kudos

Thank you. Will that be happening in the future, or is there any WhatsApp group available now?

2 More Replies
oricaruso
by New Contributor II
  • 753 Views
  • 1 reply
  • 0 kudos

GCS and Databricks Community

Hello, I would like to know if it is possible to connect my Databricks Community account to a Google Cloud Storage account via a notebook. I tried to connect via the JSON key of my GCS service account, but the notebook always gives this error when ...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @oricaruso! It looks like this post duplicates the one you posted earlier. A response has already been provided in the original thread. I recommend continuing the discussion there to keep the conversation focused and organized.

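For anyone landing on this thread: the usual way to reach GCS from a notebook is to feed the service-account JSON key fields to the GCS Hadoop connector configuration, assuming the connector is available on the cluster (Community Edition may not permit this). A hedged sketch with placeholder key values:

```python
# Configure the GCS connector with service-account credentials taken from the
# downloaded JSON key file (all values below are placeholders).
hconf = spark.sparkContext._jsc.hadoopConfiguration()
hconf.set("google.cloud.auth.service.account.enable", "true")
hconf.set("fs.gs.project.id", "<gcp-project-id>")
hconf.set("fs.gs.auth.service.account.email", "<client_email from the JSON key>")
hconf.set("fs.gs.auth.service.account.private.key.id", "<private_key_id>")
hconf.set("fs.gs.auth.service.account.private.key", "<private_key>")

# Read a file from the bucket (path is hypothetical).
df = spark.read.csv("gs://<my-bucket>/path/to/file.csv", header=True)
df.show(5)
```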
