Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

andre_gonzalez
by New Contributor
  • 3089 Views
  • 3 replies
  • 0 kudos

SQL Warehouse does not work with Power BI online service

Whenever I try to use a serverless SQL Warehouse cluster for a Power BI dataset, it does not refresh on the Power BI online service. It works normally for other types of Databricks clusters. The catalog is being defined on the Power Query import. I...

Latest Reply
ChuckyDee25
New Contributor II
  • 0 kudos

Hi, We have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...

2 More Replies
Lebrown
by New Contributor
  • 641 Views
  • 1 reply
  • 1 kudos

Free Edition and Databricks Asset Bundles

Hi, I would like to learn more about DABs and gain practical knowledge. For this I want to use the Free Edition, but the authentication fails. I have tried both the Databricks extension in VS Code and the Databricks CLI. In the extension, it returns: C...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello Lebrown: Here are the steps I used to deploy jobs and pipelines to Databricks using DABs, with an example (using the Free Edition):
1. Install the Databricks CLI (latest version). On Windows, run:
winget search databricks
winget install Databricks.Databr...
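For orientation, a deployable bundle like the one described above is driven by a `databricks.yml` at the project root. The sketch below is illustrative, not from the thread; the bundle name, job name, and notebook path are placeholder assumptions you would replace with your own:

```yaml
# Minimal databricks.yml sketch (all names here are hypothetical examples)
bundle:
  name: my_free_edition_bundle

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<your-workspace-url>

resources:
  jobs:
    hello_job:
      name: hello_job
      tasks:
        - task_key: hello
          notebook_task:
            notebook_path: ./src/hello.ipynb
```

With a file like this in place, `databricks bundle validate` checks the config and `databricks bundle deploy -t dev` deploys it to the target workspace.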

billyboy
by New Contributor II
  • 1390 Views
  • 2 replies
  • 4 kudos

Resolved! How To Remove Extra DataBricks Free Edition Account

Accidentally made an extra Databricks Free Edition account/workspace on the same email while messing with the Legacy edition login. Is there a way to delete one of these? The old Community Edition had a "Delete Account" button, but I can't seem to find that f...

Latest Reply
ilir_nuredini
Honored Contributor
  • 4 kudos

Hello Billyboy, I can’t seem to find the option either, but one of the limitations of the Free Edition is: “Databricks may delete Free Edition accounts that are inactive for a prolonged period.” So, you could simply avoid logging into that account for a...

1 More Replies
Jeremyy
by New Contributor
  • 1175 Views
  • 1 reply
  • 0 kudos

I can't create a compute resource beyond "SQL Warehouse", "Vector Search" and "Apps"?

None of the LLMs even understand why I can't create a compute resource. I was using Community (now Free Edition) until yesterday, when it became apparent that I needed the paid version, so I upgraded. I've even got my AWS account connected, which was ...

Latest Reply
ilir_nuredini
Honored Contributor
  • 0 kudos

Hello Jeremyy, The Free Edition has some limitations in terms of compute. As you noticed, there is no option to create custom compute; custom compute configurations and GPUs are not supported. Free Edition users only have access to ser...

upskill
by New Contributor
  • 751 Views
  • 1 reply
  • 0 kudos

Resolved! Delete workspace in Free account

I created a Free Edition account and used my Google account for logging in. I see two workspaces got created and I want to delete one of them. How can I delete one of the workspaces? If that is not possible, how can I delete my account as a whole?

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @upskill! Did you possibly sign in twice during setup? That can sometimes lead to separate accounts, each with its own workspace. Currently, there’s no self-serve option to remove a workspace or delete an account. You can reach out to help@data...

ChristianRRL
by Valued Contributor III
  • 2891 Views
  • 3 replies
  • 1 kudos

DQ Expectations Best Practice

Hi there, I hope this is a fairly simple and straightforward question. I'm wondering if there's a general consensus on where along the DLT data ingestion + transformation process data quality expectations should be applied. For example, two very si...

Latest Reply
dataoculus_app
New Contributor III
  • 1 kudos

In my opinion, you can keep the bronze/raw layer as it is; quality checks should be applied at the silver layer.
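In a DLT pipeline this pattern is typically expressed with expectation decorators such as `@dlt.expect_or_drop`, which only run inside a pipeline. As a hedged, framework-free sketch of the same idea (bronze keeps everything, named rules gate the path to silver; all names and records here are made up for illustration):

```python
# Hypothetical sketch: bronze keeps raw records; expectations filter
# (or quarantine) records on the way to the silver layer.

def apply_expectations(records, rules):
    """Split records into (passed, quarantined) using a dict of named rules."""
    passed, quarantined = [], []
    for rec in records:
        failures = [name for name, rule in rules.items() if not rule(rec)]
        # Any failed rule sends the record to quarantine instead of silver.
        (quarantined if failures else passed).append(rec)
    return passed, quarantined

bronze = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # violates the not-null rule
    {"id": 3, "amount": -2.0},     # violates the positive-amount rule
]

rules = {
    "id_not_null": lambda r: r["id"] is not None,
    "amount_positive": lambda r: r["amount"] > 0,
}

silver, quarantine = apply_expectations(bronze, rules)
```

Keeping the rules as named entries also mirrors how DLT reports per-expectation pass/fail counts.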

2 More Replies
Dimitry
by Contributor III
  • 1326 Views
  • 2 replies
  • 1 kudos

Resolved! Struggle to parallelize UDF

Hi all, I have 2 clusters that look identical, but one runs my UDF in parallel and the other does not. The one that does is personal compute; the bad one is shared.

import pandas as pd
from datetime import datetime
from time import sleep
import threading
# test f...

Latest Reply
Dimitry
Contributor III
  • 1 kudos

As a side note, a "no isolation shared" cluster has no access to Unity Catalog, so no table queries. I resorted to using personal compute assigned to a group.

1 More Replies
Jerry01
by New Contributor III
  • 1613 Views
  • 1 reply
  • 0 kudos

How to override a built-in function in Databricks

I am trying to override the is_member() built-in function in such a way that it always returns true. How can I do this in Databricks using SQL or Python?

Latest Reply
xbgydx12
New Contributor II
  • 0 kudos

To reactivate this question: I have a similar requirement. I want to override shouldRetain(log: T, currentTime: Long) in the class org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog so that it also always returns true.

zent
by New Contributor
  • 666 Views
  • 1 reply
  • 0 kudos

Requirements for Managed Iceberg tables with Unity Catalog

Does Databricks support creating native managed Apache Iceberg tables in Unity Catalog, or is it possible only in private preview? What are the requirements?

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @zent! Databricks now fully supports creating Apache Iceberg managed tables in Unity Catalog, and this capability is available in Public Preview (not just private preview). These managed Iceberg tables can be read and written by Databricks and ...

Anton_Lagergren
by Contributor
  • 2553 Views
  • 2 replies
  • 1 kudos

Resolved! New Regional Group Request

Hello! How may I request and/or create a new Regional Group for the DMV area (DC, Maryland, Virginia)? Thank you, Anton @DB_Paul @Sujitha

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Is there a group you already created?

1 More Replies
darkanita81
by New Contributor III
  • 1362 Views
  • 3 replies
  • 3 kudos

Resolved! How be a part of Databricks Groups

Hello, I am part of a Databricks Community Crew LATAM, where we have reached 300 people connected and have run 3 events, one per month. We want to be part of Databricks Groups, but we don't know how to do that; if somebody can help me I will a...

Latest Reply
Rishabh_Tiwari
Databricks Employee
  • 3 kudos

Hi Ana, Thanks for reaching out! I won’t be attending DAIS this time, but we do have a Databricks Community booth set up near the Expo Hall. My colleague @Sujitha  will be there. Do stop by to say hi and learn about all the exciting things we have go...

2 More Replies
Dimitry
by Contributor III
  • 2660 Views
  • 2 replies
  • 0 kudos

How to fix "Python versions in the Spark Connect client and server are different" in a UDF

I've read all the relevant articles, but none has a solution I could understand. Sorry, I'm new to this. I have a simple UDF to demonstrate the problem:

df = spark.createDataFrame([(1, 1.0, 'a'), (1, 2.0, 'b'), (2, 3.0, 'c'), (2, 5.0, 'd'), (2, 10.0, 'e')]...

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @Dimitry, The error you're seeing indicates that the Python version in your notebook (3.11) doesn't match the version used by Databricks Serverless, which is typically Python 3.12. Since Serverless environments use a fixed Python version, this mis...
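One way to surface this mismatch early, rather than at UDF execution time, is to compare the local interpreter version against the version you expect the server to run. This is a hedged sketch; the "3.12" default below is an assumption taken from the reply above, and `check_python_match` is a hypothetical helper name, not a Databricks API:

```python
import sys

def check_python_match(server_version: str = "3.12") -> str:
    """Raise if the local (client) Python minor version differs from the
    expected server-side Python. Returns the client version on success."""
    client = f"{sys.version_info.major}.{sys.version_info.minor}"
    if client != server_version:
        raise RuntimeError(
            f"Client Python {client} != expected server Python {server_version}; "
            "Spark Connect UDF pickling may fail. Use a matching local interpreter."
        )
    return client
```

Running this at the top of a notebook or script makes the version mismatch an explicit, immediate error instead of an opaque UDF failure.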

1 More Replies
anilsampson
by New Contributor III
  • 877 Views
  • 1 reply
  • 1 kudos

Databricks Dashboard run from Job issue

Hello, I am trying to trigger a Databricks dashboard via a workflow task. 1. When I deploy the job triggering the dashboard task via a local "Deploy bundle" command, deployment is successful. 2. When I try to deploy to a different environment via CI/CD while ...

Latest Reply
SP_6721
Honored Contributor
  • 1 kudos

Hi @anilsampson, The error means your dashboard_task is not properly nested under the tasks section:

tasks:
  - task_key: dashboard_task
    dashboard_task:
      dashboard_id: ${resources.dashboards.nyc_taxi_trip_analysis.id}
      warehouse_id: ${var.warehouse_...

amit_jbs
by New Contributor II
  • 4103 Views
  • 6 replies
  • 2 kudos

In databricks deployment .py files getting converted to notebooks

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, w...

Latest Reply
AGivenUser
New Contributor II
  • 2 kudos

Another option is Databricks Asset Bundles.

5 More Replies
