Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Sudheer2
by New Contributor III
  • 1551 Views
  • 5 replies
  • 0 kudos

Issue with Adding New Members to Existing Groups During Migration in User Group Service Principal

 Hi all, I have implemented a migration process to move groups from a source workspace to a target workspace using the following code. The code successfully migrates groups and their members to the target system, but I am facing an issue when it comes...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

I have provided response in https://community.databricks.com/t5/get-started-discussions/migrating-service-principals-from-non-unity-to-unity-enabled/m-p/103017#M4679 

4 More Replies
Tanay
by New Contributor II
  • 2218 Views
  • 1 reply
  • 1 kudos

Resolved! Why does a join on (df1.id == df2.id) result in duplicate columns while on="id" does not?

Why does a join with on (df1.id == df2.id) result in duplicate columns, but on="id" does not? I encountered an interesting behavior while performing a join on two DataFrames. Here's the scenario: df1 = spark.createDataFrame([(1, "Alice"), (2, "Bob"),...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Tanay , Your intuition is correct here. In Apache Spark, the difference in behavior between on (df1.id == df2.id) and on="id" in a join stems from how Spark resolves and handles column naming during the join operation. When you use the first synta...

Roig
by New Contributor II
  • 754 Views
  • 2 replies
  • 0 kudos

Create multiple dashboard subscriptions with filters

Hi Databricks community, We developed a dashboard that surfaces several important KPIs for each project we have. In the top filter, we select the project name and the time frame, and the dashboard presents the relevant KPIs and charts. I can eas...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

You can achieve this by setting up different schedules for each project and specifying the default filter values accordingly. Create the Dashboard: Ensure your dashboard is set up with the necessary filters, including the project filter. Set Defau...

1 More Replies
SalDossored
by New Contributor II
  • 7107 Views
  • 2 replies
  • 0 kudos

PPT material or document from Databricks Learning

Hello Databricks Community, I am a beginner with Databricks. I am wondering if we can download PowerPoint slides or learning documents from the Databricks Learning Platform. I like to read after taking the online course. Could you let me know? Curren...

Get Started Discussions
Learning Databricks
study material
selectasol
by New Contributor
  • 456 Views
  • 1 reply
  • 0 kudos

Databricks, Cloud Services Pricing

I am unable to find Databricks Cloud Services pricing. Why is that?

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Can you please provide some more context on the issue you are facing to be able to properly assist you?

Phani1
by Valued Contributor II
  • 1376 Views
  • 3 replies
  • 1 kudos

Delta sharing vs CosmosDB

 Hi All, We have a situation where we write data to CosmosDB and create JSON data for a transaction table, which includes a mini statement in JSON format. Now, we want to introduce the concept of Delta Sharing and share the transaction table. The Java ...

Latest Reply
Phani1
Valued Contributor II
  • 1 kudos

Thanks for your reply. Right now, the team is transferring data from Databricks to Cosmos DB, and then they're using REST APIs to access that data. They handle about 100 requests per minute, with some tables needing around 100 requests per second due...

2 More Replies
Cloud_Architect
by New Contributor III
  • 4330 Views
  • 4 replies
  • 0 kudos

How to get the Usage/DBU Consumption report without using system tables

Is there a way to get the usage/DBU consumption report without using system tables?

Latest Reply
TracyJackson
New Contributor II
  • 0 kudos

You can get DBU consumption reports using the Azure Portal (for Azure SQL), through Metrics under your database's "Usage" section, or via Dynamic Management Views (DMVs) like sys.dm_db_resource_stats in SSMS. Third-party tools like SQL Sentry also of...

3 More Replies
Nicolas_Izidoro
by New Contributor II
  • 924 Views
  • 5 replies
  • 1 kudos

I can't log in to my Databricks Community account

Folks, I can't log in to my Databricks Community account. It says there is nothing created under my email, but I've had this account for a long time and this has never happened before. I've even tried to create another account with the same email, but I can't crea...

Latest Reply
Nicolas_Izidoro
New Contributor II
  • 1 kudos

Unfortunately, I don't have any URL.

4 More Replies
gluedhawkeye
by New Contributor II
  • 1453 Views
  • 2 replies
  • 0 kudos

Calling the w.genie function throws an "API is not yet supported in the workspace" error. [0.39.0]

Hi everyone, I've been trying to call the Databricks Genie function, but even on the latest build, it throws the error stating: w.genie API is not yet supported in the workspace. Here is the output of the logs:> {> "content": "**REDACTED**"> }< {< "err...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @gluedhawkeye , I tested this on my own and am getting the same error. This is the same code as used here, but they have an info note: "This script implements an experimental chatbot that interacts with Databricks' Genie API, which is currently in Private Pr...

1 More Replies
santiagortiiz
by New Contributor III
  • 4476 Views
  • 2 replies
  • 0 kudos

I was charged during a free trial

Hello Databricks community, I took a Databricks course to prepare for the certification exam and requested a 14-day free trial on February 13 at 4:51 PM. So, February 27 at 4:51 PM should be the end of the free trial, but it ended 1 day before. Additional...

Latest Reply
ystoikov
New Contributor II
  • 0 kudos

Hello, @santiagortiiz ! It looks like you were charged for the AWS services, not for Databricks DBUs. In your screenshots, I see different amounts.

1 More Replies
Vanshika
by New Contributor
  • 1127 Views
  • 1 reply
  • 0 kudos

Databricks and Cloud Services Pricing

Hi, If I connect Databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie, will there be any minimal charges, or is it completely free to use the cloud services?

Latest Reply
ystoikov
New Contributor II
  • 0 kudos

In any case, you will pay for cloud provider resources: VMs, IPs, etc.

Richie1602
by New Contributor II
  • 819 Views
  • 2 replies
  • 0 kudos

Issue with Percentage Calculation in Power BI Using Databricks as Source

Hi everyone, I've created a financial summary report in Power BI, and my source is Databricks. I have created a view for each financial metric name along with the calculations. All my amount fields are accurate, but when calculating percentages, I'm g...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello Richie, In Databricks, you can use a combination of NULLIF and COALESCE functions to handle divide-by-zero scenarios effectively. Here's an example of how you can modify your percentage calculation: SELECT MetricNo, MetricName, Amo...

1 More Replies
spijl
by New Contributor III
  • 1437 Views
  • 2 replies
  • 0 kudos

Resolved! Datagrip connection error

I am trying to connect with the DataGrip-provided driver. I cannot get this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3  I am gettin...

Latest Reply
spijl
New Contributor III
  • 0 kudos

Hi @Alberto_Umana thanks. I created the token in Databricks under User Settings > Access Tokens indeed. Not sure how to ensure it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.

1 More Replies
Phani1
by Valued Contributor II
  • 8633 Views
  • 2 replies
  • 0 kudos

SAP SuccessFactors

Hi Team, We are working on a new Data Product onboarding to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to S3 + Bronze layer and then do the initial setup of Lakehouse + Power B...

1 More Replies
sahil_s_jain
by New Contributor III
  • 1935 Views
  • 4 replies
  • 0 kudos

GRPC call are not getting through on Databricks 15.4 LTS

Hi Team, I have updated the Spark version from 3.3.2 to 3.5.0 and switched from Databricks 12.2 LTS to 15.4 LTS so as to get Spark 3.5 on the Databricks compute. We have moved from uploading libraries to DBFS to uploading libraries to Volumes as 1...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

And this was working before, correct? When the init script was hosted on DBFS?

3 More Replies
