Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

gluedhawkeye
by New Contributor II
  • 1333 Views
  • 2 replies
  • 0 kudos

Calling the w.genie function throws an "API is not yet supported in the workspace" error. [0.39.0]

Hi everyone, I've been trying to call the Databricks Genie function, but even on the latest build it throws the error stating: w.genie API is not yet supported in the workspace. Here is the output of the logs: > { "content": "**REDACTED**" } < { "err...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @gluedhawkeye, I tested this on my own and am getting the same error. This is the same code as used here, but they include a note: "This script implements an experimental chatbot that interacts with Databricks' Genie API, which is currently in Private Pr...

1 More Replies
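For reference, here is a minimal sketch of what the call might look like through the Databricks Python SDK around the 0.39.x release mentioned in the title. The space ID is a hypothetical placeholder and the method name is an assumption based on the SDK's Genie conversation API; since Genie's API was in Private Preview at the time, the "API is not yet supported in the workspace" error can simply mean the preview is not enabled for that workspace.

```python
# Hedged sketch, assuming the SDK exposes Genie under WorkspaceClient().genie as in
# databricks-sdk ~0.39; method and field names are assumptions, not confirmed by the thread.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up host/token from env vars or ~/.databrickscfg

SPACE_ID = "<genie-space-id>"  # hypothetical placeholder for your Genie space

# Start a conversation and block until Genie responds (or the call is rejected with
# the "API is not yet supported in the workspace" error seen above).
message = w.genie.start_conversation_and_wait(
    space_id=SPACE_ID,
    content="What were total sales last month?",
)
print(message.status)
```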
santiagortiiz
by New Contributor III
  • 4054 Views
  • 2 replies
  • 0 kudos

I was charged during a free trial

Hello Databricks community, I took a Databricks course to prepare for the certification exam and requested a 14-day free trial on February 13 at 4:51 PM. So February 27 at 4:51 PM should be the end of the free trial, but it ended one day earlier. Additional...

Latest Reply
ystoikov
New Contributor II
  • 0 kudos

Hello @santiagortiiz! It looks like you were charged for AWS services, not for Databricks DBUs. In your screenshots, I see different amounts.

1 More Replies
Vanshika
by New Contributor
  • 1040 Views
  • 1 reply
  • 0 kudos

Databricks and Cloud Services Pricing

Hi, if I connect Databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie, will there be any minimal charges, or is it completely free to use the cloud services?

Latest Reply
ystoikov
New Contributor II
  • 0 kudos

Either way, you will pay for the cloud provider's products: VMs, IPs, etc.

Richie1602
by New Contributor II
  • 719 Views
  • 2 replies
  • 0 kudos

Issue with Percentage Calculation in Power BI Using Databricks as Source

Hi everyone, I've created a financial summary report in Power BI, and my source is Databricks. I have created a view for each financial metric name along with the calculations. All my amount fields are accurate, but when calculating percentages, I'm g...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello Richie, in Databricks you can use a combination of the NULLIF and COALESCE functions to handle divide-by-zero scenarios effectively. Here's an example of how you can modify your percentage calculation: SELECT MetricNo, MetricName, Amo... (reconstructed in the sketch after this thread)

1 More Replies
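Walter_C's example is cut off above; the sketch below reconstructs its likely shape. MetricNo, MetricName, and Amount come from the excerpt, while the view name vw_financial_summary and the TotalAmount denominator are hypothetical stand-ins for whatever the original view uses.

```python
# Reconstruction of the NULLIF/COALESCE divide-by-zero pattern (names partly hypothetical).
pct_query = """
SELECT
    MetricNo,
    MetricName,
    Amount,
    -- NULLIF returns NULL when the denominator is 0, so the division yields NULL
    -- instead of an error; COALESCE then maps that NULL back to 0.
    COALESCE(Amount / NULLIF(TotalAmount, 0), 0) * 100 AS AmountPct
FROM vw_financial_summary
"""

df = spark.sql(pct_query)  # `spark` is the session provided in a Databricks notebook
display(df)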
spijl
by New Contributor III
  • 1270 Views
  • 2 replies
  • 0 kudos

Resolved! DataGrip connection error

I am trying to connect with the DataGrip-provided driver. I am not getting this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3  I am gettin...

Latest Reply
spijl
New Contributor III
  • 0 kudos

Hi @Alberto_Umana, thanks. I created the token in Databricks under User Settings > Access Tokens indeed. Not sure how to check that it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.

1 More Replies
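One way to rule the token in or out, independent of DataGrip, is to open a connection to the same warehouse from Python with the databricks-sql-connector package. The hostname, HTTP path, and token below are placeholders matching the redacted values in the post. If this succeeds, the token is valid and has access to the warehouse, and the problem is on the DataGrip/JDBC side (the JDBC driver typically expects UID=token with the personal access token as PWD).

```python
# Minimal check that a personal access token can reach the SQL warehouse, bypassing
# DataGrip entirely. Requires `pip install databricks-sql-connector`; all three values
# below are placeholders for the redacted ones in the connection URL above.
from databricks import sql

with sql.connect(
    server_hostname="dbc-XXXXXXXX.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/baXXXXX",
    access_token="dapiXXXXXXXXXXXXXXXX",  # the PAT from User Settings > Access Tokens
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchall())  # [(1,)] means the token and warehouse permissions are fine
```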
Phani1
by Valued Contributor II
  • 8445 Views
  • 2 replies
  • 0 kudos

SAP SuccessFactors

Hi Team, we are working on onboarding a new Data Product to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to S3 + Bronze layer and then do the initial setup of Lakehouse + Power B...

Latest Reply
Fabrizio11
New Contributor II
  • 0 kudos

Hi everyone! Great article, by the way. What's your favorite strategy for winning in online games?

1 More Replies
sahil_s_jain
by New Contributor III
  • 1731 Views
  • 4 replies
  • 0 kudos

gRPC calls are not getting through on Databricks 15.4 LTS

Hi Team, I have updated the Spark version from 3.3.2 to 3.5.0 and switched to Databricks 15.4 LTS from 12.2 LTS so as to get Spark 3.5 on the Databricks compute. We have moved from uploading libraries to DBFS to uploading libraries to Volumes as 1...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

And this was working before, is that correct? When the init script was hosted on DBFS?

3 More Replies
roshan_robert
by New Contributor II
  • 6279 Views
  • 6 replies
  • 1 kudos

[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number.

Hi Team, in a Streamlit app (in Databricks), while creating the Spark session I am getting the error below; this happens when running the app via the web link: "[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number". Below is the code u...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

From looking internally, I see that Spark (Context) is not available in Apps. The recommended way would be to use our available SDKs and connect to Clusters/DBSQL. No Spark context is available; Apps are meant to defer processing to other compute they can con...

5 More Replies
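Here is a sketch of the approach Alberto_Umana describes: inside a Databricks App there is no local JVM for PySpark to launch (hence JAVA_GATEWAY_EXITED), so instead of building a SparkSession the app hands the query to a SQL warehouse. The warehouse ID is a placeholder, and the statement-execution call shown is one option from the Python SDK; the databricks-sql-connector shown in the DataGrip thread above would work as well.

```python
# Hedged sketch: run SQL on a warehouse from a Databricks App instead of creating a
# SparkSession. The warehouse ID is a hypothetical placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # inside an App, credentials are typically provided automatically

resp = w.statement_execution.execute_statement(
    warehouse_id="<warehouse-id>",
    statement="SELECT current_date() AS today",
    wait_timeout="30s",
)

print(resp.status.state)        # e.g. SUCCEEDED
print(resp.result.data_array)   # result rows, each as a list of strings
```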
MBhaskar
by New Contributor
  • 792 Views
  • 1 reply
  • 0 kudos

DESCRIBE TABLE and SHOW CREATE TABLE show contradictory NULL constraints

SHOW CREATE TABLE provides correct NULL constraint details for each column, whereas DESCRIBE TABLE shows wrong NULL constraint details?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Yes, you are correct. The SHOW CREATE TABLE command provides accurate details about the NULL constraints for each column in a table, whereas the DESCRIBE TABLE command may show incorrect NULL constraint details. This discrepancy arises because SHOW ...

  • 0 kudos
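A quick way to see the discrepancy side by side for a given table; the three-part table name is a hypothetical placeholder.

```python
# Compare how the two commands report nullability for one table (placeholder name).
tbl = "my_catalog.my_schema.my_table"

# SHOW CREATE TABLE returns the full DDL, including NOT NULL constraints, in one row.
spark.sql(f"SHOW CREATE TABLE {tbl}").show(truncate=False)

# DESCRIBE TABLE EXTENDED lists columns, types and metadata; per the reply above, its
# nullability details may not match the DDL, so treat the DDL as the source of truth here.
spark.sql(f"DESCRIBE TABLE EXTENDED {tbl}").show(truncate=False)
```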
ClaudiuSolar
by New Contributor III
  • 21088 Views
  • 4 replies
  • 0 kudos

Resolved! SAP HANA Smart Data Access to Databricks SQL Warehouse

Hi, I'm currently looking into connecting to the SQL Warehouse through SDA/SDI. Does anyone have experience doing so and can share some takeaways on how to implement it? We want to expose the Databricks tables to SAP. We're already doing this by us...

Latest Reply
felixdmeshio
New Contributor III
  • 0 kudos

Hey, if you're exploring how to connect your SQL Warehouse to SAP or want to streamline the process of transferring data from SAP HANA into Databricks, our SAP HANA to Databricks Connector could be a valuable tool. This connector allows you to directl...

3 More Replies
RohithChippa
by New Contributor III
  • 2181 Views
  • 7 replies
  • 1 kudos

Databricks cleanroom functionality and billing

I'm new to Databricks and have been tasked with exploring Databricks Clean Rooms. I'm a bit confused about how billing works for Clean Rooms and their overall functionality. Specifically, I'm curious about the following: Environment Hosting: Are Clean...

Latest Reply
RohithChippa
New Contributor III
  • 1 kudos

Is the ability to add a foreign catalog table in the Clean Room feature also available after the GA release? I tried it, but I was not able to see the foreign catalog in the Add Data Assets tab.

6 More Replies
shweta_m
by New Contributor
  • 476 Views
  • 1 reply
  • 0 kudos

Converting Managed Hive Metastore Table to External Table with Mount Point Location

We have a managed Hive Metastore (HMS) table, and we would like to convert it into an external table, with the location of that external HMS table on a mount point.

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

You could create the table as an external table by using CREATE TABLE student_copy AS SELECT * FROM student; to pull the data from the managed table. (See the sketch after this reply for a version with an explicit LOCATION on a mount point.)

  • 0 kudos
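Building on Walter_C's reply: a plain CTAS copy of a managed Hive metastore table is itself managed, so to make the copy external with its data on a mount point, the statement also needs an explicit LOCATION. A minimal sketch, with the mount path as a hypothetical placeholder:

```python
# Hedged sketch: CTAS with an explicit LOCATION so the copy is an external HMS table
# whose data lives under a mount point (path is hypothetical).
spark.sql("""
    CREATE TABLE student_copy
    LOCATION '/mnt/my_container/student_copy'  -- mount-point path for the external table
    AS SELECT * FROM student                   -- data pulled from the managed table
""")

# The Type field should now show EXTERNAL and Location should point at the mount path.
spark.sql("DESCRIBE TABLE EXTENDED student_copy").show(truncate=False)
```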
Kasen
by New Contributor III
  • 3736 Views
  • 2 replies
  • 0 kudos

Unable to grant catalog access to service principal

Hi everyone, I created a service principal called TestServicePrincipal. I tried to grant catalog access to the service principal, but the error mentioned that it could not find a principal with the name TestServicePrincipal. If I grant the access to s...

(Screenshots attached: Kasen_0-1715058248230.png, Kasen_1-1715058284642.png)
Labels: Get Started Discussions, grant access, service principals
Latest Reply
OYESPEEDY
New Contributor II
  • 0 kudos

The issue could be related to how the service principal is being resolved in your system. Unlike users, service principals are often registered in a directory (like Azure AD), and their names might not match what you’re using. Instead of using TestSe...

1 More Replies
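A sketch of OYESPEEDY's suggestion: grant to the service principal's application (client) ID rather than its display name. The catalog name and the UUID below are hypothetical placeholders; the application ID is shown on the service principal's page in the admin console.

```python
# Hedged sketch: grant catalog privileges to a service principal by application ID
# (placeholder values throughout).
app_id = "12345678-1234-1234-1234-123456789abc"  # the SP's application/client ID, not its display name

spark.sql(f"GRANT USE CATALOG ON CATALOG my_catalog TO `{app_id}`")
spark.sql(f"GRANT USE SCHEMA, SELECT ON CATALOG my_catalog TO `{app_id}`")

# Confirm the grants are visible for the service principal.
spark.sql("SHOW GRANTS ON CATALOG my_catalog").show(truncate=False)
```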
DataYoga
by New Contributor
  • 4443 Views
  • 3 replies
  • 0 kudos

Informatica ETLs

I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...

Latest Reply
thelogicplus
Contributor
  • 0 kudos

@DataYoga, you may explore the tools and services from Travinto Technologies. They have very good tools. We explored their tool for our code conversion from Informatica, DataStage, and Ab Initio to Databricks / PySpark. We also used it for SQL quer...

2 More Replies
