by
Phani1
• Valued Contributor II
- 619 Views
- 3 replies
- 1 kudos
Hi All, we have a situation where we write data to Cosmos DB and create JSON data for a transaction table, which includes a mini statement in JSON format. Now we want to introduce Delta Sharing and share the transaction table. The Java ...
Latest Reply
Thanks for your reply. Right now, the team is transferring data from Databricks to Cosmos DB, and then they're using REST APIs to access that data. They handle about 100 requests per minute, with some tables needing around 100 requests per second due...
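For the consumer side, a recipient could read the shared transaction table directly over the Delta Sharing protocol instead of going through the Cosmos DB copy and REST hop. A minimal sketch using the open-source `delta-sharing` Python client — the share, schema, and table names below are hypothetical, and `config.share` stands in for the recipient's credential file:

```python
# Sketch of consuming a shared table with the open-source `delta-sharing`
# Python client (share/schema/table names below are hypothetical).

def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    # Delta Sharing addresses a table as "<profile>#<share>.<schema>.<table>".
    return f"{profile_path}#{share}.{schema}.{table}"

def load_transactions(profile_path: str):
    import delta_sharing  # pip install delta-sharing
    # Reads the shared table into a pandas DataFrame over the sharing protocol,
    # replacing the Cosmos DB copy + REST API hop.
    return delta_sharing.load_as_pandas(
        table_url(profile_path, "bank_share", "core", "transactions"))

print(table_url("config.share", "bank_share", "core", "transactions"))
# config.share#bank_share.core.transactions
```

Whether this meets the 100-requests-per-second pattern depends on the consumer caching the DataFrame rather than re-fetching per request.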
2 More Replies
- 2521 Views
- 4 replies
- 0 kudos
Is there a way to get the usage/DBU consumption report without using system tables?
Latest Reply
Without system tables, you can see DBU consumption on the Usage page of the Databricks account console, in your cloud provider's cost-management tools (the Databricks charge appears as its own line item), or, on AWS, by downloading the billable usage CSV from the account-level REST API. Tags applied to clusters and SQL warehouses flow through to these reports, which helps attribute cost.
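As a sketch, the account-level billable-usage download can be requested like this — the account host shown is the AWS accounts console, and the account id and token are placeholders:

```python
# Sketch: build and issue the account-level billable-usage download request
# (account id and token are placeholders; host shown is the AWS accounts console).
import urllib.request

ACCOUNTS_HOST = "https://accounts.cloud.databricks.com"

def usage_download_url(account_id: str, start_month: str, end_month: str) -> str:
    # Months use the YYYY-MM format expected by the usage download endpoint.
    return (f"{ACCOUNTS_HOST}/api/2.0/accounts/{account_id}/usage/download"
            f"?start_month={start_month}&end_month={end_month}")

def download_usage_csv(account_id: str, token: str) -> str:
    req = urllib.request.Request(
        usage_download_url(account_id, "2024-01", "2024-03"),
        headers={"Authorization": f"Bearer {token}"})
    # Returns a CSV with one row per workspace/SKU/day of DBU consumption.
    return urllib.request.urlopen(req).read().decode()

print(usage_download_url("my-account-id", "2024-01", "2024-03"))
```

The CSV can then be loaded into a notebook or BI tool for reporting without touching system tables.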
3 More Replies
- 450 Views
- 5 replies
- 1 kudos
Folks, I can't log into my Databricks Community account. It says my email has nothing created under it, but I've had this account for quite a while and this has never happened before. I even tried creating another account with the same email, but I can't crea...
- 773 Views
- 2 replies
- 0 kudos
Hi everyone, I've been trying to call the Databricks Genie function, but even on the latest build it throws an error stating: w.genie API is not yet supported in the workspace. Here is the output of the logs:
> { "content": "**REDACTED**" }
< { "err...
Latest Reply
Hi @gluedhawkeye, I tested this on my own and I'm getting the same error. This is the same code as used here, but they include a note: "This script implements an experimental chatbot that interacts with Databricks' Genie API, which is currently in Private Pr...
1 More Replies
- 3032 Views
- 2 replies
- 0 kudos
Hello Databricks community, I took a Databricks course to prepare for the certification exam and requested a 14-day free trial on February 13 at 4:51 PM. So February 27 at 4:51 PM should be the end of the free trial, but it ended one day early. Additional...
Latest Reply
Hello, @santiagortiiz! It looks like you were charged for AWS services, not for Databricks DBUs. In your screenshots, I see different amounts.
1 More Replies
- 742 Views
- 1 reply
- 0 kudos
Hi, if I connect Databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie, will there be any minimal charges, or is it completely free to use the cloud services?
Latest Reply
Either way, you will still pay for the cloud provider's resources: VMs, IP addresses, etc.
- 386 Views
- 2 replies
- 0 kudos
Hi everyone, I've created a financial summary report in Power BI, and my source is Databricks. I have created a view for each financial metric name along with the calculations. All my amount fields are accurate, but when calculating percentages, I'm g...
Latest Reply
Hello Richie, in Databricks you can use a combination of the NULLIF and COALESCE functions to handle divide-by-zero scenarios effectively. Here's an example of how you can modify your percentage calculation:
SELECT
MetricNo,
MetricName,
Amo...
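The same NULLIF/COALESCE pattern can be verified end-to-end with Python's built-in sqlite3, which supports both functions — the table and column names here are illustrative, not taken from the original view:

```python
# The NULLIF/COALESCE divide-by-zero pattern from the reply, demonstrated
# with Python's built-in sqlite3 (table/column names are illustrative).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (amount REAL, total REAL)")
conn.executemany("INSERT INTO metrics VALUES (?, ?)", [(50.0, 200.0), (10.0, 0.0)])

# amount / NULLIF(total, 0) yields NULL when total is 0; COALESCE maps it to 0.
rows = conn.execute(
    "SELECT COALESCE(amount / NULLIF(total, 0) * 100, 0.0) FROM metrics"
).fetchall()
print(rows)  # [(25.0,), (0.0,)]
```

The zero-total row comes back as 0 instead of raising a divide-by-zero error, which is the behavior the Power BI percentage fields need.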
1 More Replies
by
spijl
• New Contributor III
- 789 Views
- 2 replies
- 0 kudos
I am trying to connect with the DataGrip-provided driver. I can't get this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3
I am gettin...
Latest Reply
spijl
New Contributor III
Hi @Alberto_Umana, thanks. I created the token in Databricks under User Settings > Access Tokens, indeed. I'm not sure how to verify it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.
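One quick way to check that a personal access token authenticates at all is to call the workspace's SCIM "Me" endpoint with it; a 200 response means the token is valid for that workspace. A minimal sketch — the hostname is a placeholder, and this checks authentication only, not warehouse permissions:

```python
# Sketch: verify a Databricks PAT by calling the SCIM "Me" endpoint;
# HTTP 200 means the token authenticates (hostname is a placeholder).
import urllib.request

def me_url(host: str) -> str:
    return f"https://{host}/api/2.0/preview/scim/v2/Me"

def check_token(host: str, token: str) -> int:
    req = urllib.request.Request(
        me_url(host), headers={"Authorization": f"Bearer {token}"})
    # Raises urllib.error.HTTPError (401/403) if the token is invalid or expired.
    return urllib.request.urlopen(req).status

print(me_url("dbc-xxxx.cloud.databricks.com"))
```

If this returns 200 but DataGrip still fails, the problem is more likely the JDBC URL or warehouse permissions than the token itself.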
1 More Replies
by
Phani1
• Valued Contributor II
- 7898 Views
- 2 replies
- 0 kudos
Hi Team, we are working on onboarding a new Data Product to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors into S3 + the Bronze layer and then do the initial setup of Lakehouse + Power B...
1 More Replies
- 854 Views
- 4 replies
- 0 kudos
Hi Team, I have updated the Spark version from 3.3.2 to 3.5.0 and switched from Databricks 12.2 LTS to 15.4 LTS so as to get Spark 3.5 on the Databricks compute. We have moved from uploading libraries on DBFS to uploading libraries to Volumes as 1...
Latest Reply
And this was working before, correct? When the init script was hosted on DBFS?
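For reference, moving an init script from DBFS to a Unity Catalog Volume changes the `init_scripts` entry in the cluster spec from a `dbfs` destination to a `volumes` destination. A sketch of the shape — the catalog, schema, and path below are illustrative:

```python
# Shape of a cluster spec sourcing its init script from a UC Volume instead
# of DBFS (catalog/schema/path are illustrative).
cluster_spec = {
    "spark_version": "15.4.x-scala2.12",
    "init_scripts": [
        # The old DBFS form was: {"dbfs": {"destination": "dbfs:/scripts/init.sh"}}
        {"volumes": {"destination": "/Volumes/main/default/scripts/init.sh"}}
    ],
}
print(cluster_spec["init_scripts"][0])
```

The cluster (or the service principal running the job) also needs READ VOLUME on that Volume, which is a common failure point after this migration.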
3 More Replies
- 2518 Views
- 6 replies
- 1 kudos
Hi Team, in a Streamlit app (in Databricks), we get the error below while creating the Spark session; this happens when running the app via the web link. "[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number". Below is the code u...
Latest Reply
From checking internally, I see that a Spark context is not available in Apps.
The recommended way would be to use our available SDKs and connect to clusters/DBSQL. No Spark context is available; Apps are meant to defer processing to other compute they can con...
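A minimal sketch of that pattern, deferring the query to a SQL warehouse via the `databricks-sql-connector` package instead of building a SparkSession in the app — the environment-variable names here are assumptions, not fixed by the platform:

```python
# Sketch: query a SQL warehouse from an App with databricks-sql-connector
# instead of creating a SparkSession (env var names are assumptions).
import os

def connection_kwargs() -> dict:
    # Collect warehouse coordinates from the app's environment.
    return {
        "server_hostname": os.environ["DATABRICKS_HOST"],
        "http_path": os.environ["DATABRICKS_HTTP_PATH"],
        "access_token": os.environ["DATABRICKS_TOKEN"],
    }

def run_query(statement: str):
    from databricks import sql  # pip install databricks-sql-connector
    with sql.connect(**connection_kwargs()) as conn:
        with conn.cursor() as cur:
            cur.execute(statement)
            return cur.fetchall()  # rows computed by the warehouse, not the app
```

In a Streamlit app the result rows can be passed straight to `st.dataframe`, so all heavy processing stays on the warehouse.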
5 More Replies
- 485 Views
- 1 reply
- 0 kudos
SHOW CREATE TABLE provides correct NULL constraint details for each column, whereas DESCRIBE TABLE shows wrong NULL constraint details?
Latest Reply
Yes, you are correct. The SHOW CREATE TABLE command provides accurate details about the NULL constraints for each column in a table, whereas the DESCRIBE TABLE command may show incorrect NULL constraints details. This discrepancy arises because SHOW ...
- 212 Views
- 0 replies
- 0 kudos
What experience does the consultancy have with similar Databricks projects? Ensure they have relevant experience in your industry. Ask for examples of similar projects they've worked on.
How do they manage Databricks costs? Inquire about their strategie...
- 19339 Views
- 4 replies
- 0 kudos
Hi, I'm currently looking into connecting to the SQL Warehouse through SDA/SDI. Does anyone have experience doing so and can share some takeaways on how to implement it? We want to expose the Databricks tables to SAP. We're already doing this by us...
Latest Reply
Hey, if you're exploring how to connect your SQL Warehouse to SAP or want to streamline the process of transferring data from SAP HANA into Databricks, our SAP HANA to Databricks Connector could be a valuable tool. This connector allows you to directl...
3 More Replies
- 1032 Views
- 7 replies
- 1 kudos
I'm new to Databricks and have been tasked with exploring Databricks Clean Rooms. I'm a bit confused about how billing works for Clean Rooms and their overall functionality. Specifically, I'm curious about the following: Environment Hosting: Are Clean...
Latest Reply
Is adding a foreign catalog table to a Clean Room also available after the GA release? I tried it, but I was not able to see the foreign catalog in the "Add data assets" tab.
6 More Replies