- 1037 Views
- 1 replies
- 0 kudos
Databricks and Cloud Services Pricing
Hi, if I connect Databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie, will there be any minimal charges, or is it completely free to use the cloud services?
- 0 kudos
Either way, you will pay for the cloud provider's resources - VMs, IPs, etc.
- 714 Views
- 2 replies
- 0 kudos
Issue with Percentage Calculation in Power BI Using Databricks as Source
Hi everyone, I've created a financial summary report in Power BI, and my source is Databricks. I have created a view for each financial metric name along with the calculations. All my amount fields are accurate, but when calculating percentages, I’m g...
- 0 kudos
Hello Richie, in Databricks you can use a combination of the NULLIF and COALESCE functions to handle divide-by-zero scenarios effectively. Here's an example of how you can modify your percentage calculation: SELECT MetricNo, MetricName, Amo...
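To make the pattern concrete, here is a minimal sketch run from a Databricks notebook; since the snippet above is truncated, the view and column names (financial_metrics, Amount, TotalAmount) are placeholders for your own:

```python
# A minimal sketch; financial_metrics, Amount, and TotalAmount are placeholders.
result = spark.sql("""
    SELECT
        MetricNo,
        MetricName,
        Amount,
        -- NULLIF turns a zero denominator into NULL, so the division yields NULL
        -- instead of a divide-by-zero error; COALESCE then maps that NULL to 0.
        COALESCE(Amount / NULLIF(TotalAmount, 0), 0) * 100 AS AmountPct
    FROM financial_metrics
""")
display(result)
```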
- 1261 Views
- 2 replies
- 0 kudos
Resolved! DataGrip connection error
I am trying to connect with the DataGrip-provided driver. I am not getting this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3 I am gettin...
- 0 kudos
Hi @Alberto_Umana, thanks. I did create the token in Databricks under User Settings > Access Tokens. I'm not sure how to verify that it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.
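One way to verify the token outside of DataGrip is to hit the same warehouse from Python with the databricks-sql-connector package; this is only a sketch, and the hostname, HTTP path, and token below are placeholders for your own connection details:

```python
# Requires: pip install databricks-sql-connector
# Hostname and HTTP path come from the warehouse's "Connection details" tab;
# the token is the one created under User Settings > Access Tokens.
from databricks import sql

with sql.connect(
    server_hostname="dbc-xxxxxxxx.cloud.databricks.com",   # placeholder
    http_path="/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",      # placeholder
    access_token="dapiXXXXXXXXXXXXXXXXXXXXXXXX",           # placeholder
) as connection:
    with connection.cursor() as cursor:
        # If this succeeds, the token is valid and can reach the warehouse.
        cursor.execute("SELECT current_user(), current_catalog()")
        print(cursor.fetchone())
```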
- 8437 Views
- 2 replies
- 0 kudos
SAP SuccessFactors
Hi Team, we are working on onboarding a new data product to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to S3 + the Bronze layer and then do the initial setup of Lakehouse + Power B...
- 0 kudos
Hi everyone! Great article, by the way. What's your favorite strategy for winning in online games?
- 1730 Views
- 4 replies
- 0 kudos
gRPC calls are not getting through on Databricks 15.4 LTS
Hi Team, I have updated the Spark version from 3.3.2 to 3.5.0 and switched from Databricks 12.2 LTS to 15.4 LTS so as to get Spark 3.5 on the Databricks compute. We have moved from uploading libraries to DBFS to uploading libraries to Volumes as 1...
- 0 kudos
And this was working before, is that correct? When the init script was hosted on DBFS?
- 6244 Views
- 6 replies
- 1 kudos
[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number.
Hi Team, in a Streamlit app (in Databricks), I am getting the error below while creating the Spark session; this happens when running the app via the web link: "[JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number". Below is the code u...
- 1 kudos
From looking internally, I see that Spark (Context) is not available in Apps. The recommended way would be to use our available SDKs and connect to Clusters/DBSQL. No Spark context is available - it’s meant to defer processing to other compute it can con...
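As a rough sketch of that recommendation, the app can push the query to a SQL warehouse through the Databricks Python SDK instead of building a SparkSession; the warehouse ID and sample table below are placeholders, and the SDK is assumed to resolve credentials from the app's environment:

```python
# Requires: pip install databricks-sdk
# The warehouse ID and table are placeholders; authentication is assumed to be
# picked up from the app's runtime environment.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

resp = w.statement_execution.execute_statement(
    warehouse_id="1234567890abcdef",                         # placeholder
    statement="SELECT COUNT(*) FROM samples.nyctaxi.trips",  # placeholder table
    wait_timeout="30s",
)
print(resp.result.data_array)  # rows are returned as lists of strings
```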
- 790 Views
- 1 replies
- 0 kudos
DESCRIBE TABLE and SHOW CREATE TABLE show contradictory NULL constraints
SHOW CREATE TABLE provides correct NULL constraint details for each column, whereas DESCRIBE TABLE shows wrong NULL constraint details?
- 0 kudos
Yes, you are correct. The SHOW CREATE TABLE command provides accurate details about the NULL constraints for each column in a table, whereas the DESCRIBE TABLE command may show incorrect NULL constraint details. This discrepancy arises because SHOW ...
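A quick way to compare the two outputs for yourself; this is only a sketch, and the three-level table name is a placeholder:

```python
# The table name is a placeholder.
table = "main.default.my_table"

# DESCRIBE TABLE EXTENDED lists columns and metadata, but its nullability output
# may not match the NOT NULL constraints actually defined on the table.
display(spark.sql(f"DESCRIBE TABLE EXTENDED {table}"))

# SHOW CREATE TABLE returns the full DDL, including NOT NULL column constraints,
# which (per the reply above) is the accurate place to check them.
print(spark.sql(f"SHOW CREATE TABLE {table}").first()["createtab_stmt"])
```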
- 388 Views
- 0 replies
- 0 kudos
Things to Consider Before Selecting a Databricks Consultancy
What experience does the consultancy have with similar Databricks projects? Ensure they have relevant experience in your industry. Ask for examples of similar projects they've worked on. How do they manage Databricks costs? Inquire about their strategie...
- 21070 Views
- 4 replies
- 0 kudos
Resolved! SAP HANA Smart Data Access to Databricks SQL Warehouse
Hi, I'm currently looking into connecting to the SQL Warehouse through SDA/SDI. Does anyone have any experience doing so and can share some takeaways on how to implement it? We want to expose the Databricks tables to SAP. We're already doing this by us...
- 0 kudos
Hey, if you're exploring how to connect your SQL Warehouse to SAP or want to streamline the process of transferring data from SAP HANA into Databricks, our SAP HANA to Databricks Connector could be a valuable tool. This connector allows you to directl...
- 2178 Views
- 7 replies
- 1 kudos
Databricks cleanroom functionality and billing
I'm new to Databricks and have been tasked with exploring Databricks Clean Rooms. I'm a bit confused about how billing works for Clean Rooms and their overall functionality. Specifically, I'm curious about the following: Environment Hosting: Are Clean...
- 1 kudos
Is adding a foreign catalog table in the Clean Room feature also available after the GA release? I tried it, but I was not able to see the foreign catalog in the Add data assets tab.
- 473 Views
- 1 replies
- 0 kudos
Converting Managed Hive Metastore Table to External Table with Mount Point Location
We have a managed Hive Metastore (HMS) table, and we would like to convert it into an external table, with the location of that external HMS table set to a mount point.
- 0 kudos
You could create the table as an external table by using CREATE TABLE student_copy AS SELECT * FROM student; (with a LOCATION clause pointing at your mount path) to pull the data from the managed table.
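A minimal sketch of that approach with the explicit LOCATION spelled out, so the copy is created as an external table; the mount path and table names are placeholders:

```python
# A sketch only: the mount path and table names are placeholders.
spark.sql("""
    CREATE TABLE student_copy
    LOCATION '/mnt/my_mount/student_copy'   -- explicit path makes the table external
    AS SELECT * FROM student                -- pulls the data from the managed table
""")
```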
- 3722 Views
- 2 replies
- 0 kudos
Unable to grant catalog access to service principal
Hi everyone, I created a service principal called TestServicePrincipal. I tried to grant catalog access to the service principal, but the error mentioned that it could not find a principal with the name TestServicePrincipal. If I grant the access to s...
- 0 kudos
The issue could be related to how the service principal is being resolved in your system. Unlike users, service principals are often registered in a directory (like Azure AD), and their names might not match what you’re using. Instead of using TestSe...
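For illustration, a sketch of the grant using the service principal's application ID (the UUID) rather than its display name; the catalog name, privilege list, and UUID below are placeholders:

```python
# A sketch only: catalog name, privileges, and the application ID are placeholders.
spark.sql("""
    GRANT USE CATALOG, USE SCHEMA, SELECT
    ON CATALOG my_catalog
    TO `11111111-2222-3333-4444-555555555555`
""")
```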
- 4432 Views
- 3 replies
- 0 kudos
Informatica ETLs
I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...
- 0 kudos
@DataYoga, you may explore the tools and services from Travinto Technologies. They have very good tools. We explored their tool for our code conversion from Informatica, DataStage, and Ab Initio to Databricks/PySpark. We also used it for SQL quer...
- 1059 Views
- 4 replies
- 0 kudos
Login community
On the login screen, I enter the verification code received by email, and when I click it goes back to the same screen with no error message and does not log in.
- 0 kudos
Is this happening with any browser in Windows?
- 19033 Views
- 6 replies
- 0 kudos
BigQuery in notebook failing with Unity Catalog-enabled cluster
BigQuery (reading data from Google Cloud) is failing with a Unity Catalog-enabled cluster. The same works fine without a Unity Catalog cluster. Any help is appreciated! Thanks, Sai
- 0 kudos
Did anyone manage to solve this? Currently we can save data using a Unity Catalog-enabled cluster with Access mode: Single User, but we CANNOT with Access mode: Shared. We have tried the steps described in (https://docs.databricks.com/en/connect/external-...
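For reference, a minimal sketch of the BigQuery read in question, assuming the Google service-account credentials are already configured on the cluster as described in the Databricks BigQuery documentation; the project, dataset, and table names are placeholders:

```python
# A sketch only: project, dataset, and table names are placeholders, and the
# service-account credentials are assumed to be configured on the cluster.
df = (
    spark.read.format("bigquery")
    .option("parentProject", "my-gcp-project")              # project billed for the read
    .option("table", "my-gcp-project.my_dataset.my_table")
    .load()
)
display(df.limit(10))
```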