- 1551 Views
- 5 replies
- 0 kudos
Issue with Adding New Members to Existing Groups During Migration in User Group Service Principal
Hi all, I have implemented a migration process to move groups from a source workspace to a target workspace using the following code. The code successfully migrates groups and their members to the target system, but I am facing an issue when it comes...
- 0 kudos
I have provided a response in https://community.databricks.com/t5/get-started-discussions/migrating-service-principals-from-non-unity-to-unity-enabled/m-p/103017#M4679
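For anyone hitting the same problem, a minimal sketch (not code from this thread) of adding members to a group that already exists in the target workspace via the SCIM Groups PATCH endpoint, rather than recreating the group; the host, token, group ID, and member IDs are placeholders:

```python
import requests

# Placeholders, not values from this thread.
TARGET_HOST = "https://<target-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
group_id = "<existing-group-id-in-target>"
member_ids = ["<user-or-service-principal-id-1>", "<user-or-service-principal-id-2>"]

# SCIM PATCH adds members to the existing group instead of failing on a duplicate create.
payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [
        {"op": "add", "value": {"members": [{"value": m} for m in member_ids]}}
    ],
}

resp = requests.patch(
    f"{TARGET_HOST}/api/2.0/preview/scim/v2/Groups/{group_id}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```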
- 2218 Views
- 1 reply
- 1 kudos
Resolved! Why does a join on (df1.id == df2.id) result in duplicate columns while on="id" does not?
Why does a join with on (df1.id == df2.id) result in duplicate columns, but on="id" does not? I encountered an interesting behavior while performing a join on two DataFrames. Here's the scenario: df1 = spark.createDataFrame([(1, "Alice"), (2, "Bob"),...
- 1 kudos
Hi @Tanay, your intuition is correct here. In Apache Spark, the difference in behavior between on (df1.id == df2.id) and on="id" in a join stems from how Spark resolves and handles column naming during the join operation. When you use the first synta...
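To make the difference concrete, a small runnable sketch (the sample rows are illustrative):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df1 = spark.createDataFrame([(1, "Alice"), (2, "Bob")], ["id", "name"])
df2 = spark.createDataFrame([(1, "HR"), (2, "Engineering")], ["id", "dept"])

# Expression join: both source columns are kept, so the result carries two `id` columns.
joined_expr = df1.join(df2, df1.id == df2.id, "inner")
print(joined_expr.columns)  # ['id', 'name', 'id', 'dept']

# Join on the column name: Spark treats it like SQL USING and keeps a single `id`.
joined_on_name = df1.join(df2, on="id", how="inner")
print(joined_on_name.columns)  # ['id', 'name', 'dept']
```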
- 754 Views
- 2 replies
- 0 kudos
Create multiple dashboard subscriptions with filters
Hi Databricks community, we developed a dashboard that surfaces several important KPIs for each project we have. In the top filter, we select the project name and the time frame, and the dashboard presents the relevant KPIs and charts. I can eas...
- 0 kudos
You can achieve this by setting up a different schedule for each project and specifying the default filter values accordingly. Create the Dashboard: ensure your dashboard is set up with the necessary filters, including the project filter. Set Defau...
- 7107 Views
- 2 replies
- 0 kudos
PPT material or document from Databricks Learning
Hello Databricks Community, I am a beginner with Databricks. I am wondering whether we can download PowerPoint slides or learning documents from the Databricks Learning Platform. I like to read the material after taking the online course. Could you let me know? Curren...
- 456 Views
- 1 reply
- 0 kudos
Databricks, Cloud Services Pricing
I am unable to find the reason why I am not getting Databricks and Cloud Services pricing. Why is that?
- 0 kudos
Can you please provide some more context on the issue you are facing so that we can properly assist you?
- 1376 Views
- 3 replies
- 1 kudos
Delta Sharing vs Cosmos DB
Hi all, we have a situation where we write data to Cosmos DB and create JSON data for a transaction table, which includes a mini statement in JSON format. Now, we want to introduce the concept of Delta Sharing and share the transaction table. The Java ...
- 1 kudos
Thanks for your reply. Right now, the team is transferring data from Databricks to Cosmos DB, and then they're using REST APIs to access that data. They handle about 100 requests per minute, with some tables needing around 100 requests per second due...
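For reference, an external (non-Databricks) consumer would typically read the shared transaction table through a recipient profile file and the open delta-sharing connector rather than per-row REST calls; a minimal sketch with placeholder share, schema, and table names:

```python
import delta_sharing  # pip install delta-sharing

# Placeholder paths/names; the recipient receives the .share profile file from the provider.
profile = "/path/to/config.share"
table_url = profile + "#transactions_share.finance.transaction_table"

# Load the shared table into pandas; on Spark, the equivalent is
# spark.read.format("deltaSharing").load(table_url).
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```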
- 4330 Views
- 4 replies
- 0 kudos
How to get the Usage/DBU Consumption report without using system tables
Is there a way to get the usage/DBU consumption report without using system tables?
- 0 kudos
You can get DBU consumption reports using the Azure Portal (for Azure SQL), through Metrics under your database's "Usage" section, or via Dynamic Management Views (DMVs) like sys.dm_db_resource_stats in SSMS. Third-party tools like SQL Sentry also of...
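For Databricks specifically, one option that avoids system tables is the account-level billable usage download endpoint, which returns DBU usage as CSV. A hedged sketch assuming an AWS Databricks account and account-admin credentials (account ID, token, and months are placeholders):

```python
import requests

# Placeholders; requires account-admin access on the Databricks accounts console.
ACCOUNT_HOST = "https://accounts.cloud.databricks.com"
ACCOUNT_ID = "<databricks-account-id>"
TOKEN = "<account-admin-token>"

# Download billable usage (DBUs) as CSV for a month range, without querying system tables.
resp = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/usage/download",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"start_month": "2024-01", "end_month": "2024-03", "personal_data": "false"},
)
resp.raise_for_status()

with open("billable_usage.csv", "wb") as f:
    f.write(resp.content)
```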
- 924 Views
- 5 replies
- 1 kudos
I can't log in to my Databricks Community account
Hey everyone, I can't log in to my Databricks Community account. It says there's nothing created under my email, but I've had this account for quite a while and this has never happened before. I even tried to create another account with the same email, but I can't crea...
- 1 kudos
Unfortunately, I don't have any URL.
- 1453 Views
- 2 replies
- 0 kudos
Calling the w.genie function throws an "API is not yet supported in the workspace" error. [0.39.0]
Hi everyone, I've been trying to call the Databricks Genie function, but even on the latest build it throws an error stating: w.genie API is not yet supported in the workspace. Here is the output of the logs: > { > "content": "**REDACTED**" > } < { < "err...
- 0 kudos
Hi @gluedhawkeye, I tested this on my own and I'm getting the same error. This is the same code as used here, but it carries a note: "This script implements an experimental chatbot that interacts with Databricks' Genie API, which is currently in Private Pr...
- 4476 Views
- 2 replies
- 0 kudos
I was charged during a free trial
Hello Databricks community, I took a Databricks course to prepare for a certification exam and requested a 14-day free trial on February 13 at 4:51 PM. So February 27 at 4:51 PM should be the end of the free trial, but it ended one day earlier. Additional...
- 0 kudos
Hello, @santiagortiiz! It looks like you were charged for AWS services, not for Databricks DBUs. In your screenshots, I see different amounts.
- 1127 Views
- 1 reply
- 0 kudos
Databricks and Cloud Services Pricing
Hi, if I connect Databricks (trial version) with AWS/Azure/Google Cloud and then work on dashboards and Genie, will there be any charges, even minimal ones, or is it completely free to use the cloud services?
- 0 kudos
Either way, you will pay for the cloud provider's resources (VMs, IPs, etc.).
- 819 Views
- 2 replies
- 0 kudos
Issue with Percentage Calculation in Power BI Using Databricks as Source
Hi everyone, I've created a financial summary report in Power BI, and my source is Databricks. I have created a view for each financial metric name along with the calculations. All my amount fields are accurate, but when calculating percentages, I’m g...
- 0 kudos
Hello Richie, in Databricks you can use a combination of the NULLIF and COALESCE functions to handle divide-by-zero scenarios effectively. Here's an example of how you can modify your percentage calculation: SELECT MetricNo, MetricName, Amo...
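A fuller sketch of that pattern, wrapped in PySpark for convenience (the view and column names are illustrative, not the poster's actual schema):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# NULLIF turns a zero denominator into NULL so the division yields NULL instead of failing,
# and COALESCE maps that NULL back to 0 for reporting.
pct_df = spark.sql("""
    SELECT
        MetricNo,
        MetricName,
        Amount,
        TotalAmount,
        COALESCE(ROUND(Amount / NULLIF(TotalAmount, 0) * 100, 2), 0) AS AmountPct
    FROM finance_summary_view   -- illustrative view name
""")
pct_df.show()
```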
- 1437 Views
- 2 replies
- 0 kudos
Resolved! Datagrip connection error
I am trying to connect with the DataGrip-provided driver. I am not getting this to work with a token from DataGrip. The connection URL is: jdbc:databricks://dbc-******.cloud.databricks.com:443/***_analytics;httpPath=/sql/1.0/warehouses/ba***3 I am gettin...
- 0 kudos
Hi @Alberto_Umana, thanks. I did create the token in Databricks under User Settings > Access Tokens. I'm not sure how to confirm that it is valid and has the necessary permissions to access the Databricks SQL warehouse. I generated it recently, though.
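One way to check the token independently of DataGrip is to open a connection straight to the warehouse with the databricks-sql-connector; if this works, the token is valid and has the needed permission on the warehouse, and the remaining issue is on the DataGrip/JDBC side. Hostname, HTTP path, and token below are placeholders:

```python
from databricks import sql  # pip install databricks-sql-connector

# Placeholders; copy the real values from the warehouse's "Connection details" tab.
connection = sql.connect(
    server_hostname="dbc-xxxxxxxx.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/xxxxxxxxxxxx",
    access_token="<personal-access-token>",
)

# Fails here if the token is invalid or lacks permission on the warehouse.
cursor = connection.cursor()
cursor.execute("SELECT current_user(), current_catalog()")
print(cursor.fetchall())

cursor.close()
connection.close()
```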
- 8633 Views
- 2 replies
- 0 kudos
SAP SuccessFactors
Hi Team, we are working on onboarding a new data product to the current Databricks Lakehouse Platform. The first step is the foundation, where we need to get data from SAP SuccessFactors into S3 + the Bronze layer and then do the initial setup of Lakehouse + Power B...
- 0 kudos
Hi everyone! Great article, by the way. What's your favorite strategy for winning in online games?
- 1935 Views
- 4 replies
- 0 kudos
gRPC calls are not getting through on Databricks 15.4 LTS
Hi Team, I have updated the Spark version from 3.3.2 to 3.5.0 and switched from Databricks 12.2 LTS to 15.4 LTS so as to get Spark 3.5 on the Databricks compute. We have moved from uploading libraries to DBFS to uploading libraries to Volumes, as 1...
- 0 kudos
And this was working before, correct? That is, when the init script was hosted in DBFS?
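For comparison, the change in the cluster spec when an init script moves from DBFS to a Unity Catalog volume is confined to the init_scripts block; the fragments below are a sketch with placeholder paths, shown as Clusters API payload snippets:

```python
# Fragment of a Clusters API create/edit payload; only init_scripts is shown.
# Paths are placeholders, not the poster's actual locations.
init_scripts_before = [
    {"dbfs": {"destination": "dbfs:/init/install_libs.sh"}}  # legacy DBFS location
]

init_scripts_after = [
    {"volumes": {"destination": "/Volumes/main/ops/scripts/install_libs.sh"}}  # UC volume
]
```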
Labels (post counts):
- .CSV (1)
- Access Data (2)
- Access Databricks (1)
- Access Delta Tables (2)
- Account reset (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (1)
- AI (1)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- API Documentation (3)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (3)
- Auto-loader (1)
- Autoloader (4)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (5)
- Azure data disk (1)
- Azure databricks (14)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (5)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (1)
- Blackduck (1)
- Bronze Layer (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Cluster Init Script (1)
- Community Edition (3)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- Custom Python (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineering (3)
- Data Explorer (1)
- Data Ingestion & connectivity (1)
- Databrick add-on for Splunk (1)
- databricks (2)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (3)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (3)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks Model (1)
- Databricks notebook (2)
- Databricks Notebooks (3)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- databricks_cluster_policy (1)
- DatabricksJobCluster (1)
- DataCleanroom (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free trial (1)
- GCP Databricks (1)
- GenAI (1)
- Getting started (2)
- Google Bigquery (1)
- HIPAA (1)
- import (1)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (2)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model Training (1)
- Module (1)
- Networking (1)
- Notebook (1)
- Onboarding Trainings (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Schedule (1)
- Serverless (3)
- serving endpoint (1)
- Session (1)
- Sign Up Issues (2)
- Spark Connect (1)
- sparkui (2)
- Splunk (2)
- SQL (8)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (1)
- terraform (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (1)
- Venicold (3)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)