- 8143 Views
- 5 replies
- 3 kudos
Resolved! Databricks runtime 14.3 gives error scala.math.BigInt cannot be cast to java.lang.Integer
We have a cluster running on 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12). We want to test with a different type of cluster (14.3 LTS, includes Apache Spark 3.5.0, Scala 2.12), and all of a sudden we get errors that complain about casting a Big...
- 3 kudos
I logged the issue with Microsoft last week and they confirmed it is a Databricks bug. A fix is supposedly being rolled out at the moment across Databricks regions. As anticipated, we have engaged the Databricks core team to further investigate ...
- 1755 Views
- 0 replies
- 0 kudos
Broadcasted table reuse
In Spark, table1 is small, so it is broadcast and joined with table2; the output is stored in df1. Later, table1 needs to be joined with table3 and that output stored in df2. Does it need to be broadcast again?
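A short PySpark sketch of the point under discussion, with hypothetical table names: broadcast() is only a planner hint attached to a DataFrame, so you apply it in each join. Within a single query plan Spark's exchange-reuse rule may avoid rebuilding the broadcast, but across separate jobs it will broadcast again; Spark may also auto-broadcast small tables on its own when they fall under spark.sql.autoBroadcastJoinThreshold.

```python
from pyspark.sql import functions as F

small = spark.table("table1")  # hypothetical small table

# Hint the broadcast in each join; each join plans its own broadcast exchange.
df1 = spark.table("table2").join(F.broadcast(small), "id")
df2 = spark.table("table3").join(F.broadcast(small), "id")
```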
- 2101 Views
- 0 replies
- 0 kudos
How to run a group of cells in Databricks?
Hello, I was experimenting with an ML model with different parameters and checking the results. However, the important part of this code is contained in a couple of cells (say cells #12, #13 & #14). I like to proceed to the next cell only when the results a...
- 3603 Views
- 1 replies
- 0 kudos
Unable to read data from API due to Private IP Restriction
I have data in my API endpoint but am unable to read it using Databricks. Access to the data is limited to my private IP address and is only possible over a VPN connection; as a result, I can't read the data into Databricks. I can obtain the data in VS C...
- 0 kudos
Hi AravindNani, this is more of an infrastructure question. You have to make sure that: 1) your Databricks workspace is provisioned in VNet injection mode; 2) your VNet is either peered to a "HUB" network where you have a S2S VPN connection to the API, or you have t...
- 3607 Views
- 1 replies
- 0 kudos
Using the API for getting cost in USD
I'm trying to use the billable usage API, and I do get a report, but I have not been able to get the USD cost report, only the dbuHours. I guess I have to change the meter_name, but I cannot find the key for that parameter anywhere.
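The usage report itself only carries DBU quantities, not dollar amounts. One commonly used alternative, sketched below, is to join the system.billing.usage table with system.billing.list_prices; this assumes system tables are enabled for your account, and the column names follow the documented schema.

```python
# Estimate USD cost by multiplying usage quantities by the list price
# that was in effect when the usage occurred.
usd_cost = spark.sql("""
    SELECT
        u.usage_date,
        u.sku_name,
        SUM(u.usage_quantity * p.pricing.default) AS usd_cost
    FROM system.billing.usage AS u
    JOIN system.billing.list_prices AS p
      ON u.sku_name = p.sku_name
     AND u.usage_start_time >= p.price_start_time
     AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
    GROUP BY u.usage_date, u.sku_name
""")
usd_cost.display()
```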
- 4952 Views
- 1 replies
- 0 kudos
Databricks email notification
In Databricks, if a job fails, an email is sent as a notification. The recipient receives the email with a link to the Databricks workspace. Question: how is it possible to send the email without any link, so that just the plain text in the email is w...
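The built-in job notification email is not templatable from the Jobs UI, so the link cannot simply be switched off there. One workaround, a minimal sketch with a hypothetical SMTP relay and addresses, is to send your own plain-text email from a notebook task (for example a final task that only runs on failure):

```python
import smtplib
from email.message import EmailMessage

# Hypothetical SMTP relay and addresses; keep real credentials in a secret scope.
msg = EmailMessage()
msg["Subject"] = "Databricks job failed"
msg["From"] = "jobs@example.com"
msg["To"] = "oncall@example.com"
msg.set_content("Job my_job failed. Plain text only, no links.")

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.login("jobs@example.com", dbutils.secrets.get("mail", "smtp-password"))
    smtp.send_message(msg)
```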
- 2138 Views
- 0 replies
- 0 kudos
Set up Azure Databricks workspace and Unity Catalog - how to automate without using Terraform
Hi everyone, I am looking for a way to automate the initial setup of an Azure Databricks workspace and Unity Catalog, but I can't find anything on this topic other than Terraform. Can you share if this is possible with PowerShell, for example? Thank you in adv...
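Terraform is only one client of the same REST APIs, which PowerShell can call with Invoke-RestMethod just as well. A minimal Python sketch of the Unity Catalog part, assuming the workspace already exists and using hypothetical host, token, names, and workspace ID:

```python
import requests

HOST = "https://adb-1234567890.12.azuredatabricks.net"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"                        # hypothetical token
headers = {"Authorization": f"Bearer {TOKEN}"}

# Create a Unity Catalog metastore.
m = requests.post(
    f"{HOST}/api/2.1/unity-catalog/metastores",
    headers=headers,
    json={"name": "primary-metastore", "region": "westeurope"},
).json()

# Assign the metastore to a workspace.
requests.put(
    f"{HOST}/api/2.1/unity-catalog/workspaces/1234567890/metastore",
    headers=headers,
    json={"metastore_id": m["metastore_id"], "default_catalog_name": "main"},
)
```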
- 10222 Views
- 3 replies
- 1 kudos
Fuzzy Match on PySpark using UDF/Pandas UDF
I'm trying to do fuzzy matching on two dataframes by cross-joining them and then using a UDF for the fuzzy matching. But with both a Python UDF and a pandas UDF it is either very slow or I get an error. @pandas_udf("int") def core_match_processor(s1: pd.Ser...
- 1 kudos
I'm now getting the error: (SQL_GROUPED_AGG_PANDAS_UDF) is not supported on clusters in Shared access mode. Even though this article clearly states that pandas UDFs are supported on shared clusters in Databricks: https://www.databricks.com/blog/shared-clu...
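For reference, a scalar pandas UDF (rather than a grouped-aggregate one, which the shared-access-mode error above refers to) is enough for pairwise fuzzy scoring. A minimal sketch, assuming rapidfuzz is installed on the cluster and using hypothetical DataFrame and column names:

```python
import pandas as pd
from pyspark.sql.functions import pandas_udf
from rapidfuzz import fuzz  # assumed installed on the cluster

@pandas_udf("int")
def fuzzy_score(s1: pd.Series, s2: pd.Series) -> pd.Series:
    # Scalar (element-wise) pandas UDF: one similarity score per row pair.
    return pd.Series([int(fuzz.ratio(a, b)) for a, b in zip(s1, s2)])

matches = (
    df_a.crossJoin(df_b)  # hypothetical DataFrames
    .withColumn("score", fuzzy_score("name_a", "name_b"))
    .filter("score >= 90")
)
```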
- 6716 Views
- 2 replies
- 1 kudos
Resolved! Okta and Unified login
Hey folks, has anyone put Databricks behind Okta and enabled Unified Login with some workspaces that have a Unity Catalog metastore applied and some that don't? There are some workspaces we can't move over yet, and it isn't clear in the documentation if Unity Catalo...
- 1 kudos
Yes, users should be able to use a single Okta application for all workspaces, regardless of whether the Unity Catalog metastore has been applied or not. The Unity Catalog is a feature that allows you to manage and secure access to your data across a...
- 1322 Views
- 0 replies
- 0 kudos
Public preview API not working - artifact-allowlists
I am trying to hit /api/2.1/unity-catalog/artifact-allowlists/ as part of an INIT script migration. It is in public preview; do we need to enable anything else to use an API that is in public preview? I am getting a 404 error, but using the same token for ...
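For what it's worth, the artifact-allowlists path ends with an artifact type such as INIT_SCRIPT; hitting the bare /artifact-allowlists/ path is one plausible cause of a 404. A minimal sketch with a hypothetical host and token:

```python
import requests

HOST = "https://adb-1234567890.12.azuredatabricks.net"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"                        # hypothetical token

resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/artifact-allowlists/INIT_SCRIPT",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code, resp.text)
```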
- 2955 Views
- 1 replies
- 0 kudos
How to enable "Create Vector Search Index" button in DB workspace?
How to enable "Create Vector Search Index" button in DB workspace?Following is the screenshot from the Microsoft Ignite 2023 Databricks presentation:
- 0 kudos
The feature is in public preview only in some regions; you can check the available regions in the documentation here. In addition, there are certain requirements, such as a UC-enabled workspace and Serverless Compute enabled; you can check all requir...
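Where the preview is enabled, the index can also be created from code with the databricks-vectorsearch package rather than the button. A minimal sketch with hypothetical endpoint, table, and model names (exact arguments may differ by package version):

```python
from databricks.vector_search.client import VectorSearchClient

client = VectorSearchClient()
client.create_endpoint(name="vs_endpoint", endpoint_type="STANDARD")

# Delta Sync index over a hypothetical source table, embedding the "text" column.
index = client.create_delta_sync_index(
    endpoint_name="vs_endpoint",
    index_name="main.default.docs_index",
    source_table_name="main.default.docs",
    pipeline_type="TRIGGERED",
    primary_key="id",
    embedding_source_column="text",
    embedding_model_endpoint_name="databricks-bge-large-en",
)
```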
- 4620 Views
- 5 replies
- 0 kudos
CONVERT_TIMEZONE issue in DLT
I can run a query that uses the CONVERT_TIMEZONE function in a SQL notebook. When I move the code to my DLT notebook, the pipeline produces this error: Cannot resolve function `CONVERT_TIMEZONE`. Here is the line: CONVERT_TIMEZONE('UTC', 'America/Phoen...
- 0 kudos
Yes, the notebook is set to SQL and the convert_timezone function is within a select statement.
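One workaround while the function does not resolve in the DLT runtime: from_utc_timestamp is a long-standing Spark SQL built-in and is available there. A minimal sketch in DLT Python, with hypothetical table and column names:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table
def events_local():
    # from_utc_timestamp has been in Spark for many releases,
    # so it resolves where CONVERT_TIMEZONE does not.
    return (
        spark.read.table("events")  # hypothetical source table
        .withColumn("ts_phoenix", F.from_utc_timestamp("ts_utc", "America/Phoenix"))
    )
```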
- 7262 Views
- 2 replies
- 1 kudos
Can we get the actual query execution plan programmatically after a query is executed? Apart from UI
Let's say I have run a query and it showed me results. We can find the respective query execution plan in the UI. Is there any way we can get that execution plan programmatically or through an API?
- 1 kudos
You can obtain the query execution plan programmatically using the EXPLAIN statement in SQL. The EXPLAIN statement displays the execution plan that the database planner generates for the supplied statement. The execution plan shows how the table(s) r...
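A minimal sketch of both routes (the table name is hypothetical): EXPLAIN as a SQL statement returns the plan as a result row, and DataFrame.explain prints it directly.

```python
# Plan returned as a query result:
plan = spark.sql("EXPLAIN FORMATTED SELECT * FROM my_catalog.my_schema.trips LIMIT 10")
print(plan.collect()[0][0])

# Plan printed straight from a DataFrame:
df = spark.table("my_catalog.my_schema.trips").limit(10)
df.explain(mode="formatted")
```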
- 3103 Views
- 2 replies
- 4 kudos
Top Kudoed Author 🌟🤩🧑🎤
I recently saw a link to the Kudos Leaderboard for the Community Discussions. It has always been my hope and fantasy, ever since I was a little child, that I would someday be the #1 Kudoed Author on Community Discussions on community.Databricks.com....
- 8939 Views
- 5 replies
- 7 kudos
Incremental ingestion of Snowflake data with Delta Live Table (CDC)
Hello, I have some data sitting in Snowflake, and I want to apply CDC to it using Delta Live Tables, but I am having some issues. Here is what I am trying to do: @dlt.view() def table1(): return spark.read.format("snowflake").options(**opt...
- 7 kudos
CDC for Delta Live Tables works fine for Delta tables, as you have noticed. However, it is not a full-blown CDC implementation. If you want to capture changes in Snowflake, you will have to implement some CDC method on Snowflake itself, and read...
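As a concrete shape for that approach, a minimal sketch: Snowflake-side change rows (e.g. from a Snowflake STREAM) are exported to cloud storage, picked up with Auto Loader, and folded into a target with dlt.apply_changes. The path and column names here are hypothetical.

```python
import dlt
from pyspark.sql import functions as F

@dlt.view
def snowflake_changes():
    # Change rows exported from Snowflake to cloud storage (hypothetical path/schema).
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .load("abfss://cdc@mystorage.dfs.core.windows.net/my_table/")
    )

dlt.create_streaming_table("my_table")

dlt.apply_changes(
    target="my_table",
    source="snowflake_changes",
    keys=["id"],
    sequence_by=F.col("change_ts"),
    apply_as_deletes=F.expr("op = 'DELETE'"),
)
```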
Labels: .CSV (1), Access Data (2), Access Databricks (1), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Advanced Data Engineering (1), AI (3), Analytics (1), Apache spark (1), Apache Spark 3.0 (1), Api Calls (1), API Documentation (3), App (1), Architecture (1), asset bundle (1), Asset Bundles (3), Auto-loader (1), Autoloader (4), AWS security token (1), AWSDatabricksCluster (1), Azure (6), Azure data disk (1), Azure databricks (14), Azure Databricks SQL (6), Azure databricks workspace (1), Azure Unity Catalog (5), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (1), Blackduck (1), Bronze Layer (1), Certification (3), Certification Exam (1), Certification Voucher (3), CICDForDatabricksWorkflows (1), Cloud_files_state (1), CloudFiles (1), Cluster (3), Cluster Init Script (1), Community Edition (3), Community Event (1), Community Group (2), Community Members (1), Compute (3), Compute Instances (1), conditional tasks (1), Connection (1), Contest (1), Credentials (1), Custom Python (1), CustomLibrary (1), Data (1), Data + AI Summit (1), Data Engineer Associate (1), Data Engineering (3), Data Explorer (1), Data Ingestion & connectivity (1), Data Processing (1), Databrick add-on for Splunk (1), databricks (2), Databricks Academy (1), Databricks AI + Data Summit (1), Databricks Alerts (1), Databricks App (1), Databricks Assistant (1), Databricks Certification (1), Databricks Cluster (2), Databricks Clusters (1), Databricks Community (10), Databricks community edition (3), Databricks Community Edition Account (1), Databricks Community Rewards Store (3), Databricks connect (1), Databricks Dashboard (3), Databricks delta (2), Databricks Delta Table (2), Databricks Demo Center (1), Databricks Documentation (3), Databricks genAI associate (1), Databricks JDBC Driver (1), Databricks Job (1), Databricks Lakehouse Platform (6), Databricks Migration (1), Databricks Model (1), Databricks notebook (2), Databricks Notebooks (3), Databricks Platform (2), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks Runtime (1), Databricks SQL (5), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks Terraform (1), Databricks UI (1), Databricks Unity Catalog (4), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (3), Databricks-connect (1), databricks_cluster_policy (1), DatabricksJobCluster (1), DataCleanroom (1), DataDays (1), Datagrip (1), DataMasking (2), DataVersioning (1), dbdemos (2), DBFS (1), DBRuntime (1), DBSQL (1), DDL (1), Dear Community (1), deduplication (1), Delt Lake (1), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (7), Delta Sharing (2), deltaSharing (1), Deny assignment (1), Development (1), Devops (1), DLT (10), DLT Pipeline (7), DLT Pipelines (5), Dolly (1), Download files (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), ETL Pipelines (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (2), Feature Store (2), Filenotfoundexception (1), Free trial (1), GCP Databricks (1), GenAI (1), Getting started (2), Google Bigquery (1), HIPAA (1), Hubert Dudek (2), import (1), Integration (1), JDBC Connections (1), JDBC Connector (1), Job Task (1), Lineage (1), LLM (1), Login (1), Login Account (1), Machine Learning (3), MachineLearning (1), Materialized Tables (2), Medallion Architecture (1), Migration (1), ML Model (2), MlFlow (2), Model Training (1), Module (1), Networking (1), Notebook (1), Onboarding Trainings (1), OpenAI (1), Pandas udf (1), Permissions (1), personalcompute (1), Pipeline (2), Plotly (1), PostgresSQL (1), Pricing (1), Pyspark (1), Python (5), Python Code (1), Python Wheel (1), Quickstart (1), Read data (1), Repos Support (1), Reset (1), Rewards Store (2), Schedule (1), Serverless (3), serving endpoint (1), Session (1), Sign Up Issues (2), Software Development (1), Spark Connect (1), sparkui (2), Splunk (2), SQL (8), Summit23 (7), Support Tickets (1), Sydney (2), Table Download (1), Tags (1), terraform (1), Training (2), Troubleshooting (1), Unity Catalog (4), Unity Catalog Metastore (2), Update (1), user groups (1), Venicold (3), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), Weekly Release Notes (2), Women (1), Workflow (2), Workspace (3)