- 3682 Views
- 1 replies
- 0 kudos
Resolved! combining accounts
I have an AWS-based Databricks account with a few workspaces and an Azure Databricks workspace. How do I combine them into one account? I am particularly interested in setting up a single billing drop with all my Databricks costs.
- 0 kudos
Hi @BillGuyTheScien Greetings! Currently we do not have a feature to combine usage from multiple clouds into a single account. We do have a feature request for this, and it is being considered for the future; there is no ETA at the moment. You can bro...
- 1682 Views
- 0 replies
- 0 kudos
Read VCF files using latest runtime version
Hello everyone! I was reading VCF files using the glow library (Maven: io.projectglow:glow-spark3_2.12:1.2.1). The latest version of this library only works with Spark 3.3.2, so if I need to use a newer runtime with a more recent Spark versi...
- 4565 Views
- 1 replies
- 0 kudos
Catalog issue
When I was trying to create a catalog, I got an error saying to mention the Azure storage account and storage container in the following query - CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024 MANAGED LOCATION 'abfss://<databricks-workspace-stack-anu...
- 5873 Views
- 0 replies
- 0 kudos
Run spark code in notebook by setting spark conf instead of databricks connect configure in runtime
Hi community, I wanted to understand if there is a way to pass config values to the Spark session at runtime rather than using databricks-connect configure to run Spark code. One way I found is given here: https://stackoverflow.com/questions/63088121/config...
- 4803 Views
- 4 replies
- 1 kudos
sparkR.session
Why might this be erroring out? My understanding is that SparkR is built into Databricks.
Code: library(SparkR, include.only=c('read.parquet', 'collect')) sparkR.session()
Error: Error in sparkR.session(): could not find function "sparkR.session"
- 1 kudos
It happens with any code; even something as simple as... x <- 2 + 2
- 3154 Views
- 1 replies
- 0 kudos
How to add a delay between Databricks workflow job tasks?
I want to add an explicit time delay between Databricks workflow job tasks; any help would be greatly appreciated. Thanks
- 0 kudos
@Milliman - you could add min_retry_interval_millis to add a delay between the start of a failed run and the subsequent retry run. Reference is here
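One common workaround (not mentioned in the thread, just a sketch) is to insert a small intermediate task between the two tasks whose only job is to sleep; min_retry_interval_millis, by contrast, only spaces out retries of a failed run. The helper below is plain Python with an assumed delay parameter:

```python
import time

def delay_task(seconds: float) -> float:
    """Body of a hypothetical intermediate workflow task whose only
    job is to pause before downstream tasks are allowed to start.
    Returns the actual elapsed time for logging."""
    start = time.monotonic()
    time.sleep(seconds)
    return time.monotonic() - start

elapsed = delay_task(0.1)
print(f"waited {elapsed:.2f}s")
```

In a job definition this would run as its own notebook or Python task placed between the two tasks that need the gap, with the downstream task depending on it.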
- 1627 Views
- 0 replies
- 0 kudos
workspace level sso gives authentication failed error
We have enabled workspace-level SSO, and have the v2.0 version of Databricks using Azure Entra ID groups and Azure applications. Values in both Databricks and the Azure application match. Still, we get an SSO authentication failed error. How can this be resolved? SAML tracer...
- 3813 Views
- 0 replies
- 0 kudos
🌟 Welcome Newcomers! 🌟
Hello and welcome to our wonderful Community! Whether you are here by chance or intention, we're thrilled to have you join us. Before you dive into the plethora of discussions and activities happening here, we'd love to get to know you better! ...
- 1084 Views
- 0 replies
- 0 kudos
This job uses a format which has been deprecated since 2016
After creating a Databricks job using CLI v0.214.0 from a JSON input, I see the following message in the UI: "This job uses a format which has been deprecated since 2016, update it to dependent libraries automatically or learn more". When I update it, I...
- 5459 Views
- 0 replies
- 0 kudos
TIMEZONE
Can I get some help from Databricks to understand how these timestamps are being interpreted? Some are really confusing me. I have timestamps coming into AWS Databricks as String type, and the string timestamps are represented in UTC. I ran the below qu...
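For context (not from the thread): Spark parses timestamp strings using the session time zone (spark.sql.session.timeZone), which is a frequent source of exactly this confusion when the strings actually represent UTC. As a plain-Python illustration (stdlib only, not Spark code), tagging the parsed value explicitly as UTC removes the ambiguity before any local-zone conversion:

```python
from datetime import datetime, timezone

raw = "2024-02-27 10:30:00"  # hypothetical string known to represent UTC

# Parse the naive string, then attach UTC explicitly so any later
# conversion to a local time zone is unambiguous.
ts = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
print(ts.isoformat())  # 2024-02-27T10:30:00+00:00
```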
- 10436 Views
- 1 replies
- 0 kudos
Syntax of UPDATE Command in DataBricks
Hi All, I am testing the SQL generated by our ETL software to see if it can run on Databricks SQL, which I believe is Delta tables underneath. This is the statement we are testing. As far as I can tell from the manual, the FROM clause is not supported ...
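For context (not from the thread): the UPDATE statement in Databricks SQL indeed has no FROM clause, and the usual rewrite for an "update from another table" pattern on Delta tables is MERGE INTO. A sketch with hypothetical table and column names:

```sql
-- Hypothetical tables: rewrite "UPDATE target ... FROM source" as a MERGE
MERGE INTO target AS t
USING source AS s
  ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET t.amount = s.amount;
```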
- 1361 Views
- 1 replies
- 0 kudos
Workspace level changes in databricks
Hi Team, Is there any way in Databricks to track the changes that happen at the notebook level? I know that using system tables we can track the changes that happen to tables, views, functions, and other objects. However, is there any way to track all those changes in...
- 0 kudos
Hi @ManiTejaG, hope you are doing well! You can use the audit logs to capture the notebook events that are logged at the workspace level. Please refer to this for more details: https://docs.databricks.com/en/administration-guide/account-settings/a...
- 6610 Views
- 4 replies
- 1 kudos
Resolved! Unable to use the bamboolib in a databricks notebook
I am getting the same error message below while trying to use bamboolib in a Databricks notebook. Error: Could not display the widget. Please re-run the cell. I have installed the PyPI libraries for bamboolib on the cluster, and the Databricks runtime v...
- 3268 Views
- 1 replies
- 0 kudos
Bamboolib Error: Could not display the widget. Please re-run the cell.
Trying to run bamboolib via the command bam, I get: Error: Could not display the widget. Please re-run the cell. I get this sometimes, not others; other times the bam UI works. Help
- 0 kudos
I am getting the same error message below while trying to use bamboolib in a Databricks notebook. Error: Could not display the widget. Please re-run the cell. I have installed the PyPI libraries for bamboolib on the cluster, and the Databricks runtime v...
- 2811 Views
- 0 replies
- 0 kudos
Pandas_udf max batch size not working in notebook
Hello, I am trying to set the max batch size for a pandas_udf in a Databricks notebook, but in my tests it doesn't have any effect on the batch size. spark.conf.set("spark.sql.execution.arrow.enabled", "true") spark.conf.set('spark.sql.execution.arrow.maxRecordsPerBatch...
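For reference (not from the thread): spark.sql.execution.arrow.maxRecordsPerBatch caps how many rows Spark packs into each Arrow batch handed to a pandas UDF, and spark.sql.execution.arrow.enabled is the legacy name of the flag (recent Spark versions use spark.sql.execution.arrow.pyspark.enabled). The plain-Python sketch below only mimics that chunking behaviour so the effect of the cap is visible; it is not Spark code:

```python
from itertools import islice

def arrow_like_batches(rows, max_records_per_batch):
    """Yield rows in chunks of at most max_records_per_batch,
    mimicking how Arrow sizes the batches fed to a pandas UDF."""
    it = iter(rows)
    while chunk := list(islice(it, max_records_per_batch)):
        yield chunk

# 10 rows with a cap of 4 per batch -> batches of 4, 4, and 2 rows
sizes = [len(b) for b in arrow_like_batches(range(10), 4)]
print(sizes)  # [4, 4, 2]
```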