- 4941 Views
- 3 replies
- 0 kudos
Tableau Desktop connection error from Mac M1
Hi, I'm getting the below error while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC drivers (2.7.5), but I can confirm that the driver name is different. From the error message I see libsparkodbc_sbu.dylib, but in my lap...
Have you referred to this document? https://help.tableau.com/current/pro/desktop/en-us/examples_databricks.html
- 9276 Views
- 2 replies
- 2 kudos
Resolved! Migrating dashboards from one workspace to another
I'm exporting dashboard objects from an existing workspace to a new workspace, but after importing, the underlying dashboard data is not coming through to the new workspace. I'm using the code below. Can anyone help?
import os
import requests
import json
import logging
...
Hi, you can use the workspace API to import the dashboard: https://learn.microsoft.com/en-us/azure/databricks/dashboards/tutorials/workspace-lakeview-api#step-3-import-a-dashboard A code example is available on this thread: https://community.databric...
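For anyone who finds this later, here is a minimal sketch of that import call using requests, assuming the dashboard was already exported from the source workspace as a .lvdash.json file; the host, token, and paths below are placeholders:

```python
import base64
import requests

HOST = "https://<workspace-host>"    # placeholder
TOKEN = "<personal-access-token>"    # placeholder

# Read a dashboard definition exported from the source workspace.
with open("my_dashboard.lvdash.json", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

# Import it into the target workspace. With format AUTO the API
# infers the object type from the .lvdash.json extension.
resp = requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Users/me@example.com/my_dashboard.lvdash.json",  # placeholder
        "format": "AUTO",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```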
- 3335 Views
- 5 replies
- 1 kudos
Triggering a Databricks workflow on a defined frequency
Hi Databricks, I am trying to run a Databricks workflow on a scheduled basis (e.g. every five minutes). Here is the databricks.yaml file:
bundle:
  name: dab_demo
# include:
#   - resources/*.yml
variables:
  job_cluster_key: desc...
Hi, let me explain the current scenario: we have Databricks workflows with DS, DE, and MLOps tasks. The workflows are meant to be triggered at specific frequencies, i.e. monthly and quarterly, and the quarterly workflow depends on the monthly workf...
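As a sketch of the scheduling part of the question: the Jobs API can attach a Quartz cron schedule to an existing job, and the same schedule block can be declared under the job resource in a bundle's databricks.yaml. Host, token, and job_id below are placeholders:

```python
import requests

HOST = "https://<workspace-host>"    # placeholder
TOKEN = "<personal-access-token>"    # placeholder

# "0 0/5 * * * ?" is Quartz syntax for "every five minutes";
# a monthly run on the 1st would be "0 0 0 1 * ?".
resp = requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123,  # placeholder job id
        "new_settings": {
            "schedule": {
                "quartz_cron_expression": "0 0/5 * * * ?",
                "timezone_id": "UTC",
                "pause_status": "UNPAUSED",
            }
        },
    },
)
resp.raise_for_status()
```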
- 2563 Views
- 1 reply
- 1 kudos
Cannot use Databricks VSCode extensions on repository with ".devcontainer" folder
Hi, I have a repository that contains a ".devcontainer" folder. I use VSCode and try to use the Databricks extension following this guide. When I run "Upload and Run File on Databricks" I get this error and then cannot run the Python script: Sync Error...
Hi @Retired_mod, the folder is plainly present in the repository. When logging in to the Databricks web UI I can see that the folder (together with its content) was correctly copied, but I still get the error message in VSCode.
- 968 Views
- 1 reply
- 0 kudos
Exam got suspended
Hi Team, this morning I was taking the exam when the proctor suddenly asked me to show the room and walls. I showed him, and then he suspended my exam, even though no one was there. I don't understand why he suspended the exam. Kindly reschedule the exam, I need the certificati...
Hi Team, could you please at least reschedule the exam? I will take it from the Prometric center. Thanks.
- 3335 Views
- 2 replies
- 1 kudos
R Package Installation Best Practices
Hello, we are new to Databricks and are wondering what the best practices are for R package installation. We currently have cluster spin-up wait times of more than 20 minutes with our init scripts. We have tried the following: 1. Libraries tab in the c...
@Retired_mod Thank you for your detailed response! I think we would like to use Docker if we can, because we are not using RStudio but R directly in the Databricks notebooks and workflows. So, any more information about R and Docker and Databricks woul...
- 6773 Views
- 6 replies
- 2 kudos
data quality check in data engineering
Can we use the deequ library with Azure Databricks? If yes, please provide some support material or examples. Is there any similar data quality library, or a suggestion for achieving automatic data quality checks during data engineering (Azure Databricks)? Thanks i...
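Deequ is usable from Azure Databricks via its Python wrapper PyDeequ (pip-install pydeequ and attach the matching deequ Maven JAR to the cluster). A minimal sketch, with the table and column names as placeholders; `spark` is the session predefined in Databricks notebooks:

```python
import os
os.environ["SPARK_VERSION"] = "3.3"  # pydeequ uses this to pick the matching deequ version

from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationResult, VerificationSuite

df = spark.table("bronze.orders")  # placeholder table

# Declare constraints: completeness, uniqueness, non-negativity.
check = (
    Check(spark, CheckLevel.Error, "orders quality")
    .isComplete("order_id")    # no NULLs allowed
    .isUnique("order_id")      # primary-key style check
    .isNonNegative("amount")   # placeholder numeric column
)

# Run the verification and inspect the per-constraint results.
result = VerificationSuite(spark).onData(df).addCheck(check).run()
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)
```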
Hi there! You could also take a look at Rudol, which enables no-code data quality validations so that non-technical roles such as Business Analysts or Data Stewards can configure quality checks by themselves.
- 4411 Views
- 8 replies
- 9 kudos
Resolved! Facing StorageContext Error while trying to access DBFS
This issue has hindered my practice for the whole day. I scoured the web and couldn't find anybody who has faced this particular error. The error I am getting in the DBFS file browser is: StorageContext com.databricks.backend.storage.StorageContextType$DbfsR...
Yeah, unable to save any file with rdd.saveAsTextFile, and unable to upload any file using the workspace.
- 4431 Views
- 5 replies
- 5 kudos
Resolved! Unable to upload files from DBFS
When I click on upload I see the below error: StorageContext com.databricks.backend.storage.StorageContextType$DbfsRoot$@2f3c3220 for workspace 1406865167171326 is not set in the CustomerStorageInfo.
Same issue at my end since yesterday. Does anyone know the reason, and what needs to be done from our side (if anything) to fix this?
- 959 Views
- 0 replies
- 0 kudos
DBCU plans are costlier vs. Job Compute Premium at $0.30 per DBU. Please justify
Please help me understand the % of savings and how Databricks calculates DBCU. They are telling me that if I take the 12500 DBCU plan, the discounted price will be 12000, a 4% discount. That means if I consume 12500 DBU, I am paying $12000 for this and getting 4% sa...
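Working through the numbers quoted above (a back-of-the-envelope check, not official pricing): paying $12,000 up front for a commitment that covers $12,500 worth of DBUs is exactly the 4% being advertised.

```python
# Numbers quoted in the post (illustrative, not official pricing).
plan_price = 12_000.0  # discounted pre-purchase price, in $
list_price = 12_500.0  # pay-as-you-go cost of the same DBU consumption, in $

savings_pct = (list_price - plan_price) / list_price * 100
print(f"Savings: {savings_pct:.1f}%")  # -> Savings: 4.0%
```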
- 1653 Views
- 2 replies
- 2 kudos
Resolved! I am facing issue with DBFS File server
Hi, I am facing an issue with the DBFS file server. Can anyone guide me on how to resolve it? What steps should I take to resolve the storage issue?
- 1900 Views
- 3 replies
- 3 kudos
Unable to Install Python Wheel Library
Hello Team, can someone let me know if there have been changes to Databricks Community Edition such that it's no longer possible to install Python Wheel libraries? I was able to install Python Wheel libraries as recently as a few days ago, but now...
Hi there, check your runtime version to confirm you are using one that supports that. Also, I think the recommended pattern nowadays (don't quote me) is to store your wheel files in the workspace tree.
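As an illustration of that pattern, a wheel uploaded to the workspace file tree can be installed from a notebook with the %pip magic; the path below is a placeholder:

```python
# Cell 1: install a wheel stored in the workspace tree
# (adjust the path to your user folder and wheel name).
%pip install /Workspace/Users/me@example.com/wheels/mypkg-0.1.0-py3-none-any.whl

# Cell 2: restart Python so the newly installed package is importable.
dbutils.library.restartPython()
```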
- 3031 Views
- 3 replies
- 4 kudos
Resolved! How does AutoLoader work when triggered via Azure Data Factory?
Hi, I am currently creating an Auto Loader job in Databricks and will be using ADF as an orchestrator. I am quite confused about how this will handle my data, so please clarify if I have misunderstood it. First, I will run my ADF pipeline, which includes an activity to c...
Auto Loader will process files incrementally. Let's say you have files in an existing directory called /input_files. The first time you run Auto Loader, it will read all files in that directory (unless you set the option includeExistingFiles to false, like y...
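A minimal Auto Loader sketch illustrating that option; the source format, paths, and target table are placeholders:

```python
# Read new files incrementally with Auto Loader; skip files that
# already existed in the directory before the stream first started.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")                 # source file format
    .option("cloudFiles.includeExistingFiles", "false")  # only pick up new arrivals
    .option("cloudFiles.schemaLocation", "/tmp/schema")  # placeholder path
    .load("/input_files")
)

# availableNow processes everything pending and then stops, which
# fits a run triggered by an external orchestrator such as ADF.
(
    df.writeStream
    .option("checkpointLocation", "/tmp/checkpoint")     # placeholder path
    .trigger(availableNow=True)
    .toTable("bronze.input_files")                       # placeholder table
)
```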
- 4326 Views
- 3 replies
- 2 kudos
How can I deduplicate data from my stream?
Hi, I'm new to Databricks and I'm trying to use a stream for my incremental data. This data has duplicates, which can be solved using a window function. Can you check where my code goes wrong?
# Using Auto Loader to read new files
schema = df1.sche...
Hi @zll_0091, change the output mode to update. Other than that, your code looks fine, but I would rename the variable microdf to windowSpec, because right now it's a little confusing.
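For reference, a common alternative to the window-function approach is a watermark plus dropDuplicates on the key columns; a minimal sketch with placeholder column names and paths:

```python
# Collapse duplicate events by key. Including the watermark column in
# the subset lets the engine expire old de-duplication state.
deduped = (
    df.withWatermark("event_time", "10 minutes")
    .dropDuplicates(["event_id", "event_time"])
)

(
    deduped.writeStream
    .outputMode("append")
    .option("checkpointLocation", "/tmp/dedup_ckpt")  # placeholder path
    .toTable("silver.events")                         # placeholder table
)
```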
- 3968 Views
- 3 replies
- 0 kudos
Resolved! DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?
Hi there, this is a follow-up from a discussion I started last month: Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661. Based on what was discussed, I understand that it's not possible to use "All Purpose Clust...
@ChristianRRL, regarding why DLT doesn't allow you to use all-purpose clusters: 1. The DLT runtime is derived from the shared-compute DBR; it's not the same runtime and has different features than the common all-purpose runtime. A DLT pipeline is n...