- 12006 Views
- 2 replies
- 0 kudos
Long running jobs get lost
Hello, I tried to schedule a long-running job and, surprisingly, it seems to neither terminate (and thus does not let the cluster shut down) nor continue running, even though the state is still "Running". But the truth is that the job has miserably ...
- 0 kudos
Have you looked at the SQL plan to see what Spark job 72 was doing?
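For context, one way to see what a particular Spark job was doing is to print the query plan and cross-reference it with the SQL / DataFrame tab in the Spark UI. Below is a minimal sketch; the DataFrame and transformation are placeholders, not the poster's actual job.

```python
# Minimal sketch: inspect a query plan and run it so it shows up in the Spark UI.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Stand-in for whatever the long-running job computes.
df = (
    spark.range(0, 1_000_000)
    .withColumn("bucket", F.col("id") % 100)
    .groupBy("bucket")
    .count()
)

# "formatted" (Spark 3.x) prints the physical plan as a readable operator list.
df.explain("formatted")

# Executing the query makes it appear under the SQL / DataFrame tab of the Spark UI,
# where each job (e.g. "job 72") links back to its plan, stages, and tasks.
df.collect()
```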
- 2415 Views
- 2 replies
- 0 kudos
Reading a CSV file with Spark throws an [insufficient privilege] error
Hello Community, I have some CSV files saved in the Databricks workspace and want to read them with Spark. I use the command df = spark.read.format('csv').load(r'filepath'). However, it throws the error org.apache.spark.SparkSecurityException: [INSU...
- 0 kudos
If this is a UC-enabled workspace, you need to grant the right access.
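For reference, a minimal sketch of the grants involved when the files live in a Unity Catalog volume; the catalog, schema, volume, group, and file names below are hypothetical placeholders, not values from the original post.

```python
# Hypothetical names: my_catalog / my_schema / raw_files / data_readers.
# In a Databricks notebook, `spark` is already provided.
spark.sql("GRANT USE CATALOG ON CATALOG my_catalog TO `data_readers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `data_readers`")
spark.sql("GRANT READ VOLUME ON VOLUME my_catalog.my_schema.raw_files TO `data_readers`")

# With the grants in place, members of `data_readers` can read the CSV from the volume path.
df = (
    spark.read.format("csv")
    .option("header", "true")
    .load("/Volumes/my_catalog/my_schema/raw_files/example.csv")
)
```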
- 4781 Views
- 3 replies
- 2 kudos
Resolved! Update regarding Community Reward Store
Hi Team, is there any update on the Community Reward Store? It has been discontinued from the old portal, and we still can't see the new portal for it. Is there an expected date when this will be available to community members?
- 1296 Views
- 1 reply
- 0 kudos
Using SQL for Structured Streaming
Hi! I'm new to Databricks. I'm trying to create a data pipeline with Structured Streaming. A minimal example pipeline would look like this: read from an upstream Kafka source, do some data transformation, then write to a downstream Kafka sink. I want to do...
- 0 kudos
OK, I figured out why I was getting an error on the usage of `read_kafka`: my default cluster was set up with the wrong Databricks Runtime.
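For anyone hitting the same issue: the `read_kafka` table-valued function is only available on newer Databricks runtimes, so a cluster pinned to an older runtime reports it as an unknown function. Below is a minimal PySpark sketch of the same pipeline shape (Kafka source, transformation, Kafka sink) that does not depend on `read_kafka`; the broker address, topic names, and checkpoint path are hypothetical.

```python
from pyspark.sql import functions as F

# Hypothetical broker/topic/checkpoint values.
source = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "upstream_topic")
    .load()
)

# Example transformation: upper-case the message payload.
transformed = source.select(
    F.col("key"),
    F.upper(F.col("value").cast("string")).cast("binary").alias("value"),
)

# Write back to a downstream topic; the Kafka sink requires a checkpoint location.
query = (
    transformed.writeStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("topic", "downstream_topic")
    .option("checkpointLocation", "/tmp/checkpoints/kafka_pipeline")
    .start()
)
```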
- 2003 Views
- 0 replies
- 3 kudos
Error while encoding: java.lang.RuntimeException: org.apache.spark.sql.catalyst.util.GenericArrayDa
Hello :) We are trying to run, on Databricks, an existing flow that currently works on EMR. We use LTS 10.4, and when loading the data we get the following error: at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:...
- 4427 Views
- 1 reply
- 0 kudos
If a user only has the 'SELECT' permission on a table in Unity Catalog but no permission on the external location
Hi, suppose a user has the 'SELECT' permission on a table but does not have any permission on the table's external location. Will the user be able to read the data from the table? If yes, how will the user be able to read the wh...
- 0 kudos
Hi @Retired_mod, thanks for the response. Why is the hyperlinked command not showing in full?
- 2940 Views
- 5 replies
- 0 kudos
How to switch workspaces via the menu
Hello, In various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...
- 0 kudos
Hi @RobinK, looking at the screenshots provided, I can see you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is a setting for this, but I didn't find one. You can raise a ticket with Databricks and ...
- 3508 Views
- 1 reply
- 1 kudos
Resolved! Issue with creating cluster on Community Edition
I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...
- 1 kudos
Hi @dustint121, it's an internal Databricks issue; wait for some time and it will resolve itself.
- 3793 Views
- 3 replies
- 1 kudos
Resolved! Databricks community edition down?
I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"
- 1 kudos
I still have this issue, and have yet to successfully create a cluster instance. Please advise on how this error was fixed.
- 2456 Views
- 3 replies
- 0 kudos
Autoloader update table when new changes are made
Hello, Every day a new file with the same name gets sent to my storage account, with old and new data appended at the end. Columns may also be added during one of these file updates. This file does a complete overwrite of the previous file. Is it possibl...
- 0 kudos
This may be helpful - the bit on allow overwrite: https://docs.databricks.com/en/ingestion/auto-loader/faq.html
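Building on the FAQ entry linked above, here is a minimal Auto Loader sketch for a file that is overwritten in place and may gain new columns over time; the storage path, schema/checkpoint locations, and target table name are hypothetical placeholders.

```python
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    # Re-process a file when it is overwritten with new content.
    .option("cloudFiles.allowOverwrites", "true")
    # Evolve the inferred schema when new columns appear.
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/daily_extract")
    .option("header", "true")
    .load("abfss://container@account.dfs.core.windows.net/daily_extract/")
)

(
    stream.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/daily_extract")
    .option("mergeSchema", "true")      # let the target Delta table pick up new columns
    .trigger(availableNow=True)         # process whatever is available, then stop
    .toTable("bronze.daily_extract")
)
```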
- 3201 Views
- 1 reply
- 0 kudos
System Tables - Billing schema
Hi Experts! We enabled UC and also the system table (Billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id but I'd...
- 0 kudos
@Retired_mod I'm also not seeing the compute names logged in the system billing tables. Are they located elsewhere?
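As a starting point, here is a sketch of aggregating usage per workspace and cluster from the billing system table; the column names (workspace_id, sku_name, usage_quantity, usage_metadata.cluster_id) follow the documented system.billing.usage schema but should be verified in your workspace, since compute display names are not part of this table.

```python
# Hedged sketch: verify column names against system.billing.usage in your workspace.
usage_by_compute = spark.sql("""
    SELECT
        workspace_id,
        usage_metadata.cluster_id AS cluster_id,
        sku_name,
        SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY workspace_id, usage_metadata.cluster_id, sku_name
""")

usage_by_compute.show(truncate=False)
```

If the compute system schema is enabled in your account, the cluster_id can reportedly be joined to system.compute.clusters to recover a display name; check that table's availability and columns before relying on it.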
- 982 Views
- 0 replies
- 0 kudos
Azure OAuth Passthrough with the Go Driver
Can anyone point me towards some resources for achieving this? I already have the token. Trying with: dbsql.WithAccessToken(settings.Token). But I'm getting the following error: Unable to load OAuth Config: request error after 1 attempt(s): unexpected HT...
- 3791 Views
- 3 replies
- 0 kudos
Resolved! vscode python project for development
Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Also, linting is enabled (the Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...
- 0 kudos
Hi Alexandru, take a look at the VS Code extension for Databricks: https://marketplace.visualstudio.com/items?itemName=databricks.databricks
- 2030 Views
- 1 reply
- 0 kudos
Can browse external storage, but cannot create a table from there - VNET, ADLS Gen2
Hi there! Hope somebody here can help me. We have created a new Databricks account on Azure with the ARM template for VNET injection. We have all the subnets etc., Unity Catalog active, and the connector for Databricks. I now want to create my first tab...
- 0 kudos
Hi, to solve this problem, the following Microsoft documentation can be used to configure the NCC to enable the connection between private Azure storage and the serverless resources: https://learn.microsoft.com/en-us/azure/databricks/security/netwo...
- 4637 Views
- 6 replies
- 1 kudos
DataFrame to CSV write has issues due to multiple commas inside a row value
Hi all, I am working on converting data containing JSON fields with embedded commas into CSV format. I am facing challenges due to the commas within the JSON being misinterpreted as column delimiters during the conversion process. I tried several methods to modify...
- 1 kudos
Hi Sai, I assume that the problem comes not from PySpark, but from Excel. I tried to reproduce the error and couldn't find a way to - that's a good thing, right? Please try the following: df.write.format("csv").save("/Volumes/<my_catalog_name>/<m...
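To make the suggestion above concrete, here is a minimal sketch of CSV writer options that keep embedded commas inside a single column by quoting every field; the volume path is a hypothetical placeholder.

```python
(
    df.write.format("csv")
    .option("header", "true")
    # Quote every field and escape embedded double quotes, so commas inside the
    # JSON strings are not interpreted as column delimiters; spreadsheet tools
    # such as Excel generally parse quoted fields correctly.
    .option("quoteAll", "true")
    .option("quote", '"')
    .option("escape", '"')
    .mode("overwrite")
    .save("/Volumes/my_catalog/my_schema/my_volume/output_csv")
)
```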