- 1386 Views
- 1 replies
- 0 kudos
Using SQL for Structured Streaming
Hi! I'm new to Databricks. I'm trying to create a data pipeline with Structured Streaming. A minimal example pipeline would look like: read from an upstream Kafka source, do some data transformation, then write to a downstream Kafka sink. I want to do...
- 0 kudos
OK, I figured out why I was getting an error on the usage of `read_kafka`: my default cluster was set up with the wrong Databricks Runtime.
- 2200 Views
- 0 replies
- 3 kudos
Error while encoding: java.lang.RuntimeException: org.apache.spark.sql.catalyst.util.GenericArrayDa
Hello :) We are trying to run an existing flow, which currently works on EMR, on Databricks. We use LTS 10.4, and when loading the data we get the following error: at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:...
- 4569 Views
- 1 replies
- 0 kudos
If a user has only 'SELECT' permission on a table in Unity Catalog but no permission on the external location
Hi, suppose a user has 'SELECT' permission on a table but no permission on the table's external location. Will the user be able to read the data from the table? If yes, how will the user be able to read the wh...
- 0 kudos
Hi @Retired_mod, thanks for the response. Why is the hyperlinked command not showing in full?
- 3234 Views
- 5 replies
- 0 kudos
How to switch Workspaces via menu
Hello, in various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...
- 0 kudos
Hi @RobinK, looking at the screenshots provided I can see you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is a setting for this, but I didn't find one. You can raise a ticket with Databricks and ...
- 3692 Views
- 1 replies
- 1 kudos
Resolved! Issue with creating cluster on Community Edition
I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...
- 1 kudos
Hi @dustint121, it's a Databricks internal issue; wait for some time and it will resolve.
- 3963 Views
- 3 replies
- 1 kudos
Resolved! Databricks community edition down?
I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"
- 1 kudos
I still have this issue and have yet to successfully create a cluster instance. Please advise on how this error was fixed.
- 2835 Views
- 3 replies
- 0 kudos
Autoloader update table when new changes are made
Hello, every day a new file of the same name gets sent to my storage account, with old and new data appended at the end. Columns may also be added during one of these file updates. This file does a complete overwrite of the previous file. Is it possibl...
- 0 kudos
This may be helpful, in particular the section on allowing overwrites: https://docs.databricks.com/en/ingestion/auto-loader/faq.html
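The FAQ the reply links to covers this case; the key setting is `cloudFiles.allowOverwrites`, which makes Auto Loader reprocess a file whose contents changed in place. A minimal sketch of the reader options, shown as a plain Python dict because a live SparkSession isn't available here; the paths are placeholders:

```python
# Auto Loader (cloudFiles) options for a file that is overwritten in place.
# The paths below are placeholders; cloudFiles.allowOverwrites is the
# documented option that lets a modified file be picked up again, and
# addNewColumns handles the case where an update adds columns.
autoloader_options = {
    "cloudFiles.format": "csv",
    "cloudFiles.schemaLocation": "/tmp/_schema",        # placeholder path
    "cloudFiles.allowOverwrites": "true",
    "cloudFiles.schemaEvolutionMode": "addNewColumns",
}

# On a cluster this would be wired up roughly as:
# (spark.readStream.format("cloudFiles")
#       .options(**autoloader_options)
#       .load("abfss://container@account.dfs.core.windows.net/path"))
```

Because the whole file is reprocessed on each overwrite, downstream writes usually need a merge or deduplication step.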
- 3440 Views
- 1 replies
- 0 kudos
System Tables - Billing schema
Hi Experts! We enabled UC and also the system tables (billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id, but I'd...
- 0 kudos
@Retired_mod I'm also not seeing the compute names logged in the system billing tables. Is this located elsewhere?
- 1055 Views
- 0 replies
- 0 kudos
Azure Oauth Passthrough with the Go Driver
Can anyone point me towards some resources for achieving this? I already have the token. Trying with: dbsql.WithAccessToken(settings.Token) But I'm getting the following error: Unable to load OAuth Config: request error after 1 attempt(s): unexpected HT...
- 2389 Views
- 1 replies
- 0 kudos
Can browse external storage, but cannot create a table from there - VNet, ADLS Gen2
Hi there! Hope somebody here can help me. We have created a new Databricks account on Azure with the ARM template for VNet injection. We have all the subnets etc., Unity Catalog active, and the connector for Databricks. I now want to create my first tab...
- 0 kudos
Hi, to solve this problem, the following Microsoft documentation can be used to configure the NCC to enable the connection between the private Azure storage and the serverless resources: https://learn.microsoft.com/en-us/azure/databricks/security/netwo...
- 5134 Views
- 6 replies
- 1 kudos
DataFrame to CSV write has issues due to multiple commas inside a row value
Hi all, I am working on converting data containing JSON fields with embedded commas into CSV format. I am facing challenges due to the commas within the JSON being misinterpreted as column delimiters during the conversion process. I tried several methods to modify...
- 1 kudos
Hi Sai, I assume that the problem comes not from PySpark, but from Excel. I tried to reproduce the error and didn't find a way; that's a good thing, right? Please try the following: df.write.format("csv").save("/Volumes/<my_catalog_name>/<m...
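The behaviour the reply relies on can be checked without Spark: CSV quoting keeps embedded commas inside a single column, which is also what Spark's CSV writer does by default. A minimal sketch with Python's standard csv module; the row values are made up for illustration:

```python
import csv
import io

# A row whose last field is a JSON string with embedded commas.
row = ["1", "sai", '{"city": "Hyderabad", "pin": "500001"}']

# Write with quoting: fields containing the delimiter are wrapped in quotes.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerow(row)
encoded = buf.getvalue()

# Read it back: the quoted JSON field is recovered intact.
decoded = next(csv.reader(io.StringIO(encoded)))
print(decoded[2])  # the JSON survives as a single column
```

If such a file looks wrong in Excel, the file itself is often fine; using Excel's explicit CSV import (or reading the file back as above) confirms the columns.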
- 3456 Views
- 1 replies
- 0 kudos
Access Delta sharing from Azure Data Factory
I recently got access to Delta Sharing and I am looking to access the data from the tables in the share through ADF. I used linked services such as REST API and HTTP and successfully established a connection using the credential file token and HTTP path, h...
- 0 kudos
Hey, I think you'll need to use a Databricks activity instead of Copy. See: https://learn.microsoft.com/en-us/azure/data-factory/connector-overview#integrate-with-more-data-stores and https://learn.microsoft.com/en-us/azure/data-factory/transform-data-dat...
- 2996 Views
- 4 replies
- 1 kudos
Redefine ETL strategy with PySpark approach
Hey everyone! I have some previous experience with data engineering, but I'm totally new to Databricks and Delta tables. Starting this thread hoping to ask some questions and to ask for help on how to design a process. So I essentially have 2 Delta tables (sa...
- 1 kudos
Hi @databird, you can review the code of each demo by opening the content via "View the Notebooks" or by exploring the following repo: https://github.com/databricks-demos (you can try searching for "merge" to see all the occurrences, for example). T...
- 2084 Views
- 2 replies
- 0 kudos
There is no certification number in my Databricks certificate that I received after passing the
I enrolled for the Databricks Data Engineer certification recently, gave the exam a shot, and cleared it successfully. I received the certificate as a PDF file, along with a URL at which I can see my certificate and ba...
- 0 kudos
Hi @vinay076 Thanks for asking! Our support team can provide you with a credential ID. Please file a ticket with our support team, give them your email associated with your certification, and they can get you the credential ID.
- 8354 Views
- 5 replies
- 4 kudos
Resolved! How to obtain a list of workflows in Databricks?
I need to obtain a list of my Databricks workflows with their job IDs in a Databricks notebook.
- 4 kudos
Hi @VabethRamirez, also, instead of using the API directly, you can use the Databricks Python SDK:
%pip install databricks-sdk --upgrade
dbutils.library.restartPython()
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
job_list = w.jobs...
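For completeness, the same listing can be done without the SDK by calling the Jobs API 2.1 list endpoint. A hedged sketch with Python's standard library; the host and token values are placeholders, and only the request construction is executed here (the network call itself is left commented out):

```python
import urllib.request

def build_jobs_list_request(host: str, token: str, limit: int = 25) -> urllib.request.Request:
    """Build a GET request for the Jobs API 2.1 list endpoint."""
    url = f"{host}/api/2.1/jobs/list?limit={limit}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

# Placeholder workspace URL and token, for illustration only.
req = build_jobs_list_request("https://example.cloud.databricks.com", "dapi-XXXX")
print(req.full_url)

# Sending the request needs a real workspace and token:
# import json
# with urllib.request.urlopen(req) as resp:
#     for job in json.load(resp).get("jobs", []):
#         print(job["job_id"], job["settings"]["name"])
```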