- 1152 Views
- 1 replies
- 0 kudos
Infrastructure reverse engineering
How can we do infrastructure reverse engineering in Databricks? Any kind of solution accelerator or tool that can scan through the environment and identify the solutions/components that are used in the Databricks environment and the complexity withi...
Hi Team, could you please help us with this? Regards, Phanindra
- 1622 Views
- 1 replies
- 0 kudos
Got Suspended while taking Databricks Certified Data Analyst Associate Assessment
Hi Team, my experience with the proctor today was very frustrating. I was taking my assessment on 11-26-2023 at 2pm and was somewhere between questions 22 and 25 when my assessmen...
@osalawu Sorry to hear you had an issue with your exam. To protect your Webassessor account information, please file a ticket with our support team. Please include your Webassessor login ID, the exam, and a couple of dates and times that wil...
- 1417 Views
- 0 replies
- 0 kudos
Unable to log metrics to table/driver using spark StreamingQueryListener
Hi Everyone, hope everyone is enjoying their holiday time. I have a Spark streaming job (a copy activity from location A to location B) with a QueryListener attached to it. As shown in the attachments, I can see numInputRows gets log...
- 2412 Views
- 0 replies
- 0 kudos
Arguments parsing in Databricks python jobs
On Databricks, I created a job task with task type "Python script" from S3. However, when arguments are passed via the Parameters option, it runs into an "unrecognized arguments" error. Code in the S3 file: import argparse def parse_arguments(): parser = argpar...
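One common cause of "unrecognized arguments" in a Python-script task is that the platform passes extra positional or flag arguments beyond the ones the script declares. A minimal sketch of a more tolerant parser, using the standard library's `parse_known_args()` (the argument names `--input-path` and `--env` here are hypothetical examples, not from the original post):

```python
import argparse

def parse_arguments(argv=None):
    """Parse job parameters, tolerating any extra arguments the caller injects."""
    parser = argparse.ArgumentParser(description="Example Python script task")
    parser.add_argument("--input-path", required=True, help="Source path to read from")
    parser.add_argument("--env", default="dev", help="Deployment environment")
    # parse_known_args() returns (namespace, leftovers) instead of exiting
    # with an "unrecognized arguments" error when unexpected flags appear.
    args, unknown = parser.parse_known_args(argv)
    return args, unknown

args, unknown = parse_arguments(["--input-path", "s3://bucket/data", "--extra-flag", "x"])
print(args.input_path)   # s3://bucket/data
print(unknown)           # ['--extra-flag', 'x']
```

If the error should instead surface loudly, keep `parse_args()` and make sure the task's Parameters list matches the declared flags exactly.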
- 1463 Views
- 0 replies
- 0 kudos
Unity Catalog - Limited Options for Connection Objects
I’m currently trying to create a Foreign Catalog based on a Connection object of type SQLSERVER. This would allow me to directly access our on-premises MS SQL database from within Azure Databricks using Unity Catalog. As I’m part of a large organizati...
- 1442 Views
- 1 replies
- 0 kudos
Cannot connect to ADLS Gen2 using ABFSS (urgent)
Hi all, I was getting an error when trying to connect to Azure Data Lake: "SSLHandshakeException". Please see the details below. Please help!
The error message you provided, "SSLHandshakeException", indicates that there is a problem with the SSL handshake between your system and Azure Data Lake. This issue can be caused by various factors, such as incorrect settings, network issues, or ...
- 1750 Views
- 0 replies
- 0 kudos
Structured Streaming - Kafka Offset Management
In my team, we decided to move from Spark Streaming to Structured Streaming, mainly because the former is documented as legacy and we want to benefit from the new features of Structured Streaming. However, we have an issue with committing offsets. Previously, on Spark Stre...
- 21443 Views
- 2 replies
- 0 kudos
.overwriteSchema + writeStream
Hello, I have an issue with overwriting the schema while using writeStream. I do not receive any error; however, the schema remains unchanged. Example below: df_abc = spark.readStream .format("cloudFiles") .option("cloudFiles.format", "parquet") .option("cloudF...
- 9997 Views
- 3 replies
- 0 kudos
dataset.cache() not working : NoSuchObjectException(message:There is no database named global_temp)
ERROR RetryingHMSHandler: NoSuchObjectException(message:There is no database named global_temp)at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:508)at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.j...
@Retired_mod Can you please help me resolve this issue?
- 1325 Views
- 0 replies
- 0 kudos
I want to transpose my DataFrame using PySpark
I have 26 columns and 18k rows, which I want to transpose into 18k columns and 26 rows. I don't want any summation or aggregation, just the transpose as it is. Can anyone please suggest an approach? P.S. Not via converting to pandas, because of the large data volume.
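In Spark, a pure transpose is usually expressed as a melt (e.g. `stack()`) into long form followed by a `groupBy`/`pivot` on a row index, with no aggregation beyond `first()`. The core reshaping can be sketched in plain Python (the `transpose` helper and the `row_i` column naming are hypothetical illustrations, not a PySpark API):

```python
def transpose(rows, columns):
    """Transpose a list of row dicts: each original column becomes one output
    row, and the i-th input row becomes output column 'row_i'.
    No aggregation is applied; values are carried over as-is."""
    out = []
    for col in columns:
        new_row = {"column": col}
        for i, row in enumerate(rows):
            new_row[f"row_{i}"] = row[col]
        out.append(new_row)
    return out

rows = [{"a": 1, "b": 2}, {"a": 3, "b": 4}]
print(transpose(rows, ["a", "b"]))
# [{'column': 'a', 'row_0': 1, 'row_1': 3}, {'column': 'b', 'row_0': 2, 'row_1': 4}]
```

One caveat worth noting: a result with 18k columns is wide for Spark's planner, so whichever approach is used, it is worth testing on a sample first.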
- 2190 Views
- 1 replies
- 0 kudos
Delta Live Tables Slowly Changing Dimensions Type 2 with Joins
Hi, I may be missing something really obvious here. The organisation I work for has recently started using Delta Live Tables in Databricks for data modelling. One of the dimensions I am trying to model takes data from 3 existing tables in our data la...
Could it be because the default join is `inner`, which means the row must exist in both tables?
- 1111 Views
- 1 replies
- 0 kudos
Databricks Community Post Editor Issues
Anyone else constantly having errors with this editor when using any of the 'features', like code samples? Can we please have a Markdown editor, or at least the ability to edit the HTML this tool creates, to fix all the bugs it introduces?
Here is a fun one: "The message body contains h d, which is not permitted in this community. Please remove this content before sending your post." I had to add the space between h and d to be able to post it. This means code samples can't contain `ch d...
- 2072 Views
- 0 replies
- 0 kudos
Databricks Advanced Data Engineering Course Factually Incorrect and Misleading
In Video 4 of the Advanced Data Engineering with Databricks course, at 3:08, the presenter says 'No one else can do what we can with a single solution'. This is far from the truth: Palantir Foundry is miles ahead of Databricks in data governance, ease of...
- 6501 Views
- 0 replies
- 0 kudos
Error handling best practices
Hi Team, could you please share the best practices for error handling in Databricks for the following:
1. Notebook level
2. Job level
3. Code level (Python)
4. Streaming
5. DLT & Autoloader
Kindly suggest details around error handling...
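At the code level, one widely used pattern is to wrap each unit of work so failures are logged with full tracebacks and the job either fails fast or degrades gracefully. A minimal sketch of that pattern, assuming a generic Python job (the `run_step` helper and its `fail_fast` flag are illustrative names, not a Databricks API):

```python
import logging

logger = logging.getLogger("job")

def run_step(name, fn, *args, fail_fast=True):
    """Run one unit of work with logging.

    Re-raises the exception so the task fails visibly (fail_fast=True),
    or logs it and returns None so later steps can continue."""
    try:
        return fn(*args)
    except Exception:
        logger.exception("step %s failed", name)
        if fail_fast:
            raise
        return None

result = run_step("load", lambda: 40 + 2)
print(result)  # 42
```

Notebook- and job-level handling build on the same idea: let the exception propagate so the task is marked failed, and configure retries and failure notifications on the job itself rather than swallowing errors in code.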
- 4372 Views
- 2 replies
- 1 kudos
Resolved! Rendering markdown images hard coded as data image png base64 in notebook
Hi all, for training purposes, I have cloned a repo from John Snow Labs into my Databricks account and am working in the notebook that you can review at https://github.com/JohnSnowLabs/spark-nlp-workshop/blob/master/open-source-nlp/03.0.SparkNLP_Pretr...
Try changing the magic command for that cell from %md to %md-sandbox to see if that helps the image to render appropriately.