- 1643 Views
- 2 replies
- 0 kudos
Python coding in notebook with a (long) token
I have written a Python program (called by a trigger) that uses a token issued by a third-party app (it's circa 400 bytes long, including '.' and '-'). When I copy/paste this token into a Databricks notebook, curious formatting takes place and a coup...
Hey Paul, you can use Databricks secrets to preserve the integrity of the token. Here's the Databricks doc for reference: https://docs.databricks.com/aws/en/security/secrets
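For example, after storing the token in a secret scope, the notebook can read it without any copy/paste mangling. A minimal sketch, assuming a hypothetical scope "third-party-app" and key "api-token" created beforehand:

```python
# Hypothetical scope/key; create them first, e.g. with the Databricks CLI:
#   databricks secrets create-scope third-party-app
#   databricks secrets put-secret third-party-app api-token
token = dbutils.secrets.get(scope="third-party-app", key="api-token")

# The value is redacted in notebook output but intact in code, so the
# ~400-byte token (dots, dashes and all) survives unmangled.
import requests
resp = requests.get(
    "https://api.example.com/v1/data",  # hypothetical third-party endpoint
    headers={"Authorization": f"Bearer {token}"},
)
```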
- 4454 Views
- 3 replies
- 0 kudos
Save output of show table extended to table?
I want to save the output of SHOW TABLE EXTENDED IN catalogName LIKE 'mysearchtext*'; to a table. How do I do that?
Use DESCRIBE EXTENDED customer AS JSON; this returns the result as JSON data, which you can then load. Applicable to Databricks 16.2 and above: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-syntax-aux-describe-table
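In a notebook, the output of either statement is a DataFrame, so it can be written straight to a table. A minimal sketch (the schema and target table names are hypothetical):

```python
# SHOW TABLE EXTENDED returns a DataFrame like any other spark.sql() call.
df = spark.sql("SHOW TABLE EXTENDED IN my_schema LIKE 'mysearchtext*'")

# Persist the result as a table for later querying.
df.write.mode("overwrite").saveAsTable("my_schema.table_inventory")
```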
- 5335 Views
- 2 replies
- 1 kudos
Missing Genie - Upload File Feature in Preview Section
Despite having admin privileges for both the workspace and Genie Workspace, we are unable to see the "Genie - Upload File" feature under the Preview section, even though the documentation indicates it should be available. We also attempted switching r...
For more information about the upload-a-file option, please refer to https://docs.databricks.com/aws/en/genie/file-upload. It supports CSV and Excel datasets as of now, with the condition that files must be smaller than 200 MB and contain fewer than 100 columns du...
- 1976 Views
- 4 replies
- 4 kudos
Resolved! using Azure Databricks vs using Databricks directly
Hi friends, a quick question regarding how data and workspace controls work while using "Azure Databricks". I am planning to use the Azure Databricks that comes as part of my employer's Azure subscriptions. I work for a public sector organization, which is ...
- 4769 Views
- 0 replies
- 0 kudos
Support for managed identity based authentication in python kafka client
We followed this document https://docs.databricks.com/aws/en/connect/streaming/kafka?language=Python#msk-aad to use the Kafka client to read events from our Event Hub for a feature. As part of the SFI, the guidance is to move away from client secrets and u...
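This thread has no replies, but one common pattern for token-based OAUTHBEARER auth against an Event Hubs Kafka endpoint is a token callback backed by azure-identity, which picks up a managed identity when one is available. A minimal sketch, assuming the confluent-kafka and azure-identity packages; the namespace, group, and topic names are hypothetical:

```python
from azure.identity import DefaultAzureCredential  # uses managed identity when available
from confluent_kafka import Consumer

NAMESPACE = "mynamespace"  # hypothetical Event Hubs namespace
credential = DefaultAzureCredential()

def oauth_cb(_cfg):
    # confluent-kafka expects (token, expiry as epoch seconds) back.
    tok = credential.get_token(f"https://{NAMESPACE}.servicebus.windows.net/.default")
    return tok.token, tok.expires_on

consumer = Consumer({
    "bootstrap.servers": f"{NAMESPACE}.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "OAUTHBEARER",
    "oauth_cb": oauth_cb,
    "group.id": "my-consumer-group",   # hypothetical
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-event-hub"])   # hypothetical event hub (topic) name
```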
- 563 Views
- 1 reply
- 0 kudos
Right course for ML engineer
Hi, I would like to learn Databricks so that I can look for job opportunities as an ML engineer. I have a background in Python programming and computer vision (OpenCV), but not much experience with Azure, AWS, and so on. Which course here is good with ...
Given your background in Python programming and computer vision but limited experience with cloud platforms, the best pathway to enter the job market as an MLE using Databricks is to pursue the Databricks Certified Machine Learning Associate certificati...
- 3363 Views
- 1 reply
- 1 kudos
Resolved! Why does .collect() cause a shuffle while .show() does not?
I’m learning Spark using the book Spark: The Definitive Guide and came across some behavior I’m trying to understand. I am reading a CSV file which has 3 columns: DEST_COUNTRY_NAME, ORIGIN_COUNTRY_NAME, count. The dataset has a total of 256 rows. Here’...
Q1: collect() moves all data to the driver, hence a shuffle. show() just shows x records from the df, from one partition (or more partitions if x > partition size), so no shuffling is needed. For display purposes the results are of course gathered on the driv...
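A quick way to see the difference is to run both actions on the same DataFrame and compare the jobs in the Spark UI. A minimal sketch (the CSV path is hypothetical; the columns follow the question):

```python
# Hypothetical path to the 256-row flight-counts CSV from the question.
df = spark.read.csv("/data/flight_counts.csv", header=True, inferSchema=True)

# show() only fetches the first N rows, typically from a single partition.
df.show(5)

# collect() materializes every partition and ships all 256 rows to the driver.
rows = df.collect()
print(len(rows))
```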
- 1827 Views
- 2 replies
- 2 kudos
Lazy evaluation in serverless vs all purpose compute ?
As you can see, right now I am connected to serverless compute, and when I give a wrong path, Spark does lazy evaluation and gives the error on display. However, when I switch from serverless to my all-purpose cluster, I get the error when I create the df its...
Based on the scenario, what https://community.databricks.com/t5/user/viewprofilepage/user-id/156441 says is correct: although the eager evaluation property is false in both cases, for all-purpose clusters Spark is checking the path immediately whe...
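To see the difference yourself, point a read at a path that doesn't exist and note when the failure surfaces. A minimal sketch (the path is deliberately bogus):

```python
# On compute that defers path validation, this line succeeds...
df = spark.read.format("csv").option("header", True).load("/mnt/no-such-dir/data.csv")

# ...and the error only appears once an action forces evaluation.
display(df)  # Databricks notebook display; df.show() behaves the same way
```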
- 756 Views
- 1 reply
- 0 kudos
Unable to access external table created by DLT
I originally set the Storage location in my DLT as abfss://{container}@{storageaccount}.dfs.core.windows.net/... But when running the DLT I got an error, so I decided to leave the above Storage location blank and define the path parameter in...
Hi @Tommy, thanks for your question. I would encourage you to verify once using a Pro SQL Warehouse temporarily instead of a Serverless SQL Warehouse, given the compute differences between the two: Pro compute resides in your data plane, Serverless ...
- 2202 Views
- 2 replies
- 2 kudos
OCRmyPDF in Databricks
Hello, do any of you have experience with using OCRmyPDF in Databricks? I have tried to install it in various ways with different versions, but my notebook keeps crashing with the error: The Python process exited with exit code 139 (SIGSEGV: Segmentation...
Refer to this related thread too: https://community.databricks.com/t5/data-engineering/pdf-parsing-in-notebook/td-p/14636
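One way to keep a native-code crash like SIGSEGV from killing the notebook kernel is to run OCRmyPDF in a child process and inspect its exit code. A minimal sketch, assuming the ocrmypdf CLI and its system dependencies (tesseract-ocr, ghostscript) are installed on the cluster; paths are hypothetical:

```python
import subprocess

src = "/Volumes/main/default/docs/scan.pdf"  # hypothetical input
dst = "/tmp/scan_ocr.pdf"                    # hypothetical output

# A segfault now kills only the child process, not the notebook's Python kernel.
result = subprocess.run(
    ["ocrmypdf", "--skip-text", src, dst],
    capture_output=True, text=True,
)
if result.returncode != 0:  # -11 (SIGSEGV) would surface here, not as a kernel crash
    print(result.returncode, result.stderr)
```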
- 3279 Views
- 2 replies
- 0 kudos
Can I automate notebook tagging based on workspace folder structure?
Hi all, I’m currently organizing a growing number of notebooks in our Databricks workspace and trying to keep things manageable with proper tagging and metadata. One idea I had was to automatically apply tags to notebooks based on their folder structu...
Hi @EllaClark, yes, you can automate tagging of Databricks notebooks based on folder structure using the REST API and a script. Use the Workspace API to list notebook paths, extract folder names, and treat them as tags. If the API supports metadata up...
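A minimal sketch of the listing half, using the GET /api/2.0/workspace/list endpoint (the workspace URL, token source, and root folder are hypothetical; persisting the derived tags is left open):

```python
import requests

HOST = "https://<workspace>.cloud.databricks.com"  # hypothetical workspace URL
TOKEN = dbutils.secrets.get("admin-scope", "pat")  # hypothetical secret scope/key
ROOT = "/Workspace/Projects"                       # hypothetical root folder

def list_notebooks(path=ROOT):
    """Recursively yield (notebook_path, folder_tags) pairs under ROOT."""
    resp = requests.get(
        f"{HOST}/api/2.0/workspace/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"path": path},
    )
    resp.raise_for_status()
    for obj in resp.json().get("objects", []):
        if obj["object_type"] == "DIRECTORY":
            yield from list_notebooks(obj["path"])
        elif obj["object_type"] == "NOTEBOOK":
            # Every folder segment between ROOT and the notebook becomes a tag.
            segments = obj["path"][len(ROOT):].strip("/").split("/")
            yield obj["path"], segments[:-1]

for nb_path, tags in list_notebooks():
    print(nb_path, tags)
```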
- 870 Views
- 1 reply
- 1 kudos
Resolved! Simple notebook sync
Hi, is there a simple way to sync a local notebook with a Databricks notebook? For example, is it possible to just connect to the Databricks kernel or something similar? I know there are IDE extensions for this, but unfortunately, they use the local d...
Hi @Kabi, to my knowledge Databricks doesn’t support directly connecting to the Databricks kernel. However, here are practical ways to sync your local notebook with Databricks: you can use Git to version-control your notebooks. Clone your repo into Dat...
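Besides Git, a local file can also be pushed with the POST /api/2.0/workspace/import endpoint. A minimal sketch (the host, token, and paths are hypothetical):

```python
import base64
import requests

HOST = "https://<workspace>.cloud.databricks.com"  # hypothetical workspace URL
TOKEN = "dapi..."                                  # hypothetical personal access token

# Read the local notebook source and base64-encode it, as the API requires.
with open("analysis.py", "rb") as f:
    content = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Workspace/Users/me@example.com/analysis",  # hypothetical target
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```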
- 537 Views
- 1 reply
- 1 kudos
Databricks Dashboard ,passing Prompt Values from one page to another
Hi guys, I have a dashboard with a main page where I have a base query; I added a date-time range widget and linked it to filter the base query. Now I have a Page 2 where I use a different summarized query as a source, base query 2. I need this qu...
Hi @Mani2105, as far as I know, Databricks dashboards currently don’t support sharing widget parameters like date range filters across pages. Each page is isolated, so filters must be recreated manually per page. Manual configuration remains the only way to m...
- 1124 Views
- 2 replies
- 1 kudos
Asinine bad word detection
Are you kidding me here--I couldn't post this reply because of (see arrows, because I can't say the words)? I've run afoul of this several times before; bad word detection was a solved problem in the 1990s, and there is even a term for errors like this--...
Hello @Rjdudley! Thank you for bringing this to our attention. We understand how frustrating it can be to have your message incorrectly flagged, especially when you're contributing meaningfully. While our filters are in place to maintain a safe space...
- 1020 Views
- 5 replies
- 1 kudos
AI/BI Dashboard - Hide Column in Table Visualization, but not in exported data
How can I hide specific columns from a table visualization, but not in the exported data? I have over 200 columns in my query result and the UI freezes when I want to show it in a table visualization. So I want to hide specific columns, but if I export ...