- 839 Views
- 1 reply
- 0 kudos
Trusted assets vs query examples
Hi community! In recent days I explored trusted assets in my Genie space and it's working very well! But I feel a little confused :s In my Genie space I have many query examples; when I create a new function with the same query example to verify th...
Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...
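For context, trusted assets in a Genie space are typically Unity Catalog SQL functions. A minimal sketch of one such function created from a notebook; the catalog, schema, table, and column names are all hypothetical:

```python
# Minimal sketch of a trusted-asset-style SQL table function. The catalog,
# schema, table, and column names are hypothetical.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.sales.revenue_between(start_date DATE, end_date DATE)
    RETURNS TABLE (total_revenue DOUBLE)
    COMMENT 'Total revenue between two dates; intended as a Genie trusted asset'
    RETURN SELECT SUM(amount) AS total_revenue
           FROM main.sales.orders
           WHERE order_date BETWEEN start_date AND end_date
""")
```

Once a function like this covers a question, a query example with the same SQL mostly duplicates it, which is the redundancy mentioned above.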
- 1273 Views
- 2 replies
- 0 kudos
Resolved! Need help to add personal email to databricks partner account
I have been actively using the Databricks Partner Academy for the past three years through my current organization. As I am planning to transition to a new company, I would like to ensure continued access to my training records and certifications. Cur...
- 1140 Views
- 1 reply
- 0 kudos
Python versions - Notebooks and DBR
Hi, I have a problem with conflicting Python versions in a notebook running with the Databricks 14-day free trial. One example: spark.conf.get("spark.databricks.clusterUsageTags.clusterName") # Returns: "Python versions in the Spark Connect client and...
Hi @Terje, were you able to fix it? From what I know, during the free trial period we’re limited to the default setup, so version mismatches can’t be resolved unless we upgrade to a paid workspace.
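For anyone hitting the same message, a quick probe to see both sides of the mismatch; this only reads versions and assumes nothing beyond a notebook attached to compute:

```python
# Compare the notebook-side (Spark Connect client) Python with what the
# attached compute reports. The mismatch warning in the question comes from
# these two environments disagreeing.
import sys

print(sys.version)    # Python running the notebook / Spark Connect client
print(spark.version)  # Spark version reported by the attached compute
```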
- 628 Views
- 1 reply
- 1 kudos
Completed Machine learning course
I have completed my Machine Learning course as part of the Learning Festival.
- 1734 Views
- 2 replies
- 0 kudos
Python coding in notebook with a (long) token
I have written a Python program (called by a trigger) that uses a token issued by a third-party app (it's circa 400 bytes long, including '.' and '-'). When I copy/paste this token into a Databricks notebook, curious formatting takes place and a coup...
Hey Paul, you can use Databricks secrets to preserve the integrity of the token. Here's the Databricks doc for reference: https://docs.databricks.com/aws/en/security/secrets
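A minimal sketch of reading the token from a secret scope instead of pasting it into a cell; the scope and key names are hypothetical:

```python
# Fetch the third-party token from a Databricks secret scope so the raw string
# never passes through notebook cell formatting. "my-scope" and
# "third-party-token" are hypothetical names.
token = dbutils.secrets.get(scope="my-scope", key="third-party-token")
print(len(token))  # the value itself is redacted if printed in a notebook
```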
- 4550 Views
- 3 replies
- 0 kudos
Save output of show table extended to table?
I want to save the output of SHOW TABLE EXTENDED IN catalogName LIKE 'mysearchtext*'; to a table. How do I do that?
Use DESCRIBE EXTENDED customer AS JSON; this returns the metadata as JSON data, which you can then load. Applicable to Databricks 16.2 and above: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-syntax-aux-describe-table
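Since spark.sql returns command output as a DataFrame, another route is to persist it directly; the schema and target table names below are hypothetical:

```python
# SHOW TABLE EXTENDED output comes back as a regular DataFrame, so it can be
# written straight to a table. "my_schema" and the target name are hypothetical.
df = spark.sql("SHOW TABLE EXTENDED IN my_schema LIKE 'mysearchtext*'")
df.write.mode("overwrite").saveAsTable("my_schema.table_metadata_snapshot")
```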
- 5492 Views
- 2 replies
- 1 kudos
Missing Genie - Upload File Feature in Preview Section
Despite having admin privileges for both the workspace and Genie Workspace, we are unable to see the "Genie - Upload File" feature under the Preview section, even though the documentation indicates it should be available. We also attempted switching r...
For more information about the upload-a-file option, please refer to https://docs.databricks.com/aws/en/genie/file-upload. It supports CSV and Excel datasets as of now, with the condition that files must be smaller than 200 MB and contain fewer than 100 columns du...
- 2169 Views
- 4 replies
- 4 kudos
Resolved! Using Azure Databricks vs using Databricks directly
Hi friends, a quick question regarding how data and workspace controls work while using "Azure Databricks". I am planning to use the Azure Databricks that comes as part of my employer's Azure subscriptions. I work for a public-sector organization, which is ...
- 597 Views
- 1 reply
- 0 kudos
Right course for ML engineer
Hi, I would like to learn Databricks so that I can look for job opportunities as an ML engineer. I have a background in Python programming and computer vision (OpenCV), but not much experience with Azure, AWS, and so on. Which course here is good with ...
Given your background in Python programming and computer vision but limited experience with cloud platforms, the best pathway to enter the job market as an MLE using Databricks is to pursue the Databricks Certified Machine Learning Associate certificati...
- 3519 Views
- 1 reply
- 1 kudos
Resolved! Why does .collect() cause a shuffle while .show() does not?
I’m learning Spark using the book Spark: The Definitive Guide and came across some behavior I’m trying to understand. I am reading a CSV file which has 3 columns: DEST_COUNTRY_NAME, ORIGIN_COUNTRY_NAME, count. The dataset has a total of 256 rows. Here’...
Q1: collect() moves all data to the driver, hence a shuffle. show() just shows x records from the df, from a partition (or more partitions if x > partition size). No shuffling needed. For display purposes the results are of course gathered on the driv...
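A small illustration of the difference, assuming a hypothetical copy of the 256-row file from the question:

```python
# show() is a cheap action: it reads only enough partitions to produce the
# requested rows. collect() materializes every row on the driver.
# The CSV path is hypothetical.
df = spark.read.csv("/tmp/flight_data.csv", header=True, inferSchema=True)

df.show(5)           # typically touches a single partition
rows = df.collect()  # brings all 256 rows back to the driver
print(len(rows))
```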
- 1931 Views
- 2 replies
- 2 kudos
Lazy evaluation in serverless vs all-purpose compute?
As you can see, right now I am connected to serverless compute, and when I give a wrong path, Spark does lazy evaluation and gives the error on display. However, when I switch from serverless to my all-purpose cluster, I get the error when I create the df its...
Based on the scenario, what https://community.databricks.com/t5/user/viewprofilepage/user-id/156441 is saying is correct: though the eager evaluation property is false in both cases, for all-purpose clusters Spark is checking the path immediately whe...
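A sketch of the probe described above; the bad path is deliberate, and the conf key is the standard eager-evaluation flag:

```python
# Check the eager-evaluation flag, then reproduce the behavior: on an
# all-purpose cluster the read may fail immediately, while on serverless the
# bad path typically only surfaces at the display action.
print(spark.conf.get("spark.sql.repl.eagerEval.enabled"))

df = spark.read.parquet("/this/path/does/not/exist")  # deliberately wrong path
display(df)  # serverless: the error usually appears here
```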
- 866 Views
- 1 reply
- 0 kudos
Unable to access external table created by DLT
I originally set the Storage location in my DLT as abfss://{container}@{storageaccount}.dfs.core.windows.net/... But when running the DLT I got the following error (screenshot not shown). So I decided to leave the above Storage location blank and define the path parameter in...
Hi @Tommy, thanks for your question. I would encourage you to verify once using a Pro SQL Warehouse temporarily instead of a Serverless SQL Warehouse, given the compute differences between the two: Pro compute resides in your data plane, Serverless ...
- 2549 Views
- 2 replies
- 2 kudos
OCRmyPDF in Databricks
Hello, do any of you have experience with using OCRmyPDF in Databricks? I have tried to install it in various ways with different versions, but my notebook keeps crashing with the error: The Python process exited with exit code 139 (SIGSEGV: Segmentation...
Refer to this link too https://community.databricks.com/t5/data-engineering/pdf-parsing-in-notebook/td-p/14636
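One thing worth ruling out for the SIGSEGV: OCRmyPDF depends on native binaries (Tesseract, Ghostscript) on top of the pip package, and a missing native layer is a common cause of segfaults. A hedged sketch, assuming a classic cluster where apt-get is available:

```python
# Install the native dependencies OCRmyPDF needs before the pip package; on a
# real deployment these steps usually belong in a cluster init script.
import subprocess

subprocess.run(["apt-get", "update"], check=True)
subprocess.run(["apt-get", "install", "-y", "tesseract-ocr", "ghostscript"], check=True)
subprocess.run(["pip", "install", "ocrmypdf"], check=True)

import ocrmypdf
ocrmypdf.ocr("input.pdf", "output.pdf")  # hypothetical file names
```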
- 3443 Views
- 2 replies
- 0 kudos
Can I automate notebook tagging based on workspace folder structure?
Hi all, I’m currently organizing a growing number of notebooks in our Databricks workspace and trying to keep things manageable with proper tagging and metadata. One idea I had was to automatically apply tags to notebooks based on their folder structu...
Hi @EllaClark, yes, you can automate tagging of Databricks notebooks based on folder structure using the REST API and a script. Use the Workspace API to list notebook paths, extract folder names, and treat them as tags. If the API supports metadata up...
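A minimal sketch of the list-and-derive step with the Python SDK; the root path is hypothetical, and where the derived tags get stored is left open, since a notebook-tagging endpoint isn't confirmed here:

```python
# List notebooks recursively and derive candidate tags from folder names.
# Assumes the databricks-sdk package; "/Workspace/Projects" is hypothetical.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ObjectType

w = WorkspaceClient()
derived_tags = {}
for obj in w.workspace.list("/Workspace/Projects", recursive=True):
    if obj.object_type == ObjectType.NOTEBOOK:
        # e.g. /Workspace/Projects/sales/etl/my_nb -> ["sales", "etl"]
        parts = obj.path.strip("/").split("/")
        derived_tags[obj.path] = parts[2:-1]

print(derived_tags)
```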
- 969 Views
- 1 reply
- 1 kudos
Resolved! Simple notebook sync
Hi, is there a simple way to sync a local notebook with a Databricks notebook? For example, is it possible to just connect to the Databricks kernel or something similar? I know there are IDE extensions for this, but unfortunately, they use the local d...
Hi @Kabi, to my knowledge Databricks doesn’t support directly connecting to the Databricks kernel. However, here are practical ways to sync your local notebook with Databricks: you can use Git to version-control your notebooks. Clone your repo into Dat...
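Beyond Git, a small sketch of a one-shot push with the Python SDK; the local path, workspace path, and user email are all hypothetical:

```python
# Upload a local .py source file into the workspace as a notebook using the
# databricks-sdk. All paths and the user email are hypothetical.
import base64
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat, Language

w = WorkspaceClient()
with open("local_notebook.py", "rb") as f:
    w.workspace.import_(
        path="/Workspace/Users/me@example.com/local_notebook",
        format=ImportFormat.SOURCE,
        language=Language.PYTHON,
        content=base64.b64encode(f.read()).decode("utf-8"),
        overwrite=True,
    )
```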