- 10574 Views
- 2 replies
- 2 kudos
Databricks Asset Bundle
Hi, I am new to Databricks. We are trying to use Databricks Asset Bundles for code deployment. I have spent a lot of time on this, but many things are still not clear to me. Can we change the target path of the notebooks deployed from /shared/.bundle/* to so...
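A minimal sketch of how the deployment root can be overridden per target in databricks.yml, assuming a CLI version that supports `workspace.root_path`; the bundle name, target name, and path below are placeholders:
```yaml
bundle:
  name: my_bundle

targets:
  dev:
    workspace:
      # Replaces the default /Users/<user>/.bundle/... (or /Shared/.bundle/...)
      # deployment root for this target. The path here is a placeholder.
      root_path: /Workspace/my_team/${bundle.name}/${bundle.target}
```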
- 2 kudos
Hi @Retired_mod, thank you for your post. I thought it would solve my issues too, but after reading your suggestion there was nothing new in it for me, because I have already done exactly that. Here is what I have done so you or anyone can replicate it: 1. ...
- 2371 Views
- 1 replies
- 0 kudos
Using nested dataframes with databricks-connect>13.x
We needed to move to databricks-connect>13.x. Now I am facing an issue when I work with a nested dataframe of the structure:
```
root
 |-- a: string (nullable = true)
 |-- b: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- c: s...
```
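For anyone trying to reproduce this, here is a minimal sketch that builds a DataFrame with the schema shown above (field names `a`, `b`, and `c` come from the excerpt; anything past the truncation is unknown):
```python
from pyspark.sql.types import StructType, StructField, StringType, ArrayType

# With databricks-connect >= 13, the session typically comes from:
from databricks.connect import DatabricksSession
spark = DatabricksSession.builder.getOrCreate()

# Schema matching the printed tree: a string column plus an array of structs.
schema = StructType([
    StructField("a", StringType(), True),
    StructField("b", ArrayType(StructType([
        StructField("c", StringType(), True),
    ])), True),
])

df = spark.createDataFrame([("x", [("y",)])], schema)
df.printSchema()
```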
- 0 kudos
In addition, here is the full stack trace:
23/12/07 14:51:56 ERROR SerializingExecutor: Exception while executing runnable grpc_shaded.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable@33dfd6ec
grpc_shaded.io.grpc...
- 770 Views
- 0 replies
- 0 kudos
Auto Loader notebook for multiple tables
Hi Team, my requirement is: I have File A from source A, which needs to be written into multiple Delta tables, i.e. DeltaTableA, DeltaTableB, and DeltaTableC. Is it possible to have a single instance of an Auto Loader script (multiple write streams)? Could you p...
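This is commonly done with a single Auto Loader stream plus `foreachBatch`, which lets one micro-batch be written to several sinks. A minimal sketch, assuming JSON input; all paths and table names are placeholders:
```python
def write_to_targets(batch_df, batch_id):
    # Persist so the micro-batch is not recomputed for each sink.
    batch_df.persist()
    for table in ("DeltaTableA", "DeltaTableB", "DeltaTableC"):
        batch_df.write.format("delta").mode("append").saveAsTable(table)
    batch_df.unpersist()

(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")                         # assumed format
    .option("cloudFiles.schemaLocation", "/tmp/schemas/file_a")  # placeholder
    .load("/mnt/source_a")                                       # placeholder
    .writeStream
    .option("checkpointLocation", "/tmp/checkpoints/file_a")     # placeholder
    .foreachBatch(write_to_targets)
    .start())
```
The alternative is three independent streams reading the same source path, each with its own checkpoint and schema location; `foreachBatch` keeps ingestion to a single stream at the cost of writing the sinks sequentially per micro-batch.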
- 3138 Views
- 1 replies
- 0 kudos
How to facilitate incremental updates to an SCD Type 1 table that uses SCD Type 2 source tables
I have an SCD Type 1 Delta table (target) for which I am trying to figure out how to facilitate inserts, updates, and deletes. This table is sourced from multiple Delta tables with an SCD Type 2 structure, which are joined together to create the targe...
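One generic pattern, sketched below with hypothetical table and column names: select only the current rows from each SCD Type 2 source, join them, and MERGE the result into the SCD Type 1 target. The `whenNotMatchedBySourceDelete` clause needs a recent Delta Lake version (2.3+/DBR 12.2+) and assumes the joined source covers the full key set:
```python
from delta.tables import DeltaTable

# Hypothetical sources: take only the current SCD2 rows, then join.
updates = spark.sql("""
    SELECT s1.key, s1.attr1, s2.attr2
    FROM source_one s1
    JOIN source_two s2 ON s1.key = s2.key
    WHERE s1.is_current AND s2.is_current
""")

(DeltaTable.forName(spark, "scd1_target").alias("t")
    .merge(updates.alias("s"), "t.key = s.key")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .whenNotMatchedBySourceDelete()  # handles deletes when a key drops out
    .execute())
```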
- 0 kudos
Correction (I can't seem to edit or remove the original post):
- "... trying to think through an process" --> "... trying to think through *a* process"
- "Thoughts and advice or much appreciated" --> "Thoughts and/or advice are much appreciated."
- 4915 Views
- 4 replies
- 0 kudos
Cluster access mode set to Shared on Databricks results in connection refused on Exasol
I am trying to run a TRUNCATE command on my Exasol DWH from Databricks using pyexasol. This works perfectly fine when the cluster access mode is "No Isolation Shared", which does not have access to our Unity Catalog. When I change the clust...
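For context, the kind of call in question looks roughly like this (DSN, credentials, and table name are placeholders); since the same code works under one access mode and fails under another, the access-mode difference points at the cluster's isolation/networking behavior rather than the code itself:
```python
import pyexasol

# Placeholder DSN and credentials.
conn = pyexasol.connect(dsn="exasol.example.com:8563",
                        user="sys", password="...")
conn.execute("TRUNCATE TABLE my_schema.my_table")
conn.close()
```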
- 0 kudos
Interesting. Did you try with "Single User" mode, which also has UC support?
- 1238 Views
- 0 replies
- 0 kudos
Error from Knime through proxy
I want to connect to Databricks from Knime on a company computer that uses a proxy. The error I'm encountering is as follows: ERROR Create Databricks Environment 3:1 Execute failed: Could not open the client transport with JDBC URI: jdbc:hive2://adb-...
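If the current Databricks (Simba) JDBC driver is in use, it exposes proxy properties that can be appended to the URI; property names vary by driver version, so treat this as a hypothetical sketch and check the driver documentation:
```
jdbc:databricks://adb-<workspace-id>.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<token>;UseProxy=1;ProxyHost=<proxy-host>;ProxyPort=<proxy-port>
```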
- 4223 Views
- 2 replies
- 0 kudos
'NotebookHandler' object has no attribute 'setContext' in pyspark streaming in AWS
I am facing an issue while calling dbutils.notebook.run() inside PySpark streaming with a concurrent executor. At first the error is "pyspark.sql.utils.IllegalArgumentException: Context not valid. If you are calling this outside the main thread, you mus...
- 0 kudos
The error message you're encountering in PySpark when using dbutils.notebook.run() suggests that the context in which you are attempting to call the run() method is not valid. PySpark notebooks in Databricks have certain requirements when it comes to...
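A workaround often shared for this (not an official API, so treat it as an assumption to verify): capture the notebook context on the main thread via the JVM-backed handler and re-attach it inside each worker thread. The Python-level `dbutils.notebook` has no `setContext`, which matches the AttributeError, hence going through `entry_point`:
```python
from concurrent.futures import ThreadPoolExecutor

# On the main thread: grab the notebook context (dbutils is predefined
# in Databricks notebooks).
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

def run_notebook(path):
    # Re-attach the main thread's context before calling run().
    dbutils.notebook.entry_point.getDbutils().notebook().setContext(ctx)
    return dbutils.notebook.run(path, 600)  # 600 s timeout, placeholder

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_notebook, ["/path/nb1", "/path/nb2"]))
```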
- 31999 Views
- 3 replies
- 7 kudos
Introducing the Data Intelligence Platform
Introducing the Data Intelligence Platform, our latest AI-driven data platform constructed on a lakehouse architecture. It’s not just an incremental improvement over current data platforms, but a fundamental shift in product strategy and roadmap. E...
- 7 kudos
Hmm, I preferred the water-related naming: data lake, Delta Lake, and lakehouse.
- 857 Views
- 0 replies
- 0 kudos
Databricks to make a machine learning model
Hey all, I've been using a voice cloning AI and it's working well. I'm thinking of using Databricks to make a machine learning model for speech tech. I want to start with personal content creation. Any tips or advice would be great!
- 641 Views
- 0 replies
- 0 kudos
Thinking of using Databricks to make a machine learning model
Hey community, I've been using this voice cloning AI and it's working well. I'm thinking of using Databricks to make a machine learning model for speech tech. I want to start with personal content creation. Any tips or advice would be great!
- 3068 Views
- 1 replies
- 0 kudos
java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManager.<init>(Lcom/amazonaw
- 2897 Views
- 0 replies
- 0 kudos
MissingCredentialScopeException when writing Hail matrix table to Unity volumes
Hello, I am tinkering with using Unity Catalog Volumes with Hail. I tried the following:
hl_1000g.write(str(VOLUMES_PATH / '1kg.mt'), overwrite=True)
where `VOLUMES_PATH` is `Path("/Volumes") / "hail" / "volumes_testing" / "1kg"`. Unfortunately ...
- 4256 Views
- 1 replies
- 0 kudos
Azure Databricks Notebook Sharing, Notebook Exporting and Notebook Clipboard copy download
Hello, I would like to know in which scenario an Azure Databricks user would be able to download notebook command output if Notebook Result Download is disabled. Do we know if a privileged user would be able to share sensitive information with non-privilege...
- 0 kudos
Thank you, Kaniz. Can we disable the exporting of notebooks in any format except the source file? If yes, how do we achieve this? Also, we do not want to share any notebook that contains notebook results; can we use spark.databricks.query.displayMaxRows and se...
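If the intent is to cap how many rows can appear in (and be copied from) cell results, the config mentioned above can be set in the cluster's Spark config; a sketch, assuming a cluster-wide value is acceptable (the row limit shown is a placeholder):
```
spark.databricks.query.displayMaxRows 10
```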
- 1792 Views
- 0 replies
- 0 kudos
Billing usage per user
Hi Team, Unity Catalog is not enabled in our workspace. We would like to know the billing usage information per user; could you please help us with how to get these details (via a notebook-level script)? Regards, Phanindra
- 9857 Views
- 1 replies
- 0 kudos
Resolved! 'Unity Catalog Volumes is not enabled on this instance' error
Hi all, tl;dr: I ran the following on a Docker-backed personal compute instance (running 13.3-LTS):
```
%sql
USE CATALOG hail;
USE SCHEMA volumes_testing;
CREATE VOLUME 1kg COMMENT 'Testing 1000 Genomes volume';
```
But this gives:
```
ParseException: [UC_VOLU...
```
- 0 kudos
Resolved with the setting "spark.databricks.unityCatalog.volumes.enabled" = "true"
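For anyone hitting the same error on a custom-container cluster, that setting goes in the cluster's Spark config (Advanced options > Spark), one key-value pair per line:
```
spark.databricks.unityCatalog.volumes.enabled true
```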