- 5194 Views
- 2 replies
- 0 kudos
Resolved! Using Python RPA Library on Databricks
Hi, I didn't see any conversations about using the Python RPA package on Databricks clusters. Is anyone doing this, or has anyone gotten it to work successfully on the clusters? I ran into the following errors: 1) Initially I was getting the error below rega...
- 0 kudos
If you want to capture a browser screenshot, you can use Playwright:

```
%sh
pip install playwright
playwright install
sudo apt-get update
playwright install-deps
```

```python
from playwright.async_api import async_playwright

async with async_playwright() as p:
    ...
```
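To make that snippet end-to-end, here is a minimal sketch of the same approach; the URL and output path are placeholders, and it assumes the install steps above have already run on the cluster:

```python
import asyncio
from playwright.async_api import async_playwright

async def capture(url: str, out_path: str) -> None:
    async with async_playwright() as p:
        # Headless Chromium; requires `playwright install` and
        # `playwright install-deps` to have succeeded (see above).
        browser = await p.chromium.launch(headless=True)
        page = await browser.new_page()
        await page.goto(url)
        await page.screenshot(path=out_path, full_page=True)
        await browser.close()

# In a notebook with a running event loop, use `await capture(...)` instead.
asyncio.run(capture("https://example.com", "/tmp/screenshot.png"))
```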
- 2430 Views
- 0 replies
- 0 kudos
Unity Catalog: Databricks *Specific* Features
Good day. Deceptively simple question: are there any "Databricks-only" features that Unity Catalog offers? I understand that, generally speaking, enabling UC offers some of the following: Data Discovery and Lineage, Auditing and Monitoring, Access C...
- 3941 Views
- 2 replies
- 0 kudos
Problem login in
Hello all, I'm new to this platform. I signed up, validated my email, and created my password; everything was fine, but when I try to log in, a message comes up. I created a new password but the same thing happened again! It did work a few times, I think like 3 times....
- 0 kudos
Can you confirm the username was created in all lower case? Login is case sensitive, so you need to make sure the username you enter exactly matches how it was added in the account console or workspace.
- 2861 Views
- 1 replies
- 0 kudos
how to create volume using databricks cli commands
I am new to using volumes on Databricks. Is there a way to create a volume using CLI commands? On a similar note, is there a way to create DBFS directories and subdirectories using a single command? For example, I want to copy a file here: dbfs:/FileStore/T...
- 0 kudos
Creates a new volume. The user can create either an external volume or a managed volume. An external volume will be created in the specified external location, while a managed volume will be located in the default location, which is specified by the...
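A hedged sketch of what this looks like with the newer unified Databricks CLI; the catalog, schema, volume, and path names are placeholders, and the exact argument order is worth confirming with `databricks volumes create --help`:

```
# Managed volume (stored in the schema's default location); names are placeholders
databricks volumes create main my_schema my_volume MANAGED

# External volume in a pre-configured external location (assumption: the
# location already exists and you have the required privileges on it)
databricks volumes create main my_schema my_ext_volume EXTERNAL \
  --storage-location "abfss://container@account.dfs.core.windows.net/path"

# For the DBFS part of the question: mkdirs creates intermediate
# directories in one command, and cp can then place the file
databricks fs mkdirs dbfs:/FileStore/Tmp/nested/dir
databricks fs cp ./localfile.csv dbfs:/FileStore/Tmp/nested/dir/
```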
- 9160 Views
- 3 replies
- 3 kudos
Resolved! DLT Job Clusters: Continuous vs Triggered Cluster Start Times
Hi there, I'm curious if anyone can definitively help me answer how DLT job clusters operate/run. For example, the following is my baseline understanding of DLT job clusters: if I run a triggered DLT pipeline (e.g. daily), the job cluster takes m...
- 3 kudos
Ideally, one would expect clusters used for a DLT pipeline to terminate after the pipeline execution has finished. However, while running in the `development` environment you'll notice it doesn't terminate on its own, whereas in `production` it terminates ...
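A sketch of the relevant pipeline settings: in development mode the cluster is kept warm for reuse, and the idle shutdown window can be tuned with the `pipelines.clusterShutdown.delay` configuration. The values below are illustrative, not defaults:

```json
{
  "development": true,
  "configuration": {
    "pipelines.clusterShutdown.delay": "15m"
  }
}
```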
- 2536 Views
- 1 replies
- 0 kudos
Can I update a table comment using REST API?
https://docs.databricks.com/api/workspace/tables It seems I can only list/delete tables; is there a way to update a table's metadata, like the comment or detail fields, via REST API?
- 0 kudos
Hi @al2co33, we don't currently provide any APIs for updating table comments; however, you can use the SQL Statement Execution API to do it. You can use the following tutorial to ALTER TABLE/COLUMN COMMENT: https://learn.microsoft.com/en-us/azure...
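A minimal sketch of that suggestion from Python via the SQL Statement Execution API; the workspace URL, token, warehouse ID, and table name are placeholders:

```python
import requests

HOST = "https://<workspace-url>"      # placeholder
TOKEN = "<personal-access-token>"     # placeholder
WAREHOUSE_ID = "<sql-warehouse-id>"   # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": WAREHOUSE_ID,
        # COMMENT ON covers table comments; ALTER TABLE ... ALTER COLUMN
        # ... COMMENT covers individual columns.
        "statement": "COMMENT ON TABLE main.my_schema.my_table IS 'Updated via REST'",
    },
)
resp.raise_for_status()
print(resp.json()["status"])
```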
- 3791 Views
- 1 replies
- 0 kudos
Databricks Java - Create Jar in Java 11
I am trying to run a simple print Java program, which is not working; I'm getting compilation version issues even though I changed the environment variable to point to Java 11. Can you please help me? Can we create Java code with a Spark session and execute it as a ja...
- 0 kudos
@Databricks_Java You can run a command like this: `spark-submit --class com.test.Main example.jar` and make sure to check the Java version and match it with the DBR's compatibility.
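For the compatibility check, one quick way to see which Java the cluster is actually using (from a notebook cell; `%sh` runs on the driver):

```
%sh
java -version
echo "JAVA_HOME=$JAVA_HOME"
```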
- 2197 Views
- 1 replies
- 0 kudos
Disable personal compute with the Databricks API or UI
For a production environment, I want to disable the personal compute policy, because I do not want all users to be able to create personal compute clusters in production. Unfortunately, I am not able to access the account console, so I want to revoke perm...
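For the workspace-level workaround the question describes (revoking grants on the policy rather than disabling it account-wide), a hedged sketch using the cluster-policy Permissions API. All identifiers are placeholders, this is an assumption about the approach rather than a confirmed answer from the thread, and note that PUT replaces the full access-control list, so verify whether an empty list is accepted before using it in production:

```python
import requests

HOST = "https://<workspace-url>"             # placeholder
TOKEN = "<admin-pat>"                        # placeholder
POLICY_ID = "<personal-compute-policy-id>"   # placeholder; list policies via
                                             # GET /api/2.0/policies/clusters/list

# PUT sets the policy's entire access control list; an empty list is
# intended here to remove everyone's CAN_USE grant on the policy.
resp = requests.put(
    f"{HOST}/api/2.0/permissions/cluster-policies/{POLICY_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"access_control_list": []},
)
resp.raise_for_status()
```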
- 4630 Views
- 1 replies
- 0 kudos
No space left on device and IllegalStateException: Have already allocated a maximum of 8192 pages
Hello, I'm writing to bring to your attention an issue that we have encountered while working with Databricks, and to seek your assistance in resolving it. Context of the error: when a SQL query (1,700 lines) is run, the corresponding Databricks job is faili...
- 0 kudos
Are you processing Parquet files, or what is the format of your tables? Can you split your SQL query instead of having one huge query with 1,700 lines?
- 2382 Views
- 3 replies
- 0 kudos
Autoloader file latency
Hi team, I would like to understand whether there is a metadata table for Auto Loader in Databricks that captures information about file arrival and processing. The reason we are experiencing data issues is that our table A receives hundreds of files ...
- 0 kudos
Check with the cloud_files_state() API. You can find examples here: https://docs.databricks.com/en/ingestion/auto-loader/production.html#querying-files-discovered-by-auto-loader
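A short sketch of that query; the checkpoint path is a placeholder and should point at the Auto Loader stream's checkpoint location:

```python
# Per-file ingestion state recorded by Auto Loader, including discovery
# and processing timestamps. The checkpoint path is a placeholder.
files = spark.sql(
    "SELECT * FROM cloud_files_state('/path/to/stream/checkpoint')"
)
files.select("path", "discovery_time", "commit_time").show(truncate=False)
```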
- 2616 Views
- 2 replies
- 2 kudos
Resolved! Regarding cloning my git repo under workspace/Users/user_name
Hi all, I recently started using Databricks. I want to clone my git repo under the workspace/Users/user_name path, which I haven't been able to do; by default I can only clone under the Repos directory. Can anyone please advise me on this? Thank you
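One possible route for this via the CLI, offered as a sketch rather than a confirmed answer from the thread: the repo URL and target path are placeholders, whether a path under /Workspace/Users is accepted depends on the workspace's Git folder support, and the exact arguments are worth confirming with `databricks repos create --help`:

```
databricks repos create https://github.com/me/my-repo.git gitHub \
  --path /Workspace/Users/me@example.com/my-repo
```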
- 4511 Views
- 4 replies
- 0 kudos
Connect my spark code running in AWS ECS to databricks cluster
Hi team, I wanted to know if there is a way to connect a piece of my PySpark code running in ECS to a Databricks cluster and leverage the Databricks compute using Databricks Connect. I see Databricks Connect is for connecting local IDE code to Databrick...
- 0 kudos
Noted @Retired_mod @RonDeFreitas. I am currently using Databricks Runtime v12.2 (which is < v13.0). I followed this doc (Databricks Connect for Databricks Runtime 12.2 LTS and below), connected my local terminal to the Databricks cluster, and was able ...
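For anyone who later moves to DBR 13+, the newer Databricks Connect session looks roughly like this; the host, token, and cluster ID are placeholders:

```python
# Requires the databricks-connect>=13 package; placeholders throughout.
from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder.remote(
        host="https://<workspace-url>",
        token="<personal-access-token>",
        cluster_id="<cluster-id>",
    ).getOrCreate()
)

# The returned session proxies work to the remote cluster, so this
# count() executes on Databricks compute, not in the ECS container.
print(spark.range(10).count())
```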
- 5543 Views
- 2 replies
- 0 kudos
Resolved! spark context in databricks
Hi all, in Azure Databricks I am using Structured Streaming's foreachBatch functionality. In one of the functions I am creating a temp view from a PySpark dataframe (not a GlobalTempView) and trying to access the same temp view using the spark.sql functiona...
- 0 kudos
Do you face this issue without Spark streaming as well? Also, could you share minimal repro code, preferably without streaming?
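A likely cause, offered as an assumption rather than the thread's confirmed resolution: inside foreachBatch, the notebook-global `spark` can be a different session from the one the micro-batch DataFrame belongs to, so a temp view registered on the batch DataFrame isn't visible to `spark.sql`. A sketch of the usual fix, querying through the batch DataFrame's own session (the `rate` source is used only so the example runs as-is):

```python
# Built-in `rate` source stands in for the real input stream.
stream_df = spark.readStream.format("rate").load()

def process_batch(batch_df, batch_id):
    batch_df.createOrReplaceTempView("batch_view")
    # Query through the session that owns the micro-batch DataFrame,
    # not the notebook-global `spark` -- in foreachBatch these can differ.
    batch_df.sparkSession.sql("SELECT COUNT(*) AS n FROM batch_view").show()

query = stream_df.writeStream.foreachBatch(process_batch).start()
```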
- 974 Views
- 0 replies
- 0 kudos
Databricks & Bigquery
Databricks is packaging an old version of the BigQuery jar (Databricks also repackaged it and created a fat jar), and our application needs the latest jar. The latest jar depends on the spark-bigquery-connector.properties file for the property scala.binary.vers...
- 2129 Views
- 1 replies
- 0 kudos
Unity catalog internal error - quality monitoring
I'm trying to get my head around the quality monitoring functionality in Unity Catalog. I configured it for one of the tables in our Unity Catalog. My assumption is that the profile and drift metrics tables are automatically created. But I get an internal e...
- 0 kudos
Hi, were you able to resolve this? I am having a similar issue. Thanks