- 6131 Views
- 7 replies
- 3 kudos
Writing a single huge dataframe into Azure SQL Database using JDBC
Hi All, I am currently trying to read data from a materialized view as a single dataframe, which contains around 10M rows, and then write it into an Azure SQL database. However, I don't see the Spark job making any progress even after an hour has passed. I have al...
- 3 kudos
Hi @yeungcase , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedba...
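For the JDBC write question above, a minimal sketch of the usual tuning knobs (repartitioning for parallel connections and batching the inserts) follows. The server, database, table, secret scope, and the `df` variable are placeholders, not details from the original post.

```python
# Hedged sketch: writing a large DataFrame to Azure SQL over JDBC.
# All connection details, secret names, and `df` are placeholders.
jdbc_url = (
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "database=<database>;encrypt=true;loginTimeout=30"
)

(df.repartition(32)                      # more partitions -> more parallel JDBC connections
   .write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.target_table")
   .option("user", dbutils.secrets.get("my-scope", "sql-user"))
   .option("password", dbutils.secrets.get("my-scope", "sql-password"))
   .option("batchsize", 10000)           # rows sent per batch insert
   .mode("append")
   .save())
```

If the write still stalls after tuning the partition count and batch size, the bottleneck is usually on the database side (indexes, triggers, or an undersized SQL tier) rather than in Spark.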
- 1110 Views
- 1 reply
- 0 kudos
Mounts in Databricks
How is it possible to prevent a certain user from seeing the mounts created in Databricks, so that even if the user issues the %fs ls /mnt command, it returns nothing?
- 0 kudos
Hi @Elcio , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback n...
- 7284 Views
- 6 replies
- 3 kudos
Driver and worker node utilisation
Hi all! Can anyone tell me whether having worker node(s) of the same type as my driver node makes a difference performance-wise if I am running normal Python code in a notebook as a job on this cluster? I am running mostly machine learning libraries such...
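On the utilisation question above, one point a short sketch can make concrete: plain Python libraries such as pandas or scikit-learn execute only on the driver, so worker nodes contribute nothing unless the work is routed through Spark. This is a minimal sketch assuming a Databricks notebook where `spark` is predefined; the data, column names, and model are illustrative only.

```python
import pandas as pd
from pyspark.sql.functions import pandas_udf
from sklearn.linear_model import LinearRegression

# Driver-only: ordinary pandas / scikit-learn code runs entirely on the driver node,
# so the size or number of worker nodes makes no difference here.
pdf = pd.DataFrame({"x": range(100), "y": range(0, 200, 2)})
model = LinearRegression().fit(pdf[["x"]], pdf["y"])

# Distributed: only work expressed through Spark (here a pandas UDF applied to a
# Spark DataFrame) is shipped to the workers, where node type starts to matter.
@pandas_udf("double")
def predict(x: pd.Series) -> pd.Series:
    return pd.Series(model.predict(x.to_frame("x")))

sdf = spark.range(100).withColumnRenamed("id", "x")
sdf.select(predict("x").alias("prediction")).show()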
- 3364 Views
- 2 replies
- 1 kudos
Resolved! Profile setup for databricks Account
Hi Team, we usually set up a profile for Databricks at the workspace level, and I have done this using the host URL and a token, which is working fine, like below: [DEFAULT] host = workspaceurl; username = ****; password = token. Now my question is how we set for databri...
- 1 kudos
You cannot create a PAT token for your Account Console authentication; in this case you will have 2 options to authenticate: using Basic auth, in which in your profile you set your username and password to log in, the same as you do to log in in the br...
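As one concrete illustration of the reply above, the sketch below authenticates against the account console with the Databricks SDK for Python instead of a workspace PAT. The account ID and the service-principal OAuth credentials are placeholders; basic auth with username/password, as mentioned in the reply, is the other option.

```python
# Hedged sketch: account-level authentication with the databricks-sdk.
# account_id, client_id and client_secret are placeholders for your own values.
from databricks.sdk import AccountClient

account = AccountClient(
    host="https://accounts.azuredatabricks.net",   # account console, not a workspace URL
    account_id="<account-id>",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-oauth-secret>",
)

# Quick smoke test: list the workspaces in the account.
for workspace in account.workspaces.list():
    print(workspace.workspace_name)
```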
- 1188 Views
- 0 replies
- 0 kudos
Databricks Asset Bundle Error 'KEY of the resource to run"
Hello Team, I am new to DAB and running it for the first time through the Databricks CLI. The bundle validation is successful, but running it errors out with error="expected a KEY of the resource to run". Can anyone help me with what to check to resolve t...
- 1180 Views
- 1 reply
- 1 kudos
Databricks community group in Kerala
Calling All Data Enthusiasts in Kerala! Hey everyone, I'm excited about the idea of launching a Databricks Community Group here in Kerala! This group would be a hub for learning, sharing knowledge, and networking among data enthusiasts, analysts, a...
- 1 kudos
Hi @AswinGovindan77 , Do you have an estimate of how many enthusiasts will be joining you? If you provide a rough number, we can create a group for you on Community!
- 6850 Views
- 6 replies
- 2 kudos
Can't create branch of public git repo
Hi, I have cloned a public git repo into my Databricks account. It's a repo associated with an online training course. I'd like to work through the notebooks, maybe make some changes and updates, etc., but I'd also like to keep a clean copy of it. M...
- 2 kudos
It occurs to me that one valid solution to this problem is simply to fork the repo and work there. Pretty standard approach, I guess, although not something I've ever been in the habit of doing.
- 1173 Views
- 1 reply
- 1 kudos
Resolved! DATABRICKS FESTIVAL VOUCHER
What timezone should we consider for earning the voucher?
- 1 kudos
@Aman_Dahiya Guessing you are referring to this event! There is no categorization by time zone; it is open to all. You can earn the voucher regardless of your timezone.
- 2631 Views
- 3 replies
- 2 kudos
Resolved! When does the cost of JOB COMPUTE start to be calculated?
I'm trying to run a workflow with job compute. Job compute stays pending for about 5 to 7 minutes before executing the workflow. I think it takes time to find a suitable instance in the cloud, configure the environment, install libraries, etc. An...
- 2 kudos
There is a big discrepancy to take note of here. It is the difference between Databricks job costs (DBUs) and billing from the cloud provider (the actual VM doing the work). Billing for the Databricks job starts when the Spark context is being ini...
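To verify what the reply above describes, DBU consumption per job can be inspected after the fact from the system billing table. This is a hedged sketch: it assumes Unity Catalog with the system.billing schema enabled, and the column names should be checked against the documentation for your workspace.

```python
# Hedged sketch: aggregating DBU usage per job from the system billing table.
usage = spark.sql("""
    SELECT usage_date,
           usage_metadata.job_id AS job_id,
           sku_name,
           SUM(usage_quantity)   AS dbus
    FROM system.billing.usage
    WHERE usage_metadata.job_id IS NOT NULL
    GROUP BY usage_date, usage_metadata.job_id, sku_name
    ORDER BY usage_date DESC
""")
display(usage)
```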
- 1308 Views
- 0 replies
- 0 kudos
Asset Bundle template: ignore file based on variable value
Hello, I am creating an asset-bundle template which should be able to include/exclude certain files depending on the value of a parameter, e.g. if the parameter "include_file" is set to true, then the bundle init command should include the template file(...
- 1456 Views
- 1 reply
- 1 kudos
Error with: %run ../Includes/Classroom-Setup-SQL
Hi Guys, I just started the ASP 2.1L Spark SQL Lab and I get this error when I run the first setup SQL command: %run ../Includes/Classroom-Setup-SQL. The execution of this command did not finish successfully. Python interpreter will be restarted. Python inte...
- 1 kudos
You can create a cluster that is compatible with the notebooks, and then it will work.
- 2267 Views
- 1 reply
- 1 kudos
Databricks CLI bundle run multiple jobs
Hi! I am using bundles to deploy various workflows, and as part of a CI pipeline I want to run integration tests. They are in notebooks that I deploy and run with the Databricks CLI bundle commands from Azure DevOps. This allows me to run only one job at a time as...
- 1 kudos
Hi @Retired_mod, and thanks for your reply. I see, so all tests will need to be defined in one config file under one job.
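If keeping the tests as separate jobs is still preferable, one alternative sketched below is to trigger each deployed job from the CI script with the Databricks SDK for Python instead of `databricks bundle run`. The job IDs are placeholders for whatever the bundle deployment created.

```python
# Hedged sketch: starting several deployed test jobs in parallel from CI and
# failing the pipeline if any of them does not succeed. Job IDs are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

w = WorkspaceClient()                 # host/token resolved from env vars or a CLI profile

test_job_ids = [111111, 222222]       # placeholder IDs of the deployed test jobs

waiters = [w.jobs.run_now(job_id=job_id) for job_id in test_job_ids]  # start all runs
for waiter in waiters:
    run = waiter.result()             # block until the run reaches a terminal state
    result = run.state.result_state
    print(f"run {run.run_id}: {result}")
    assert result == RunResultState.SUCCESS, f"run {run.run_id} failed"
```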
- 61504 Views
- 3 replies
- 3 kudos
Resolved! How to import excel on databricks
To import an Excel file into Databricks, you can follow these general steps:
1. **Upload the Excel File**:
- Go to the Databricks workspace or cluster where you want to work.
- Navigate to the location where you want to upload the Excel file.
- Click on ...
- 3 kudos
The question here is how to read multiple Excel files based on a path. The mentioned solution interacts with one file only; do we have the ability to read all the Excel files in a folder?
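On the follow-up question above about reading every Excel file in a folder, below is a minimal sketch that collects the files with pandas and combines them into a single Spark DataFrame. The /Volumes path is a placeholder, openpyxl (which pandas uses for .xlsx) has to be installed on the cluster, and the spark-excel library is another route if it is available.

```python
# Hedged sketch: reading all Excel files in a folder and unioning them.
# The folder path is a placeholder; openpyxl must be installed for .xlsx files.
import glob
import pandas as pd

paths = sorted(glob.glob("/Volumes/my_catalog/my_schema/my_volume/excel/*.xlsx"))

pdf = pd.concat((pd.read_excel(p) for p in paths), ignore_index=True)  # one pandas frame
df = spark.createDataFrame(pdf)                                        # hand it to Spark
display(df)
```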
- 813 Views
- 0 replies
- 0 kudos
Retry Trigger for Specific Errors and Custom Error States in Workflow UI
Hello everyone, in a workflow, is it possible to trigger a retry only for a specific error on a single task? I want the workflow UI to show a run as failed for both managed and unmanaged errors, but I don't want to trigger the retry for managed error...
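Job-level retries apply to any failure, so one workaround, sketched below, is to handle the selective retry inside the task itself: retry only the transient ("unmanaged") errors and let a managed error fail the run immediately. ManagedError and run_step are hypothetical names standing in for the poster's own code.

```python
# Hedged sketch: in-task retry that skips retries for known ("managed") errors.
# ManagedError and run_step() are hypothetical placeholders.
import time

class ManagedError(Exception):
    """An error we understand and never want to retry."""

def run_step():
    ...  # the real task logic goes here

def run_with_selective_retry(attempts: int = 3, delay_s: int = 30):
    for attempt in range(1, attempts + 1):
        try:
            return run_step()
        except ManagedError:
            raise                 # fail the run immediately; no retry
        except Exception:
            if attempt == attempts:
                raise             # retries exhausted; fail the run
            time.sleep(delay_s)   # assume transient; wait and retry

run_with_selective_retry()
```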
- 1906 Views
- 3 replies
- 2 kudos
Resolved! Try Databricks sign up failed
Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle.
- 2 kudos
Thank you so much for sharing, this is really helpful.
Labels
- .CSV (1)
- Access Data (2)
- Access Databricks (1)
- Access Delta Tables (2)
- Account reset (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (1)
- AI (1)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- API Documentation (3)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (2)
- Auto-loader (1)
- Autoloader (4)
- AWS (3)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (5)
- Azure data disk (1)
- Azure databricks (14)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (5)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (1)
- Blackduck (1)
- Bronze Layer (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Community Edition (3)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineering (3)
- Data Explorer (1)
- Data Ingestion & connectivity (1)
- Databrick add-on for Splunk (1)
- databricks (2)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (2)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (2)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks notebook (2)
- Databricks Notebooks (3)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- DatabricksJobCluster (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta (22)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free trial (1)
- GCP Databricks (1)
- GenAI (1)
- Getting started (2)
- Google Bigquery (1)
- HIPAA (1)
- import (1)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (2)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model Training (1)
- Module (1)
- Networking (1)
- Notebook (1)
- Onboarding Trainings (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Schedule (1)
- Serverless (3)
- Session (1)
- Sign Up Issues (2)
- Spark (3)
- Spark Connect (1)
- sparkui (2)
- Splunk (2)
- SQL (8)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (1)
- Venicold (3)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)