- 6205 Views
- 1 replies
- 0 kudos
How to access storage with private endpoint
We know that Databricks with VNet injection (our own VNet) allows us to connect to Blob Storage / ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...
- 0 kudos
Hey @jx1226, were you able to solve this at the customer? I am currently struggling with the same issues here.
- 4513 Views
- 0 replies
- 0 kudos
Installing R packages for a custom Docker container for compute
Hi, I'm trying to create a custom Docker image with some R packages pre-installed. However, when I try to use it in a notebook, it can't seem to find the installed packages. The build runs fine.
FROM databricksruntime/rbase:14.3-LTS
## update system li...
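For reference, a minimal Dockerfile sketch along the same lines (the package names and system libraries are illustrative assumptions, not taken from the post): installing the R packages during the image build, then verifying with .libPaths() in a notebook that the library they landed in is actually on the session's search path.

```dockerfile
FROM databricksruntime/rbase:14.3-LTS

# System libraries many R packages need to compile (assumption: adjust for your packages)
RUN apt-get update \
 && apt-get install -y --no-install-recommends libcurl4-openssl-dev libssl-dev libxml2-dev \
 && rm -rf /var/lib/apt/lists/*

# Install R packages at build time; if the notebook later reports them missing,
# compare the install location here with .libPaths() inside the running cluster,
# since packages installed into a library not on that path will not be found.
RUN R -e 'install.packages(c("data.table", "jsonlite"), repos = "https://cloud.r-project.org")'
```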
- 1471 Views
- 0 replies
- 0 kudos
DLT CDC/SCD - Taking the latest ID per day
Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1, taking the latest record using a datetime column; this works with no issues:
@dlt.view
def users():
    return spark.readStream.table("source_table")
dlt.create_streaming_table(...
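A minimal sketch of that pattern, with illustrative column names (user_id as the key and updated_at as the event datetime are assumptions, not from the post): apply_changes with sequence_by keeps the latest record per key, and adding a derived date column to the keys is one way to keep the latest record per key per day.

```python
import dlt
from pyspark.sql import functions as F

@dlt.view
def users():
    # Derive an event-date column so a per-day grain is available downstream
    return (
        spark.readStream.table("source_table")
        .withColumn("event_date", F.to_date("updated_at"))
    )

dlt.create_streaming_table("users_latest")

dlt.apply_changes(
    target="users_latest",
    source="users",
    keys=["user_id", "event_date"],   # one row per user per day
    sequence_by=F.col("updated_at"),  # the latest record for that day wins
    stored_as_scd_type=1,
)
```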
- 2544 Views
- 0 replies
- 0 kudos
What happened to the JobIds in the parallel runs (again)????
Hey Databricks, why did you take away the job IDs from the parallel runs? We use those to identify which output goes with which run. Please put them back. Benedetta
- 3329 Views
- 3 replies
- 0 kudos
Error ingesting zip files: ExecutorLostFailure Reason: Command exited with code 50
Hi, we are trying to ingest zip files into the Azure Databricks Delta lake using the COPY INTO command. There are 100+ zip files with an average size of ~300 MB each. Cluster configuration: 1 driver: 56 GB, 16 cores; 2-8 workers: 32 GB, 8 cores (each). Autoscaling enab...
- 0 kudos
Although we were able to copy the zip files onto the Databricks volume, we were not able to share them with any system outside of the Databricks environment. I guess Delta Sharing does not support sharing files that are on UC volumes.
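For context, a hedged sketch of the ingestion described in the question (the target table and source path are placeholders, and the target Delta table is assumed to already exist): COPY INTO with FILEFORMAT = BINARYFILE loads each zip file as a single binary row, so ~300 MB archives can put real pressure on executor memory, which may relate to the ExecutorLostFailure seen here.

```python
# Placeholder catalog/schema/table and storage path; the target table must exist.
spark.sql("""
  COPY INTO main.raw.zip_files
  FROM 'abfss://landing@<storage-account>.dfs.core.windows.net/zips/'
  FILEFORMAT = BINARYFILE
  PATTERN = '*.zip'
""")
```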
- 2289 Views
- 0 replies
- 1 kudos
Not able to access data registered in Unity Catalog using Simba ODBC driver
Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using the Simba ODBC driver version 2.8.0.1002. They mentioned ...
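A hedged connection sketch via pyodbc (the host, HTTP path, and token are placeholders, and the exact properties accepted depend on the driver version): the Databricks/Simba ODBC driver lets you point the session at a Unity Catalog catalog and schema through the Catalog and Schema connection properties.

```python
import pyodbc

# Placeholder values; AuthMech=3 means user "token" plus a personal access token.
conn = pyodbc.connect(
    "Driver=Simba Spark ODBC Driver;"
    "Host=adb-1234567890123456.7.azuredatabricks.net;"
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abcdef1234567890;"
    "SSL=1;"
    "ThriftTransport=2;"
    "AuthMech=3;"
    "UID=token;"
    "PWD=<personal-access-token>;"
    "Catalog=main;"
    "Schema=default;",
    autocommit=True,
)

cursor = conn.cursor()
cursor.execute("SELECT current_catalog(), current_schema()")
print(cursor.fetchall())
```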
- 6221 Views
- 3 replies
- 4 kudos
Resolved! Unable to read available file after downloading
After downloading a file using `wget`, I'm attempting to read it with spark.read.json. I am getting the error: PATH_NOT_FOUND - Path does not exist: dbfs:/tmp/data.json. SQLSTATE: 42K03, File <command-3327713714752929>, line 2. I have checked that the file does exist...
- 4 kudos
Hi @sharma_kamal, good day! Could you please try the code below suggested by @ThomazRossito; it will help you. Also, please refer to the following document on working with files on Databricks: https://docs.databricks.com/en/files/index.html Please l...
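For later readers, a minimal sketch of the usual fix, with illustrative paths (this is not the code referenced above): wget writes to the driver's local disk, so a dbfs:/ path will not resolve until the file is copied into DBFS (or a UC volume).

```python
import subprocess

# Download to the driver's local filesystem (URL and paths are placeholders)
subprocess.run(
    ["wget", "-O", "/tmp/data.json", "https://example.com/data.json"],
    check=True,
)

# Copy into DBFS so every executor can read the file
dbutils.fs.cp("file:/tmp/data.json", "dbfs:/tmp/data.json")

df = spark.read.json("dbfs:/tmp/data.json")
df.printSchema()
```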
- 903 Views
- 0 replies
- 0 kudos
Databricks job compute price w.r.t. running time
I have two workflows (jobs) in Databricks (AWS) with the cluster specs below (job cluster, NOT general purpose): Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete; Job 2 takes 50 min to complete. Questions: DBU cost is sam...
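A rough back-of-the-envelope sketch of how runtime enters the cost (the DBU rate per node and the $/DBU price are placeholder assumptions, not real list prices): the hourly DBU rate is the same for both jobs, but billing is prorated over runtime, so the 50-minute job costs roughly five times the 10-minute one on the same cluster size.

```python
# Placeholder rates; check the actual DBU rating for i3.xlarge and your
# Jobs Compute $/DBU price. EC2 instance cost is billed separately by AWS
# and also scales with runtime.
DBU_PER_NODE_HOUR = 1.0
USD_PER_DBU = 0.15
NODES = 1 + 2  # driver + minimum workers; autoscaling can raise this

def job_cost_usd(runtime_minutes: float, nodes: int = NODES) -> float:
    hours = runtime_minutes / 60
    return nodes * DBU_PER_NODE_HOUR * hours * USD_PER_DBU

print(f"Job 1 (10 min): ~${job_cost_usd(10):.2f}")
print(f"Job 2 (50 min): ~${job_cost_usd(50):.2f}")
```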
- 4367 Views
- 0 replies
- 0 kudos
Spark Handling White Space as NULL
I have a very strange thing happening. I'm importing a CSV file, and nulls and blanks are being interpreted correctly. What is strange is that a column that regularly has a single-space character value is having the single space converted to null. I'...
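A small diagnostic sketch, with an illustrative path and column name: reading with the CSV whitespace-trimming options set explicitly makes it easier to see whether the single-space value is being trimmed to an empty string and then mapped to null.

```python
# "code" and the path are placeholders for the suspect column and file.
df = (
    spark.read
    .option("header", "true")
    .option("ignoreLeadingWhiteSpace", "false")
    .option("ignoreTrailingWhiteSpace", "false")
    .csv("dbfs:/tmp/sample.csv")
)

# Inspect how the suspect column actually arrives
df.selectExpr("code", "length(code) AS len", "code IS NULL AS is_null").show()
```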
- 1758 Views
- 1 replies
- 0 kudos
How to identify the worker and driver instances in the AWS console for a Databricks cluster?
For AWS Databricks, I have configured 1 worker and 1 driver node with the same node type. In the AWS console, all details are the same for the two instances; only the instance IDs differ. How do I identify which instance ID is for the worker and which one is for the d...
- 0 kudos
Hi @Retired_mod, thanks for your response. There is no instance pool ID configured for the cluster, so how will I be able to differentiate them? Could you give an alternative way of finding the driver instance ID and worker instance ID in the AWS con...
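One alternative that does not rely on instance pools, sketched with the Databricks Python SDK (the cluster id is a placeholder, and it assumes the cluster details returned for AWS clusters include the EC2 instance id per node): the driver and executors are listed separately, so their instance ids can be matched against the AWS console.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes workspace authentication is already configured
cluster = w.clusters.get(cluster_id="0123-456789-abcdefgh")  # placeholder id

print("driver instance:", cluster.driver.instance_id)
for node in cluster.executors or []:
    print("worker instance:", node.instance_id)
```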
- 2891 Views
- 0 replies
- 0 kudos
Usage of if/else condition for data checks
Hi, in a particular Workflows job, I am trying to add some data checks in between each task by using an If/else condition. I used the following statement in a notebook to pass a parameter into the if/else condition to check the logic: {"job_id": XXXXX, "notebook_params": ...
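A hedged sketch of one way to wire this up (the table name, task name, and key are illustrative): the data-check notebook publishes a task value, and the Workflows If/else condition task then compares a dynamic reference such as {{tasks.data_check.values.rows_ok}} against true.

```python
# Runs inside the data-check notebook task (named "data_check" in this sketch).
row_count = spark.table("main.bronze.events").count()

# Publish the result so a downstream If/else condition task can reference it
# as {{tasks.data_check.values.rows_ok}}
dbutils.jobs.taskValues.set(key="rows_ok", value=bool(row_count > 0))
```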
- 2693 Views
- 1 replies
- 1 kudos
Databricks Repos API Limitations
Hi, I have started using Databricks recently, and I'm not able to find the right solution in the documentation. I have linked multiple repos in my Databricks workspace in the Repos folders, and I want to update the repos from the remote Azure DevOps reposit...
- 1 kudos
Hi @Ha2001, good day! The Databricks API has a limit of 10 requests per second for the combined /repos/* requests in the workspace. You can check the documentation below for the API limit: https://docs.databricks.com/en/resources/limits.html#:~:text=Git%20fold...
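A hedged sketch of updating several linked repos while staying under that limit, using the Databricks Python SDK (the repo ids and branch are placeholders, and it assumes the databricks-sdk package is available):

```python
import time
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
repo_ids = [123456789, 987654321]  # placeholder repo ids

for repo_id in repo_ids:
    # Pull the repo to the head of the given branch
    w.repos.update(repo_id=repo_id, branch="main")
    time.sleep(0.2)  # simple throttle: ~5 requests/second, well under the limit
```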
- 4928 Views
- 4 replies
- 1 kudos
Resolved! Spark streaming query stops after code exception in notebook since 14.3
Hi! I am experiencing something that I cannot find in the documentation: in Databricks, using Databricks Runtime 13.x, when I start a streaming query (using the .start method), it creates a new query, and while it is running I can execute other code in...
- 1 kudos
You can use the help portal: https://help.databricks.com/s/
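For later readers, a minimal sketch of the non-blocking pattern the question describes (table names and checkpoint path are illustrative): starting the query returns a StreamingQuery handle immediately, so other cells can run while it is active; whether an exception elsewhere in the notebook stops that query is the behavior difference being asked about.

```python
# Start a streaming query without blocking the notebook (placeholder names).
query = (
    spark.readStream.table("main.bronze.events")
    .writeStream
    .option("checkpointLocation", "/tmp/checkpoints/events_mirror")
    .toTable("main.silver.events_mirror")
)

print(query.id, query.isActive)  # handle stays available while other code runs
# query.stop()  # stop explicitly when finished
```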
- 1690 Views
- 1 replies
- 1 kudos
Why so many different domains and accounts?
I've lost count of how many different domains and accounts Databricks requires me to use for their services. Every domain requires its own account username, password, etc., and nothing is synced. I can't even keep track of which email addre...
- 1 kudos
Plus: customer-academy.databricks.com, accounts.cloud.databricks.com, databricks.my.site.com
- 2574 Views
- 0 replies
- 0 kudos
How is model drift calculated when the baseline table has no timestamp column?
I am trying to understand how Databricks computes model drift when a baseline table is available. What I understood from the documentation is that Databricks processes both the primary and the baseline tables according to the specified granularities in th...