- 1008 Views
- 5 replies
- 0 kudos
S360 Eliminate SPN secrets - Connect Azure Databricks to ADLS Gen2, Gen1 via custom AD token
Hi Team, in Azure Databricks we currently use a Service Principal when creating mount points to Azure storage (ADLS Gen1, ADLS Gen2 and Azure Blob Storage). As part of an S360 action to eliminate SPN secrets, we were asked to move to SPN+certificate / MS...
- 0 kudos
@ramesitexp Yes, @szymon_dybczak is correct. For now the only valid options are:
- OAuth 2.0 with a Microsoft Entra ID service principal
- Shared access signatures (SAS)
- Account keys
For now we are using OAuth 2.0 with a Microsoft Entra ID service principal...
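For reference, direct access with that first option usually comes down to a handful of Spark configs; a minimal sketch, assuming a hypothetical storage account name and a client secret kept in a Databricks secret scope:

```python
# Minimal sketch: OAuth 2.0 access to ADLS Gen2 with a Microsoft Entra ID
# service principal. Storage account, secret scope/key, app ID, and tenant ID
# are hypothetical placeholders. spark and dbutils exist in Databricks notebooks.
storage = "mystorage"
spark.conf.set(f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net",
               "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}.dfs.core.windows.net",
               dbutils.secrets.get(scope="kv-scope", key="sp-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Read directly over abfss:// instead of through a mount point
df = spark.read.text(f"abfss://container@{storage}.dfs.core.windows.net/path/file.txt")
```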
- 294 Views
- 1 replies
- 0 kudos
Free trial account
Hi, can I create a Unity Catalog using a free trial account?
- 0 kudos
Hi @Spyro_3, yeah, you should be able to create a Unity Catalog. The trial version allows you to create a premium workspace, which is required for Unity Catalog. Note that to set up the metastore you also need Global Administrator permission (if we are ...
- 1135 Views
- 7 replies
- 3 kudos
Writing a single huge dataframe into Azure SQL Database using JDBC
Hi All, I am currently trying to read data from a materialized view as a single dataframe, which contains around 10M rows, and then write it into an Azure SQL database. However, I don't see the Spark job moving at all even after an hour has passed. I have al...
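A common first tuning step for a stalled JDBC write like this is to parallelize it explicitly and batch the inserts; a minimal sketch, with the server, table, partition count, and secret names as hypothetical placeholders:

```python
# Minimal sketch: parallel JDBC write to Azure SQL. URL, credentials, table
# name, and partition count are hypothetical; df is the source dataframe
# read from the materialized view.
jdbc_url = ("jdbc:sqlserver://myserver.database.windows.net:1433;"
            "database=mydb;encrypt=true")

(df.repartition(16)                      # one JDBC connection per partition
   .write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.target_table")
   .option("user", "sql_user")
   .option("password", dbutils.secrets.get(scope="kv-scope", key="sql-pw"))
   .option("batchsize", 10000)           # rows per round trip to the database
   .mode("append")
   .save())
```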
- 3 kudos
Hi @yeungcase, thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedba...
- 738 Views
- 3 replies
- 1 kudos
Private Python Package in Serverless Job
I am trying to create a Databricks job using serverless compute. I am using a wheel file to run the Python job. The wheel file has a setup.py file with which all dependencies are installed. One of the package dependencies is a private package hosted on Git...
- 1 kudos
Hi @vinitkhandelwal, When using Serverless Compute in Databricks, you’re right that there’s no direct option to add init scripts. However, you can still achieve your goal of installing a private package hosted on Gitlab Python Package Registry.
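A minimal sketch of that idea, using a notebook-scoped install against Gitlab's PyPI-compatible package registry; the project ID, token, and package name are hypothetical placeholders:

```python
# Minimal sketch: notebook-scoped install from Gitlab's PyPI-compatible
# registry. Project ID, personal access token, and package name are
# hypothetical placeholders to replace with your own values.
%pip install myprivatepackage --index-url https://__token__:<personal-access-token>@gitlab.com/api/v4/projects/<project-id>/packages/pypi/simple
```

For a wheel-based serverless job (rather than an interactive notebook), the equivalent would be declaring the package and index URL in the job environment's dependencies; verify that against the current serverless docs.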
- 778 Views
- 3 replies
- 2 kudos
Resolved! When does the cost of JOB COMPUTE start to be calculated?
I'm trying to run a workflow with job compute. Job compute needs to be pending for about 5 to 7 minutes before executing the workflow. I think it takes time to find a suitable instance in the cloud, configure the environment, install libraries, etc. An...
- 2 kudos
There is a big discrepancy to take note of here: the difference between Databricks job costs (DBUs) and billing from the cloud provider (the actual VM doing the work). Billing for the Databricks job starts when the Spark context is being ini...
- 679 Views
- 2 replies
- 1 kudos
Error with: %run ../Includes/Classroom-Setup-SQL
Hi guys, I just started the ASP 2.1L Spark SQL Lab and I get this error when I run the first setup SQL command:
%run ../Includes/Classroom-Setup-SQL
The execution of this command did not finish successfully
Python interpreter will be restarted.
Python inte...
- 1 kudos
You can create a cluster that is compatible with the notebooks, so it will work.
- 1053 Views
- 2 replies
- 0 kudos
Azure Synapse vs Databricks
Hi team, could you kindly provide your perspective on the cost and performance comparison between Azure Synapse and Databricks SQL Warehouse/serverless, as well as their respective use cases? Thank you.
- 0 kudos
Agree with @mhiltner, it doesn't make sense to compare it with Synapse, as it's literally dead. You most likely want to compare it to Fabric instead. Fabric is still under heavy development, but IMHO it still lags behind other Data/AI solutions. No catal...
- 30749 Views
- 4 replies
- 3 kudos
Resolved! How to import excel on databricks
To import an Excel file into Databricks, you can follow these general steps:
1. **Upload the Excel File**:
- Go to the Databricks workspace or cluster where you want to work.
- Navigate to the location where you want to upload the Excel file.
- Click on ...
- 3 kudos
The question here is how to read multiple Excel files based on a path. The mentioned solution interacts with one file only; do we have the ability to read all the Excel files in a folder?
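One hedged way to handle a whole folder is to glob the mounted path with pandas and union the results; a minimal sketch, assuming a hypothetical folder path and that openpyxl is installed on the cluster:

```python
# Minimal sketch: read every Excel file in a folder into one Spark dataframe.
# The folder path is a hypothetical placeholder; requires the openpyxl package.
import glob

import pandas as pd

files = glob.glob("/dbfs/mnt/blob/excel_folder/*.xlsx")
pdf = pd.concat((pd.read_excel(f) for f in files), ignore_index=True)
df = spark.createDataFrame(pdf)
display(df)
```

This assumes the files share a schema; for very large folders, a distributed reader such as the spark-excel library may scale better than collecting through pandas.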
- 302 Views
- 1 replies
- 0 kudos
CLI is not helpful in exporting. Error: expected to have the absolute path of the object or directory
I am trying to export a job as a DAB in order to create an Asset Bundle according to this: https://community.databricks.com/t5/data-engineering/databricks-asset-bundle-dab-from-existing-workspace/td-p/49309 I am on Windows 10 Pro x64 with Databricks CLI v0.223...
- 0 kudos
Yes, the error message is not very meaningful. I believe it's about how you need to pass the absolute path on a Windows system. Can you try these approaches?
* --file "c:\\mydba"
* --file "c:\mydba"
- 644 Views
- 3 replies
- 2 kudos
Resolved! Try Databricks sign up failed
Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle.
- 2 kudos
Thank you so much for sharing, this is really helpful.
- 4793 Views
- 6 replies
- 0 kudos
Streaming xls files Using Auto Loader
Hello, is there a way to read .xls files using Auto Loader, or is there any workaround, since Excel files are not supported by Auto Loader per the following document? https://docs.databricks.com/en/ingestion/auto-loader/options.html Thanks.
- 0 kudos
I am facing the same issue: I have a stream that I'd like to use Auto Loader on with an .xlsx file. Is there any update on workarounds for this issue?
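One workaround sometimes used (an assumption on my part, not an official Auto Loader feature for Excel) is to let Auto Loader track the files as binary and parse them with pandas inside foreachBatch; a minimal sketch with hypothetical paths and table name:

```python
# Minimal sketch: Auto Loader ingests .xlsx as binary; pandas parses each file.
# Source path, checkpoint path, and table name are hypothetical placeholders;
# requires the openpyxl package on the cluster.
import io

import pandas as pd

def parse_excel_batch(batch_df, batch_id):
    # binaryFile rows carry the file path and its raw bytes
    for row in batch_df.select("path", "content").collect():
        pdf = pd.read_excel(io.BytesIO(row["content"]))
        spark.createDataFrame(pdf).write.mode("append").saveAsTable("bronze.excel_raw")

(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")
    .option("pathGlobFilter", "*.xlsx")
    .load("abfss://landing@mystorage.dfs.core.windows.net/excel")
    .writeStream
    .option("checkpointLocation", "/mnt/checkpoints/excel_raw")
    .foreachBatch(parse_excel_batch)
    .start())
```

Collecting file contents to the driver only suits modestly sized workbooks; it trades scalability for Auto Loader's incremental file tracking.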
- 596 Views
- 1 replies
- 0 kudos
File Not Found Error while reading pickle file
Hello there, I have a pickle file uploaded to a mounted location in Databricks (/dbfs/mnt/blob/test.pkl). I am trying to read this pickle file using the Python snippet below:
with open(path + "test.pkl", "rb") as f:
    bands = pickle.load(f)
But it t...
- 0 kudos
Hi @divyasri1504, make sure you're using the correct path to access the file. In Databricks, you should typically prefix everything with /dbfs (or dbfs:/ for native functions). Try using the full path like this: with open("/dbfs/mnt/blob/test.pkl...
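Completing that truncated snippet, a minimal sketch using the FUSE-style path from the question:

```python
# Minimal sketch: read a pickle from a DBFS mount via the FUSE path.
# Python file I/O needs the /dbfs prefix; dbfs:/ is for Spark-native APIs.
import pickle

path = "/dbfs/mnt/blob/"
with open(path + "test.pkl", "rb") as f:
    bands = pickle.load(f)
```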
- 4234 Views
- 2 replies
- 0 kudos
Resolved! Using private package, getting ERROR: No matching distribution found for myprivatepackage
My project's setup.py file:
from setuptools import find_packages, setup
PACKAGE_REQUIREMENTS = ["pyyaml", "confluent-kafka", "fastavro", "python-dotenv", "boto3", "pyxlsb", "aiohttp", "myprivatepackage"]
LOCAL_REQUIREMENTS = ["delta-spark", "scikit-lea...
- 0 kudos
Hi, does this look like a dependency error? Are all the dependencies packed in the whl? Also, could you please confirm that all the limitations are satisfied? Refer to: https://docs.databricks.com/en/compute/access-mode-limitations.html
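To verify what the wheel actually declares (every declared dependency must be resolvable at install time, including the private one), one hedged check is to read its METADATA; a minimal sketch with a hypothetical wheel path:

```python
# Minimal sketch: list the dependencies a wheel declares.
# The wheel path is a hypothetical placeholder.
import zipfile

whl = "dist/myproject-0.1.0-py3-none-any.whl"
with zipfile.ZipFile(whl) as z:
    metadata = next(n for n in z.namelist() if n.endswith("METADATA"))
    for line in z.read(metadata).decode().splitlines():
        if line.startswith("Requires-Dist"):
            print(line)  # each entry must be installable from an index the cluster can reach
```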
- 934 Views
- 2 replies
- 0 kudos
Is DBFS going to be deprecated?
Is DBFS going to be deprecated? I am using the /dbfs/FileStore/tables/ location, where a jar file is stored, and I am copying this jar file to the /databricks/jars location. My concern is: since DBFS root and mounts are deprecated, does that mean that in the coming days...
- 0 kudos
Hi Raphael, I am trying the init script below to achieve this task (PFA) and am getting the error below: Cluster scoped init script abfss://container@storage.dfs.core.windows.net/init_script.sh failed: Failure to initialize configuration for storage account stor...
- 4933 Views
- 7 replies
- 0 kudos
Is it possible to view Databricks cluster metrics using REST API
I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system space using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...
- 0 kudos
At my company we are also interested in this feature; is there an ETA?
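For what the public REST API exposes today, the Clusters API returns configuration and state rather than live utilization; a minimal sketch with a hypothetical workspace URL, token, and cluster ID:

```python
# Minimal sketch: query cluster state via the Clusters API. Host, token, and
# cluster ID are hypothetical placeholders; note the response carries cluster
# metadata and state, not CPU/memory utilization metrics.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<personal-access-token>"

resp = requests.get(
    f"{host}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"cluster_id": "0101-120000-abcd1234"},
)
resp.raise_for_status()
print(resp.json()["state"])
```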