Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

rameshkumar610
by New Contributor
  • 1008 Views
  • 5 replies
  • 0 kudos

S360 Eliminate SPN secrets - Connect Azure Databricks to ADLS Gen2, Gen1 via custom AD token

Hi Team, In Azure Databricks we currently use a Service Principal when creating Mount Points to Azure storage (ADLS Gen1, ADLS Gen2, and Azure Blob Storage). As part of an S360 action to eliminate SPN secrets, we were asked to move to SPN+certificate / MS...

Latest Reply
iakshaykr
New Contributor III
  • 0 kudos

@ramesitexp Yes, @szymon_dybczak is correct. For now the only valid options are: OAuth 2.0 with a Microsoft Entra ID service principal, shared access signatures (SAS), and account keys. For now we are using OAuth 2.0 with a Microsoft Entra ID service principal...

4 More Replies
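A minimal sketch of the OAuth 2.0 + Microsoft Entra ID service principal option mentioned in the reply above, using direct ABFS access on the session instead of a mount; the storage account name, container, secret scope, and IDs are placeholder assumptions:

```python
# Hypothetical values: swap in your own storage account, app registration, and secret scope.
storage_account = "mystorageacct"
tenant_id = "<tenant-id>"
client_id = "<application-client-id>"
client_secret = dbutils.secrets.get(scope="kv-scope", key="sp-secret")  # assumed scope/key

# Per-account OAuth config on the ABFS driver (set on the session, not baked into a mount).
suffix = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

df = spark.read.parquet(f"abfss://mycontainer@{suffix}/path/to/data")
```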
Spyro_3
by New Contributor
  • 294 Views
  • 1 reply
  • 0 kudos

Free trial account

Hi, can I create a Unity Catalog using a free trial account?

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @Spyro_3, Yeah, you should be able to create Unity Catalog. The trial version allows you to create a premium workspace, which is required for Unity Catalog. Notice that to set up the metastore you also need to have the Global Administrator permission (if we are ...

yeungcase
by New Contributor III
  • 1135 Views
  • 7 replies
  • 3 kudos

Writing a single huge dataframe into Azure SQL Database using JDBC

Hi All, I am currently trying to read data from a materialized view as a single dataframe containing around 10M rows and then write it into an Azure SQL database. However, I don't see the Spark job moving at all even after an hour has passed. I have al...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 3 kudos

Hi @yeungcase, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedba...

6 More Replies
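For the stalled JDBC write in the thread above, a sketch (not the thread's exact code) of a partitioned, batched write, which usually stops the job from hanging on a single task; the connection string, table name, secret names, and partition count are assumptions:

```python
# Hypothetical Azure SQL connection details.
jdbc_url = ("jdbc:sqlserver://myserver.database.windows.net:1433;"
            "database=mydb;encrypt=true;loginTimeout=30")

(df.repartition(16)  # assumption: split the ~10M rows into parallel write tasks
   .write.format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.target_table")  # hypothetical target table
   .option("user", dbutils.secrets.get("kv-scope", "sql-user"))
   .option("password", dbutils.secrets.get("kv-scope", "sql-password"))
   .option("batchsize", 10000)             # bigger batches, fewer round trips
   .mode("append")
   .save())
```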
vinitkhandelwal
by New Contributor III
  • 738 Views
  • 3 replies
  • 1 kudos

Private Python Package in Serverless Job

I am trying to create a Databricks Job using Serverless Compute. I am using a wheel file to run the Python job. The wheel file has a setup.py file, with which all dependencies are installed. One of the package dependencies is a private package hosted on Git...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @vinitkhandelwal, When using Serverless Compute in Databricks, you’re right that there’s no direct option to add init scripts. However, you can still achieve your goal of installing a private package hosted on Gitlab Python Package Registry.

2 More Replies
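One hedged sketch of the approach @Kaniz_Fatma outlines: point pip at the private GitLab package registry at runtime before the job code imports the package. The project ID, secret scope, token name, and package name are hypothetical:

```python
import subprocess
import sys

# Assumption: a GitLab read token is stored in a Databricks secret scope.
token = dbutils.secrets.get(scope="kv-scope", key="gitlab-read-token")
index_url = (f"https://__token__:{token}@gitlab.com/api/v4/projects/"
             "<project-id>/packages/pypi/simple")  # hypothetical project ID

# Install the private dependency before the job code imports it.
subprocess.check_call([sys.executable, "-m", "pip", "install",
                       "--extra-index-url", index_url, "myprivatepackage"])
```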
himanmon
by New Contributor III
  • 778 Views
  • 3 replies
  • 2 kudos

Resolved! When does the cost of JOB COMPUTE start to be calculated?

I'm trying to run a workflow with job compute. Job compute needs to be pending for about 5 to 7 minutes before executing the workflow. I think it takes time to find a suitable instance in the cloud, configure the environment, install libraries, etc. An...

Latest Reply
jacovangelder
Honored Contributor
  • 2 kudos

There is a big discrepancy to take note of here: the difference between Databricks job costs (DBUs) and billing from the cloud provider (the actual VM doing the work). Billing for the Databricks job starts when the Spark context is being ini...

2 More Replies
SHS
by New Contributor
  • 679 Views
  • 2 replies
  • 1 kudos

Error with: %run ../Includes/Classroom-Setup-SQL

Hi Guys, Just started the ASP 2.1L Spark SQL Lab and I get this error when I run the first setup SQL command %run ../Includes/Classroom-Setup-SQL: The execution of this command did not finish successfully. Python interpreter will be restarted. Python inte...

Latest Reply
cibele
New Contributor II
  • 1 kudos

You can create a cluster compatible with the notebooks, so it will work.

1 More Replies
Phani1
by Valued Contributor II
  • 1053 Views
  • 2 replies
  • 0 kudos

Azure Synapse vs Databricks

Hi team, Could you kindly provide your perspective on the cost and performance comparison between Azure Synapse and Databricks SQL Warehouse/serverless, as well as their respective use cases? Thank you.

Latest Reply
Witold
Contributor III
  • 0 kudos

Agree with @mhiltner, it doesn't make sense to compare it with Synapse, as it's literally dead. You most likely want to compare it to Fabric instead. Fabric is heavily under development, but IMHO it still lags behind other Data/AI solutions. No catal...

1 More Replies
OnerFusion-AI
by New Contributor
  • 30749 Views
  • 4 replies
  • 3 kudos

Resolved! How to import excel on databricks

To import an Excel file into Databricks, you can follow these general steps:
1. **Upload the Excel File**:
- Go to the Databricks workspace or cluster where you want to work.
- Navigate to the location where you want to upload the Excel file.
- Click on ...

Latest Reply
AhmedAlnaqa
New Contributor III
  • 3 kudos

The question here is how to read multiple Excel files based on a path. The mentioned solution interacts with one file only; do we have the ability to read all the Excel files in a folder?

3 More Replies
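To the follow-up question about reading every Excel file in a folder, a sketch assuming pandas and openpyxl are installed on the cluster and all files share a schema; the folder path is a placeholder:

```python
import glob
import pandas as pd

folder = "/dbfs/mnt/data/excel_files"  # hypothetical mounted location
frames = [pd.read_excel(p) for p in glob.glob(f"{folder}/*.xlsx")]

# Stack the per-file frames and hand the result to Spark.
combined = spark.createDataFrame(pd.concat(frames, ignore_index=True))
```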
vanagnostopoulo
by New Contributor II
  • 302 Views
  • 1 reply
  • 0 kudos

CLI export fails with "Error: expected to have the absolute path of the object or directory"

I try to export a job as a DAB in order to create an Asset Bundle according to this: https://community.databricks.com/t5/data-engineering/databricks-asset-bundle-dab-from-existing-workspace/td-p/49309. I am on Windows 10 Pro x64 with Databricks CLI v0.223...

Latest Reply
Witold
Contributor III
  • 0 kudos

Yes, the error message is not very meaningful. I believe it's about how you need to pass the absolute path on Windows systems. Can you try these approaches?
* --file "c:\\mydba"
* --file "c:\mydba"

LLLMMM
by New Contributor III
  • 644 Views
  • 3 replies
  • 2 kudos

Resolved! Try Databricks sign up failed

Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle. 

Latest Reply
lovinchanglei
New Contributor II
  • 2 kudos

Thank you so much for sharing, this is really helpful.

2 More Replies
YS1
by Contributor
  • 4793 Views
  • 6 replies
  • 0 kudos

Streaming xls files Using Auto Loader

Hello, Is there a way to read .xls files using Auto Loader, or is there any workaround, since Excel files are not supported by Auto Loader per the following document? https://docs.databricks.com/en/ingestion/auto-loader/options.html Thanks.

Get Started Discussions
auto_loader
streaming
Latest Reply
Sicnarf
New Contributor II
  • 0 kudos

I am facing the same issue. I have a stream that I'd like to use Auto Loader on with an .xlsx file. Is there any update on workarounds for this issue?

5 More Replies
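One workaround that is sometimes used, sketched here under assumptions (hypothetical paths and table name, pandas and openpyxl installed): ingest the workbooks with Auto Loader's binaryFile format and parse each micro-batch with pandas:

```python
import io
import pandas as pd

def parse_excel_batch(batch_df, batch_id):
    # Each row carries the raw bytes of a newly discovered file.
    for row in batch_df.select("path", "content").collect():
        pdf = pd.read_excel(io.BytesIO(row["content"]))
        spark.createDataFrame(pdf).write.mode("append").saveAsTable("bronze.excel_data")

(spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "binaryFile")
      .option("pathGlobFilter", "*.xlsx")
      .load("/mnt/landing/excel/")  # hypothetical landing path
      .writeStream
      .option("checkpointLocation", "/mnt/checkpoints/excel/")
      .foreachBatch(parse_excel_batch)
      .start())
```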
divyasri1504
by New Contributor
  • 596 Views
  • 1 reply
  • 0 kudos

File Not Found Error while reading pickle file

Hello there, I have a pickle file uploaded to a mounted location in Databricks (/dbfs/mnt/blob/test.pkl). I am trying to read this pickle file using the below Python snippet:
with open(path + "test.pkl", "rb") as f:
    bands = pickle.load(f)
But it t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @divyasri1504 , Make sure you’re using the correct path to access the file. In Databricks, you should typically prefix everything with /dbfs (or dbfs:/ for native functions). Try using the full path like this: with open("/dbfs/mnt/blob/test.pkl...

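Spelling out the suggestion above as a complete snippet (the mount path comes from the thread; the sanity check at the end is illustrative):

```python
import pickle

# Use the local-file view of DBFS: prefix the mount path with /dbfs.
path = "/dbfs/mnt/blob/"
with open(path + "test.pkl", "rb") as f:
    bands = pickle.load(f)

print(type(bands))  # confirm the object deserialized
```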
vinitkhandelwal
by New Contributor III
  • 4234 Views
  • 2 replies
  • 0 kudos

Resolved! Using private package, getting ERROR: No matching distribution found for myprivatepackage

My project's setup.py file:
from setuptools import find_packages, setup

PACKAGE_REQUIREMENTS = ["pyyaml", "confluent-kafka", "fastavro", "python-dotenv", "boto3", "pyxlsb", "aiohttp", "myprivatepackage"]
LOCAL_REQUIREMENTS = ["delta-spark", "scikit-lea...

Get Started Discussions
dbx
package
private
python
Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, does this look like a dependency error? Are all the dependencies packed in the whl? Also, could you please confirm that all the limitations are satisfied? Refer: https://docs.databricks.com/en/compute/access-mode-limitations.html

1 More Replies
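When pip cannot resolve a private dependency of a wheel, one common fix is to tell it explicitly where that package lives. A sketch using notebook install syntax; the wheel path and index URL are placeholders, not the thread's actual values:

```python
%pip install /dbfs/FileStore/wheels/myproject-0.1.0-py3-none-any.whl --extra-index-url https://mypypi.example.com/simple
```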
ArvindDige
by New Contributor II
  • 934 Views
  • 2 replies
  • 0 kudos

Is DBFS going to be deprecated?

Is DBFS going to be deprecated? I am using the /dbfs/FileStore/tables/ location, where a jar file is stored, and I am copying this jar file to the /databricks/jars location. My concern is: as DBFS root and mounts are deprecated, does that mean that in coming days...

Latest Reply
ArvindDige
New Contributor II
  • 0 kudos

Hi Raphael, I am trying the below init script to achieve this task (PFA), and I am getting the error: Cluster scoped init script abfss://container@storage.dfs.core.windows.net/init_script.sh failed: Failure to initialize configuration for storage account stor...

1 More Replies
Prashanthkumar
by New Contributor III
  • 4933 Views
  • 7 replies
  • 0 kudos

Is it possible to view Databricks cluster metrics using REST API

I am looking for some help on getting Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system using the REST API. I am trying it in Postman using a Databricks token and with my Service Principal bear...

Latest Reply
javierbg
New Contributor III
  • 0 kudos

At my company we are also interested in this feature, is there an ETA?

6 More Replies
