Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Data_Engineer3
by Contributor III
  • 1894 Views
  • 5 replies
  • 0 kudos

Delete databricks community post

Hi All, if I make a mistake in a previous post in the Databricks community, how can I delete that post after it has received replies? Is it possible to delete an old post that someone else has already replied to? Thanks,

Latest Reply
Data_Engineer3
Contributor III
  • 0 kudos

I need to delete my old post, which contains details that should not be shared: https://community.databricks.com/t5/data-engineering/need-to-define-the-struct-and-array-of-struct-field-colum-in-the/m-p/58131#M31022

4 More Replies
jose_db_aws
by New Contributor III
  • 630 Views
  • 1 reply
  • 1 kudos

Resolved! Unable to perform CD using GitHub Actions for MLOps

Hi, I am new to Databricks and MLOps as well. I am trying out the Databricks Asset Bundles MLOps tutorial at https://docs.databricks.com/en/dev-tools/bundles/mlops-stacks.html. My cloud account is on AWS. I was asked to set up FS and Model Registry i...

Latest Reply
jose_db_aws
New Contributor III
  • 1 kudos

The problem was that the secrets in GitHub Actions were different from the personal access token I had set locally as my DATABRICKS_TOKEN environment variable. After I copied that value over to GitHub's STAGING_WORKSPACE_TOKEN secret, it is able to run the ...

phguk
by New Contributor III
  • 2410 Views
  • 4 replies
  • 0 kudos

Adding NFS storage as external volume (Unity)

Can anyone share experience (or point me to another reference) that describes how to configure Azure Blob storage with NFS enabled as an external volume in Databricks? I've succeeded in adding SMB storage to Databricks, but (if I understand prope...

Latest Reply
walter
New Contributor II
  • 0 kudos

Hi @phguk, could you share how you managed to create an external volume referencing an Azure file share? Are you using Unity Catalog for this? It was my understanding that this is not possible.

3 More Replies
Rajani
by Contributor II
  • 775 Views
  • 2 replies
  • 2 kudos

Resolved! How to pass a dynamic query to source server from databricks

I have a use case where I need to pass a dynamic query to get data from the source. I have tried the query option, but it gives the error SparkConnectGrpcException: (com.microsoft.sqlserver.jdbc.SQLServerException) Incorrect syntax near the keywor...

Latest Reply
Rajani
Contributor II
  • 2 kudos

Hi, thanks for your reply. I used a foreign catalog to fetch the required data from the information schema, then I build the dynamic query in Databricks and pass it in via the query option. This is working for me! @Kaniz_Fatma

1 More Replies
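A minimal sketch of the pattern this thread converges on, building the query string dynamically and handing it to the JDBC source through the `query` option, might look like the following. Table, column, and connection names are illustrative placeholders, not taken from the thread.

```python
# Sketch: build a dynamic query, then pass it via the JDBC `query` option.
# Table, column, and connection values are illustrative placeholders.

def build_query(table: str, watermark_column: str, watermark_value: str) -> str:
    """Build an incremental-extract query for SQL Server.

    Note: the JDBC `query` option expects a bare SELECT with no trailing
    semicolon, because Spark wraps it as a subquery. A stray semicolon or
    ORDER BY is one common cause of 'Incorrect syntax near the keyword'
    errors from SQL Server.
    """
    return f"SELECT * FROM {table} WHERE {watermark_column} > '{watermark_value}'"


def read_from_sqlserver(spark, jdbc_url: str, query: str):
    """Read the result of `query` over JDBC (requires a live cluster)."""
    return (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("query", query)
        .load()
    )
```

Note that the `query` and `dbtable` options are mutually exclusive in the Spark JDBC source; use one or the other.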
rameshkumar610
by New Contributor
  • 1123 Views
  • 5 replies
  • 0 kudos

S360 Eliminate SPN secrets - Connect Azure Databricks to ADLS Gen2, Gen1 via custom AD token

Hi Team, in Azure Databricks we currently use a service principal when creating mount points to Azure storage (ADLS Gen1, ADLS Gen2, and Azure Blob Storage). As part of the S360 action to eliminate SPN secrets, we were asked to move to SPN+certificate / MS...

Latest Reply
iakshaykr
New Contributor III
  • 0 kudos

@ramesitexp Yes, @szymon_dybczak is correct; for now the only valid options are: OAuth 2.0 with a Microsoft Entra ID service principal, shared access signatures (SAS), and account keys. For now we are using OAuth 2.0 with a Microsoft Entra ID service principal...

4 More Replies
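For the OAuth option discussed above, the per-account ABFS settings might be assembled as in this sketch. The storage account, client, tenant, and secret values are placeholders; in practice the client secret would come from a secret scope rather than plain text.

```python
# Sketch: Spark configuration for accessing ADLS Gen2 with OAuth 2.0 and a
# Microsoft Entra ID service principal (instead of mounts backed by SPN
# secrets). Account, client, and tenant values are placeholders.

def abfs_oauth_conf(storage_account: str, client_id: str,
                    client_secret: str, tenant_id: str) -> dict:
    """Return the per-account ABFS OAuth settings to apply via spark.conf.set."""
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# On a cluster you would apply these with:
#   for k, v in abfs_oauth_conf(...).items():
#       spark.conf.set(k, v)
```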
Spyro_3
by New Contributor
  • 344 Views
  • 1 reply
  • 0 kudos

Free trial account

Hi, can I create Unity Catalog using a free trial account?

Latest Reply
szymon_dybczak
Contributor III
  • 0 kudos

Hi @Spyro_3, yes, you should be able to create a Unity Catalog. The trial version allows you to create a premium workspace, which is required for Unity Catalog. Note that to set up the metastore you also need Global Administrator permission (if we are ...

yeungcase
by New Contributor III
  • 1442 Views
  • 7 replies
  • 3 kudos

Writing a single huge dataframe into Azure SQL Database using JDBC

Hi All, I am currently trying to read data from a materialized view as a single dataframe containing around 10M rows and then write it into an Azure SQL database. However, I don't see the Spark job moving at all even after an hour has passed. I have al...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 3 kudos

Hi @yeungcase , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedba...

6 More Replies
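A JDBC write that appears stalled for a 10M-row frame often comes down to a single partition pushing rows over one connection. A sketch of the usual tuning, parallel partitions plus a larger insert batch, is below; the URL, table name, and numbers are starting-point placeholders, not values from the thread.

```python
# Sketch: write a large DataFrame to Azure SQL over JDBC in parallel.
# Each partition becomes one JDBC connection, and `batchsize` controls
# how many rows go into each batched INSERT.

def jdbc_write_options(batch_size: int = 10_000) -> dict:
    """Standard Spark JDBC writer options (values are starting points)."""
    return {
        "batchsize": str(batch_size),  # rows per INSERT batch
    }


def write_to_azure_sql(df, jdbc_url: str, table: str, num_partitions: int = 8):
    """Requires a live cluster; repartition so writes happen in parallel."""
    (
        df.repartition(num_partitions)
        .write.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", table)
        .options(**jdbc_write_options())
        .mode("append")
        .save()
    )
```

Raising `num_partitions` too high can overwhelm the Azure SQL tier with concurrent connections, so it is worth tuning against the database's DTU/vCore limits.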
vinitkhandelwal
by New Contributor III
  • 922 Views
  • 3 replies
  • 1 kudos

Private Python Package in Serverless Job

I am trying to create a Databricks job using serverless compute, with a wheel file to run the Python job. The wheel file has a setup.py through which all dependencies are installed. One of the package dependencies is a private package hosted on Git...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @vinitkhandelwal, When using Serverless Compute in Databricks, you’re right that there’s no direct option to add init scripts. However, you can still achieve your goal of installing a private package hosted on Gitlab Python Package Registry.

2 More Replies
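One hedged approach for the private-package part is to point pip at the GitLab PyPI package registry with an extra index URL. The sketch below builds that URL; the project ID and token are placeholders, and on Databricks the token would normally come from a secret scope rather than being hard-coded.

```python
# Sketch: build the pip extra-index URL for a private package hosted in the
# GitLab PyPI package registry. Project ID and token are placeholders.

def gitlab_pypi_index_url(project_id: int, token: str) -> str:
    """GitLab's per-project PyPI 'simple' index, with token auth."""
    return (
        f"https://__token__:{token}@gitlab.com"
        f"/api/v4/projects/{project_id}/packages/pypi/simple"
    )

# The resulting URL can then be used as:
#   %pip install my-private-package --extra-index-url <url>
# or placed in the job environment's dependency configuration.
```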
himanmon
by New Contributor III
  • 954 Views
  • 3 replies
  • 2 kudos

Resolved! When does the cost of JOB COMPUTE start to be calculated?

I'm trying to run a workflow with job compute. Job compute needs to be pending for about 5 to 7 minutes before executing the workflow. I think it takes time to find a suitable instance in the cloud, configure the environment, install libraries, etc. An...

Latest Reply
jacovangelder
Honored Contributor
  • 2 kudos

There is a big discrepancy to take note of here: the difference between Databricks job costs (DBUs) and billing from the cloud provider (the actual VM doing the work). Billing for the Databricks job starts when the Spark context is being ini...

2 More Replies
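The distinction in the reply above can be made concrete with a little arithmetic: the cloud VM is billed from provisioning onward, while DBU billing only starts once the Spark context is up. All rates in this sketch are made-up examples, not actual Databricks or cloud pricing.

```python
# Illustrative cost split for a job-compute run: VM billing covers the
# pending (provisioning) time plus the run, DBU billing covers only the run.
# All rates are invented for illustration.

def job_run_cost(pending_min: float, run_min: float,
                 vm_dollars_per_hour: float, dbu_per_hour: float,
                 dollars_per_dbu: float) -> dict:
    vm_cost = (pending_min + run_min) / 60 * vm_dollars_per_hour
    dbu_cost = run_min / 60 * dbu_per_hour * dollars_per_dbu
    return {"vm": round(vm_cost, 4), "dbu": round(dbu_cost, 4)}

# Example: 6 min pending + 30 min run on a $0.50/h VM at 2 DBU/h and
# $0.15/DBU. The 6 pending minutes add VM cost but no DBU cost.
```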
SHS
by New Contributor
  • 750 Views
  • 2 replies
  • 1 kudos

Error with: %run ../Includes/Classroom-Setup-SQL

Hi Guys, I just started the ASP 2.1L Spark SQL lab and I get this error when I run the first setup SQL command: %run ../Includes/Classroom-Setup-SQL. The execution of this command did not finish successfully. Python interpreter will be restarted. Python inte...

Latest Reply
cibele
New Contributor II
  • 1 kudos

You can create a cluster that is compatible with the course notebooks; then it will work.

1 More Replies
Phani1
by Valued Contributor II
  • 1602 Views
  • 2 replies
  • 0 kudos

Azure Synapse vs Databricks

Hi team, could you kindly provide your perspective on the cost and performance comparison between Azure Synapse and Databricks SQL Warehouse/serverless, as well as their respective use cases? Thank you.

Latest Reply
Witold
Contributor III
  • 0 kudos

Agree with @mhiltner, it doesn't make sense to compare it with Synapse, as it's effectively dead. You most likely want to compare it to Fabric instead. Fabric is heavily under development, but IMHO it still lags behind other Data/AI solutions. No catal...

1 More Replies
OnerFusion-AI
by New Contributor
  • 33091 Views
  • 4 replies
  • 3 kudos

Resolved! How to import excel on databricks

To import an Excel file into Databricks, you can follow these general steps: 1. **Upload the Excel File**: Go to the Databricks workspace or cluster where you want to work. Navigate to the location where you want to upload the Excel file. Click on ...

Latest Reply
AhmedAlnaqa
New Contributor III
  • 3 kudos

The question here is how to read the multi-excel files based on path.The mentioned solution interacts with one file only, do we have the ability to read all the Excel files in the folder?

3 More Replies
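For the follow-up question about reading every Excel file in a folder, one pandas-based sketch is below. The folder path is a placeholder; the default reader is pandas' `read_excel`, which needs the `openpyxl` engine for `.xlsx` files.

```python
# Sketch: read every Excel file in a folder and combine the results into
# one pandas DataFrame. `reader` defaults to pandas.read_excel (requires
# openpyxl for .xlsx); the folder path is a placeholder.
from pathlib import Path

import pandas as pd


def read_excel_folder(folder: str, pattern: str = "*.xlsx",
                      reader=pd.read_excel) -> pd.DataFrame:
    """Apply `reader` to each matching file and concatenate the frames."""
    paths = sorted(Path(folder).glob(pattern))
    if not paths:
        raise FileNotFoundError(f"no files matching {pattern} in {folder}")
    return pd.concat((reader(p) for p in paths), ignore_index=True)

# On Databricks the folder would typically be a /dbfs or /Volumes path, and
# the combined frame can become a Spark DataFrame via spark.createDataFrame.
```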
LLLMMM
by New Contributor III
  • 773 Views
  • 3 replies
  • 2 kudos

Resolved! Try Databricks sign up failed

Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle. 

Latest Reply
lovinchanglei
New Contributor II
  • 2 kudos

Thank you so much for sharing, this is really helpful.

2 More Replies
YS1
by Contributor
  • 4927 Views
  • 6 replies
  • 0 kudos

Streaming xls files Using Auto Loader

Hello, is there a way to read .xls files using Auto Loader, or is there any workaround, since Excel files are not supported by Auto Loader per the following document? https://docs.databricks.com/en/ingestion/auto-loader/options.html Thanks.

Latest Reply
Sicnarf
New Contributor II
  • 0 kudos

I am facing the same issue--I have a stream that I'd like to use autoloader on with an .xlsx. Is there any update to any workarounds on this issue?

5 More Replies
divyasri1504
by New Contributor
  • 793 Views
  • 1 reply
  • 0 kudos

File Not Found Error while reading pickle file

Hello there, I have a pickle file uploaded to a mounted location in Databricks (/dbfs/mnt/blob/test.pkl). I am trying to read it with the Python snippet below: with open(path + "test.pkl", "rb") as f: bands = pickle.load(f). But it t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @divyasri1504 , Make sure you’re using the correct path to access the file. In Databricks, you should typically prefix everything with /dbfs (or dbfs:/ for native functions). Try using the full path like this: with open("/dbfs/mnt/blob/test.pkl...

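The advice above boils down to using the local `/dbfs/...` POSIX path with plain Python file I/O. A round-trip sketch follows; the mount path mentioned in the thread is only an example, and any path works for the functions themselves.

```python
# Sketch of the /dbfs advice: plain-Python open() needs the local POSIX
# path (e.g. "/dbfs/mnt/blob/test.pkl"), not the "dbfs:/..." URI that
# Spark-native functions accept. Write half included so it round-trips.
import pickle


def save_pickle(obj, path: str) -> None:
    with open(path, "wb") as f:
        pickle.dump(obj, f)


def load_pickle(path: str):
    with open(path, "rb") as f:
        return pickle.load(f)
```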
