Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

alvaro_databric
by New Contributor III
  • 4256 Views
  • 0 replies
  • 0 kudos

Azure Databricks Spot Cost

Hi all, I started using Azure Spot VMs by switching on the spot option when creating a cluster. However, in the Azure billing dashboard, after some months of using spot instances, I only see the OnDemand PurchaseType. Can anyone guess what could be happ...

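One possible explanation (a sketch, not a confirmed diagnosis): spot billing depends on the cluster's azure_attributes, where first_on_demand nodes are always billed as on-demand and SPOT_WITH_FALLBACK_AZURE silently falls back to on-demand VMs when spot capacity runs out. A minimal sketch of those Clusters API fields, with placeholder names and sizes:

# Sketch: Azure availability settings on a cluster spec (Clusters API).
cluster_spec = {
    "cluster_name": "spot-example",                  # placeholder name
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "Standard_D4s_v3",
    "num_workers": 2,
    "azure_attributes": {
        "first_on_demand": 1,                        # these nodes are always billed as on-demand
        "availability": "SPOT_WITH_FALLBACK_AZURE",  # falls back to on-demand when no spot capacity
        "spot_bid_max_price": -1,                    # -1 = bid up to the on-demand price
    },
}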
THIAM_HUATTAN
by Valued Contributor
  • 53407 Views
  • 8 replies
  • 2 kudos

Skip number of rows when reading CSV files

staticDataFrame = spark.read.format("csv")\
    .option("header", "true").option("inferSchema", "true")\
    .load("/FileStore/tables/Consumption_2019/*.csv")
With the above, I need an option to skip, say, the first 4 lines in each CSV file. How do I do that?

Latest Reply
Michael_Appiah
Contributor II
  • 2 kudos

The option .option("skipRows", <number of rows to skip>) works for me as well. However, I am surprised that the official Spark documentation does not list it as a CSV data source option: https://spark.apache.org/docs/latest/sql-data-sources-csv.html#data...

7 More Replies
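For reference, a minimal PySpark sketch of the skipRows option discussed in this thread; the path and row count come from the question, and skipRows appears to be a Databricks CSV reader option rather than one documented for open-source Spark:

staticDataFrame = (
    spark.read.format("csv")
    .option("skipRows", 4)           # drop the first 4 lines of each file
    .option("header", "true")        # the next line is treated as the header
    .option("inferSchema", "true")
    .load("/FileStore/tables/Consumption_2019/*.csv")
)
staticDataFrame.printSchema()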
rsamant07
by New Contributor III
  • 1555 Views
  • 0 replies
  • 0 kudos

TLS Mutual Authentication for Databricks API

Hi, we are exploring the use of the Databricks Statement Execution API for sharing data through an API with different consumer applications. However, we have a security requirement to configure TLS mutual authentication to limit the consumer application t...

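As far as I know, Databricks REST endpoints such as the Statement Execution API authenticate callers with tokens, and mutual TLS is usually enforced by an API gateway or reverse proxy placed in front of them rather than by Databricks itself. A hedged client-side sketch under that assumption (the gateway hostname, certificate paths and token are placeholders):

import requests

# Hypothetical gateway that terminates mutual TLS and forwards requests
# to the Databricks Statement Execution API.
GATEWAY_URL = "https://api-gateway.example.com/api/2.0/sql/statements"

resp = requests.post(
    GATEWAY_URL,
    headers={"Authorization": "Bearer <databricks-token>"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "SELECT 1",
        "wait_timeout": "30s",
    },
    # Client certificate and key presented during the TLS handshake (mTLS).
    cert=("/path/to/client-cert.pem", "/path/to/client-key.pem"),
    verify="/path/to/gateway-ca.pem",
)
resp.raise_for_status()
print(resp.json())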
IvanK
by New Contributor III
  • 4370 Views
  • 1 replies
  • 0 kudos

Register permanent UDF from Python file

Hello, I am trying to create a permanent UDF from a Python file with dependencies that are not part of the standard Python library. How do I make use of CREATE FUNCTION (External) [1] to create a permanent function in Databricks, using a Python file th...

Data Engineering
Create function
python
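For context, Unity Catalog functions created with CREATE FUNCTION ... LANGUAGE PYTHON run in a restricted environment, so, as far as I know, they generally cannot import packages outside the standard library. A common workaround (session-scoped, so not a true permanent function) is to ship the code as a wheel installed on the cluster and register the UDF from it; a sketch with hypothetical module and function names:

from pyspark.sql.types import StringType
# "mylib.cleaning" and "normalize_name" are hypothetical; the module would be
# installed on the cluster (for example as a wheel) along with its dependencies.
from mylib.cleaning import normalize_name

# Register a session-scoped UDF so it is callable from SQL in this session only.
spark.udf.register("normalize_name", normalize_name, StringType())
spark.sql("SELECT normalize_name('  Ada LOVELACE ') AS cleaned").show()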
nikhilkumawat
by New Contributor III
  • 11085 Views
  • 3 replies
  • 1 kudos

Install maven package on job cluster

I have a single user cluster and I have created a workflow which will read an Excel file from an Azure storage account. For reading the Excel file I am using the com.crealytics:spark-excel_2.13:3.4.1_0.19.0 library on a single user all-purpose cluster. I have alread...

Latest Reply
nikhilkumawat
New Contributor III
  • 1 kudos

Hi @Retired_mod, can you elaborate on a few more things: 1. When spark-shell installs any Maven package, what is the default location where it downloads the JAR file? 2. As far as I know, the default location for JARs is "/databricks/jars/", from where Spark pic...

2 More Replies
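For the original question, a job task can declare the Maven coordinate in its libraries list so the job cluster installs it at start-up; a minimal sketch of a Jobs API 2.1 task (notebook path and cluster sizing are placeholders, and the Scala suffix in the coordinate must match the cluster's Scala version, 2.12 on most current runtimes):

# Sketch of a job task that installs the spark-excel package on its job cluster.
job_task = {
    "task_key": "read_excel",
    "notebook_task": {"notebook_path": "/Workspace/Users/me/read_excel"},  # placeholder
    "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "Standard_D4s_v3",
        "num_workers": 1,
    },
    "libraries": [
        {"maven": {"coordinates": "com.crealytics:spark-excel_2.12:3.4.1_0.19.0"}}
    ],
}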
merca
by Valued Contributor II
  • 12277 Views
  • 7 replies
  • 7 kudos

How can I give users permission to see object metadata without access to the data

The only permission I can see is SELECT, and that gives access to the data, which is very much unwanted. I only want users to see the metadata, such as table/view/column names, descriptions/comments, location and so on, but not to see any data.

Latest Reply
merca
Valued Contributor II
  • 7 kudos

@Uma Maheswara Rao Desula, @Geeta Sai Boddu and @S S, thank you for the responses. I have gotten an answer from Databricks, and it seems this is not possible; it is something that is being investigated as a capability.

6 More Replies
silvadev
by New Contributor III
  • 9998 Views
  • 1 replies
  • 0 kudos

Resolved! MongoDB Spark Connector v10.x read error on Databricks 13.x

I am facing an error when trying to read data from any MongoDB collection using the MongoDB Spark Connector v10.x on Databricks 13.x. The error below appears to start at line #113 of the MongoDB Spark Connector library (v10.2.0): java.lang.NoSuchMethod...

Data Engineering
mongodb
spark
Latest Reply
silvadev
New Contributor III
  • 0 kudos

The problem was fixed in Databricks Runtime 13.3 LTS.

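For completeness, a minimal read sketch for the v10.x connector (connection URI, database and collection names are placeholders), assuming the connector library is attached to a DBR 13.3 LTS cluster:

df = (
    spark.read.format("mongodb")
    .option("connection.uri", "mongodb+srv://user:password@cluster0.example.net")  # placeholder
    .option("database", "sales")        # placeholder
    .option("collection", "orders")     # placeholder
    .load()
)
df.printSchema()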
jonathan-dufaul
by Valued Contributor
  • 14662 Views
  • 2 replies
  • 0 kudos

Resolved! Error updating workflow, webhook not found?

I have no idea what this error means. When I try to save a workflow I get a popup saying this:

(screenshot of the error popup)
Latest Reply
Robin_LOCHE
New Contributor II
  • 0 kudos

I had the same issue, thanks for the info! Apparently it's also possible to fix it by removing all of the notifications in the interface (the bugged one is not displayed, but if you remove everything, for some reason it removes the bugged one too)....

1 More Replies
DJey
by New Contributor III
  • 22379 Views
  • 7 replies
  • 0 kudos

Connect to Azure SQL Database from Databricks using a service principal

Hi all, can someone please help me with the Python code to connect an Azure SQL Database to Databricks using a service principal instead of directly passing a username and password? I'm using the code above but getting the error above; refer to Screenshot 2. Please hel...

Attachments: DJey_0-1688048752356.png, DJey_1-1688048784120.png
Data Engineering
Azure Databricks
Azure SQL Database
Databricks
Latest Reply
Joe_Suarez
New Contributor III
  • 0 kudos

First, you need to create a service principal in Azure and grant it the necessary permissions to access your Azure SQL Database (for example, for CRM data enrichment). You can do this using the Azure CLI or the Azure Portal. Ensure that your Databricks cluster ha...

6 More Replies
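As a concrete illustration of that reply, here is a hedged sketch that acquires an Azure AD token for the service principal with msal and passes it to the SQL Server JDBC driver through the accessToken option; tenant, app, server, database, table and secret-scope names are placeholders, and msal must be installed on the cluster:

import msal

tenant_id = "<tenant-id>"
client_id = "<service-principal-app-id>"
client_secret = dbutils.secrets.get("my-scope", "sp-secret")  # hypothetical secret scope/key

app = msal.ConfidentialClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential=client_secret,
)
token = app.acquire_token_for_client(scopes=["https://database.windows.net/.default"])

jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"  # placeholder

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")                         # placeholder
    .option("accessToken", token["access_token"])              # token auth, no username/password
    .option("encrypt", "true")
    .option("hostNameInCertificate", "*.database.windows.net")
    .load()
)
display(df)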
JessGa
by New Contributor II
  • 4368 Views
  • 3 replies
  • 0 kudos

Requesting an exam reattempt

Hi DB Support, I took the Databricks Certified Associate Engineering exam today but missed it by just over one percent: I got 68.88% and the pass mark is 70%. I am planning to reattempt this exam in the coming days and was hoping you could help. Could you kindly give me anot...

Latest Reply
Littlereb5
New Contributor II
  • 0 kudos

I recommend reaching out to Databricks directly or checking their official certification website for information on retake policies, voucher availability, and any discounts or promotions they may offer for reattempts.

2 More Replies
MauiWarrior
by New Contributor
  • 5691 Views
  • 0 replies
  • 0 kudos

Installing fpp3 R package on Databricks

In an R notebook I am running: install.packages('fpp3', dependencies = TRUE) and getting back errors: ERROR: dependency ‘vctrs’ is not available for package ‘slider’. I then install 'vctrs' and it again generates a similar error that some package is...

data_turtle
by New Contributor
  • 1699 Views
  • 0 replies
  • 0 kudos

How do I get AWS costs from my SQL Warehouses?

Hi, how do I find the associated AWS costs from my Databricks SQL warehouse usage? I tried using tags but they didn't show up in the AWS Cost Explorer. My use case is that I am running some dbt jobs on Databricks and I want to find the cost for certain jobs....

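One hedged option, if system tables are enabled in the workspace, is to slice DBU usage by warehouse and custom tag with system.billing.usage; note this reports DBUs on the Databricks bill, not the underlying AWS EC2 cost, which only shows up in Cost Explorer once instance tags propagate. A sketch (the tag key "job_name" is hypothetical, and column names follow the system-tables schema as I understand it):

# Summarize SQL warehouse DBU usage by date, warehouse and a custom tag.
usage = spark.sql("""
    SELECT
        usage_date,
        usage_metadata.warehouse_id AS warehouse_id,
        custom_tags['job_name']     AS job_name,      -- hypothetical tag key
        SUM(usage_quantity)         AS dbus
    FROM system.billing.usage
    WHERE sku_name LIKE '%SQL%'
    GROUP BY usage_date, usage_metadata.warehouse_id, custom_tags['job_name']
    ORDER BY usage_date
""")
display(usage)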
Sabtout
by New Contributor II
  • 2640 Views
  • 1 replies
  • 0 kudos

Using Python UDF in Delta live table

Hello, I tried running a Python UDF in a Delta Live Tables workflow in Advanced mode, but it did not run and gave the "Python UDF is not supported in your environment" error. Can I get a clear picture of whether Python external UDFs are supported or not?

Latest Reply
Sabtout
New Contributor II
  • 0 kudos

Hi @Retired_mod, I ran this SQL query in my catalog (I'm using Unity Catalog):
CREATE OR REPLACE FUNCTION cat_projint_dev.silver.GetEditor(prompt STRING)
RETURNS STRING
LANGUAGE PYTHON
AS $$
print(prompt)
$$
Then I ran a Delta Live Table workflow using Uni...

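Whether Python UDFs run inside a Delta Live Tables pipeline depends on the pipeline's channel and runtime (the error in the original post suggests that environment did not support them). When they are supported, the usual pattern is a session-scoped UDF defined in the pipeline notebook rather than a Unity Catalog SQL function; a sketch with a hypothetical source table:

import dlt
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

# Hypothetical UDF; replace the body with your own logic.
@F.udf(returnType=StringType())
def get_editor(prompt):
    return None if prompt is None else prompt.strip().upper()

@dlt.table(name="editors")
def editors():
    # "cat_projint_dev.silver.prompts" is a placeholder source table.
    return (
        spark.read.table("cat_projint_dev.silver.prompts")
        .withColumn("editor", get_editor(F.col("prompt")))
    )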
