Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

shrikant_kulkar
by New Contributor III
  • 4452 Views
  • 2 replies
  • 2 kudos

C# connector for Databricks Delta Sharing

Any plans for adding a C# connector? What are the alternatives in the current state?

Latest Reply
Shawn_Eary
Contributor
  • 2 kudos

I'm having problems getting the REST API calls for Delta Sharing to work. Python and Power BI work fine, but the C# code that Databricks AI generates does not. I keep getting an "ENDPOINT NOT FOUND" error even though config.share is fine. A C# con...

1 More Replies
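Until an official C# connector exists, the Delta Sharing REST protocol can be called directly from any language. As a hedged, minimal Python sketch of the client's first step: read a Delta Sharing profile file (a JSON file with `endpoint` and `bearerToken` keys) and build the list-shares request. The profile path and endpoint value below are placeholders, not real credentials:

```python
import json

def build_list_shares_request(profile_path):
    """Read a Delta Sharing profile and build the GET /shares request pieces.

    A profile is a small JSON file with "endpoint" and "bearerToken" keys.
    """
    with open(profile_path) as f:
        profile = json.load(f)
    # A trailing slash on the endpoint is a common cause of
    # endpoint-not-found style errors, so normalize it away.
    endpoint = profile["endpoint"].rstrip("/")
    url = f"{endpoint}/shares"
    headers = {"Authorization": f"Bearer {profile['bearerToken']}"}
    return url, headers
```

Translating the same two steps (normalize the endpoint, send the bearer token) into C#'s `HttpClient` is usually enough to reproduce what the Python connector does.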
Wijnand
by New Contributor II
  • 2279 Views
  • 1 replies
  • 0 kudos

Updates on a column in delta table with downstream autoloader

I've got the following questions:
1. Can I pause autoloader jobs, delete the cluster that was used to run these jobs, create a new cluster, and run the jobs with the newer-version cluster?
2. I have one autoloader job that ingests JSONs and transforms this to a del...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

Hello,
1. Yes, you can pause the job, delete the cluster, upgrade versions of the cluster, etc. With Auto Loader and Structured Streaming, the important thing is making sure that the checkpointLocation stays intact, so no deletions, modifications, or m...

Shree23
by New Contributor III
  • 2965 Views
  • 2 replies
  • 0 kudos

scalar function in databricks

Hi Expert,
Here is a SQL Server scalar function; how do I convert it to a Databricks function?
SQL:
CREATE function [dbo].[gettrans](@PickupCompany nvarchar(2), @SupplyCountry int, @TxnSource nvarchar(10), @locId nvarchar(50), @ExternalSiteId nvarchar(50)) RETURNS INT...

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @Shree23,
In Databricks, you can create scalar or tabular functions using SQL or Python. Here is the documentation. I converted your SQL Server function to Databricks standards: CREATE OR REPLACE FUNCTION gettrans( PickupCompany STRING, Sup...

1 More Replies
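For readers landing here, the general shape of such a conversion: T-SQL `@param nvarchar(n)` parameters map to `STRING`, and the procedural body becomes a single `RETURN` expression. The original function body is truncated above, so the body below is a hypothetical placeholder (table and column names are invented), showing only how the signature translates to Databricks SQL:

```sql
-- Sketch only: the real body of [dbo].[gettrans] is not shown above,
-- so the RETURN expression and table name here are placeholders.
CREATE OR REPLACE FUNCTION gettrans(
  PickupCompany  STRING,   -- was @PickupCompany nvarchar(2)
  SupplyCountry  INT,      -- was @SupplyCountry int
  TxnSource      STRING,   -- was @TxnSource nvarchar(10)
  locId          STRING,   -- was @locId nvarchar(50)
  ExternalSiteId STRING    -- was @ExternalSiteId nvarchar(50)
)
RETURNS INT
RETURN (
  SELECT COUNT(*)
  FROM some_catalog.some_schema.transactions  -- hypothetical table
  WHERE pickup_company = PickupCompany
    AND supply_country = SupplyCountry
);
```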
OlekNV
by New Contributor
  • 2759 Views
  • 2 replies
  • 0 kudos

Enable system schemas

Hello All,
I'm new to Databricks and have an issue enabling system schemas. When I run an API call to check system schema status in metastores, I see that all schemas are in the "Unavailable" state (except "information_schema", which is "ENABLE_COMPLETED"). Is ...

Latest Reply
vaishalisai
New Contributor II
  • 0 kudos

I am also facing the same issue.

1 More Replies
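For anyone hitting the same state: system schemas that report "Unavailable" generally need to be enabled one by one via the Unity Catalog REST API (a PUT per schema). A minimal sketch that only builds the request, under the assumption that the endpoint follows the documented `metastores/{id}/systemschemas/{schema}` shape; host, metastore id, and schema name are placeholders, and the caller still needs a bearer token:

```python
def enable_system_schema_request(host, metastore_id, schema_name):
    """Build the (method, url) pair for enabling one system schema.

    Endpoint shape per the Unity Catalog REST API; the values passed
    in are placeholders supplied by the caller.
    """
    host = host.rstrip("/")
    url = (f"{host}/api/2.0/unity-catalog/metastores/"
           f"{metastore_id}/systemschemas/{schema_name}")
    return "PUT", url
```

A loop over the schema names returned by the corresponding GET call would then enable each one in turn.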
Ruby8376
by Valued Contributor
  • 7577 Views
  • 8 replies
  • 2 kudos

Expose delta table data to Salesforce - odata?

Hi, looking for suggestions to stream on-demand data from Databricks Delta tables to Salesforce. Is OData a good option?

Latest Reply
fegvilela
New Contributor II
  • 2 kudos

Hey, I think this might help: https://www.salesforce.com/uk/news/press-releases/2024/04/25/zero-copy-partner-network/

7 More Replies
Nandhini_Kumar
by New Contributor III
  • 6415 Views
  • 7 replies
  • 0 kudos

How to get databricks performance metrics programmatically?

How can I retrieve all Databricks performance metrics on an hourly basis? Is there a recommended method or API available for retrieving performance metrics?

Latest Reply
holly
Databricks Employee
  • 0 kudos

The Spark logs are available through cluster logging. This is enabled at the cluster level, where you choose the destination for the logs. Just a heads up - interpreting them at scale is not trivial. I'd recommend having a read through the overwatch...

6 More Replies
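Cluster logs aside, polling cluster metadata on a schedule usually goes through the REST API. A small sketch that only builds the request, as a starting point for an hourly collector; the host and token are placeholders supplied by the caller, and nothing is sent over the network here:

```python
from urllib.request import Request

def clusters_list_request(host, token):
    """Build (not send) a request against the Clusters API list endpoint,
    a common starting point for polling cluster state programmatically."""
    url = f"{host.rstrip('/')}/api/2.1/clusters/list"
    # Databricks REST APIs authenticate with a bearer token header.
    return Request(url, headers={"Authorization": f"Bearer {token}"})
```

Passing the returned `Request` to `urllib.request.urlopen` (against a real workspace, with a real token) yields the JSON cluster list to record each hour.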
Chris_Konsur
by New Contributor III
  • 3967 Views
  • 4 replies
  • 1 kudos

Autoloader in file notification mode to get files from S3 on AWS - Error

I configured an autoloader in file notification mode to get files from S3 on AWS:
spark.readStream \
  .format("cloudFiles") \
  .option("cloudFiles.format", "json") \
  .option("cloudFiles.inferColumnTypes", "true") \
  .option("cloudFiles.schemaLocation", "dbfs:/au...

Latest Reply
Selz
New Contributor II
  • 1 kudos

In case anyone else stumbles across this, I was able to fix my issue by setting up an instance profile with the file notification permissions and attaching the instance profile to the job cluster. It wasn't clear from the documentation that the file ...

3 More Replies
Ludo
by New Contributor III
  • 7377 Views
  • 4 replies
  • 3 kudos

[DeltaTable] Usage with Unity Catalog (ParseException)

Hi,
I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm having the following exception when trying to use DeltaTable.forName(string name...

Latest Reply
Ludo
New Contributor III
  • 3 kudos

Thank you for the quick feedback @saipujari_spark. Indeed, it's working great within a notebook with Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...

3 More Replies
BSalla
by New Contributor II
  • 1928 Views
  • 3 replies
  • 0 kudos

Supporting Material for Self-Paced Data Analysis with Databricks Course

Hi All,
Newbie here. Any idea where I can find the supporting materials that the instructor is using in the online "Data Analysis with Databricks" course? They seem to include the scripts to create schemas, tables, etc.
Thanks in advance

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @BSalla, here is our suggested learning path. I hope it helps you!

2 More Replies
RoseCliver1
by New Contributor II
  • 957 Views
  • 2 replies
  • 1 kudos

Finding Materials for Databricks Course

Hi All,Where can I find the supporting materials used by the instructor in the online "Data Analysis with Databricks" course?  It appears to include scripts for creating schemas, tables, and other database structures.Thanks in advance.

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @RoseCliver1 and @BSalla! To access the course supporting materials, please raise a support ticket here.

1 More Replies
khaansaab
by New Contributor II
  • 1576 Views
  • 1 replies
  • 1 kudos

Resolved! read file from local machine(my computer) and create a dataframe

I want to create a notebook and add a widget which will allow the user to select a file from their local machine (my computer), read the contents of the file, and create a dataframe. Is it possible? And how? In dbutils.widgets I don't have any options for ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @khaansaab,
Currently, there is no out-of-the-box mechanism that will allow you to do that. As a workaround, you can create a UC volume and tell your users to upload files into that volume. Then you can create a notebook that will have a file_name parame...

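To illustrate the final step of that workaround: once a user has uploaded a file to a UC volume, the notebook takes the file name as a parameter and reads it. A plain-Python sketch of that read step (the volume path is hypothetical; in a notebook the name would come from dbutils.widgets.get, and you would typically hand the rows to Spark or pandas afterwards):

```python
import csv

def load_rows(file_path):
    """Read a CSV file into a list of dicts - a stand-in for building
    a dataframe from a user-supplied file in a UC volume."""
    with open(file_path, newline="") as f:
        return list(csv.DictReader(f))
```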
mahfooz_iiitian
by New Contributor III
  • 1062 Views
  • 1 replies
  • 0 kudos

Merging customer and company account into single account

I have two accounts: one is my company account and the other is my personal account in Databricks Community. I want to merge them into a single one. Kindly let me know how to do it.

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @mahfooz_iiitian! Please send an email to community@databricks.com with both of your email addresses, specifying which account you’d like to retain. The IT team will assist you with merging the accounts.

Dharshan777
by New Contributor
  • 778 Views
  • 1 replies
  • 0 kudos

Databricks Data engineer Exam got suspended while still 8 minutes left.

Hi @Cert-Team,
I hope this message finds you well. Request ID: #00556592. I am writing to seek clarification regarding my recent exam, which was suspended due to a reflection issue caused by my spectacles. During the exam, the proctor paused it and aske...

Latest Reply
Cert-TeamOPS
Databricks Employee
  • 0 kudos

Hello @Dharshan777, we are sorry to hear that your exam was suspended. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours for a resolution. In the meantime, you can review the following documentation: Beh...

JohnJustus
by New Contributor III
  • 14254 Views
  • 3 replies
  • 0 kudos

Space in Column names when writing to Hive

All,
I have the following code:
df_Warehouse_Utilization = (
    spark.table("hive_metastore.dev_ork.bin_item_detail")
    .join(df_DIM_Bins, col('bin_tag') == df_DIM_Bins.BinKey, 'right')
    .groupby(col('BinKey'))
    .agg(count_distinct(when(col('serial_lo...

Latest Reply
KandyKad
New Contributor III
  • 0 kudos

Hi,
I have faced this issue a few times. When overwriting dataframes to the Hive catalog in Databricks, it doesn't naturally allow column names to have spaces or special characters. However, you can add an option statement to bypass that ru...

2 More Replies
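Alongside the option-based bypass mentioned in the reply, another common workaround is to rename columns before writing so they contain no spaces or special characters. A plain-Python sketch of such a sanitizer (the character rules here are an assumption for illustration, not Hive's exact specification):

```python
import re

def sanitize_column(name):
    """Replace characters commonly rejected in Hive column names
    with underscores, then collapse runs of underscores."""
    cleaned = re.sub(r"[^0-9A-Za-z_]", "_", name.strip())
    return re.sub(r"_+", "_", cleaned)
```

In PySpark this would typically be applied via `df.toDF(*[sanitize_column(c) for c in df.columns])` before the write.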
