- 1610 Views
- 2 replies
- 0 kudos
scalar function in databricks
Hi Expert, here is a SQL Server scalar function. How can I convert it to a Databricks function?
CREATE FUNCTION [dbo].[gettrans](@PickupCompany nvarchar(2), @SupplyCountry int, @TxnSource nvarchar(10), @locId nvarchar(50), @ExternalSiteId nvarchar(50))
RETURNS INT...
- 0 kudos
Hello @Shree23, in Databricks you can create scalar or tabular functions using SQL or Python. Here is the documentation. I converted your SQL Server function to Databricks standards: CREATE OR REPLACE FUNCTION gettrans( PickupCompany STRING, Sup...
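For reference, here is a minimal sketch of what the converted function can look like as a Databricks SQL scalar UDF, created here from PySpark. The function body (a count against a hypothetical transactions table) and all table and column names are assumptions, since the original RETURNS INT logic is truncated in the thread:

```python
# Sketch only: a Databricks SQL scalar UDF mirroring the truncated SQL Server
# signature. The table and predicates below are hypothetical; replace them
# with the real gettrans logic.
spark.sql("""
    CREATE OR REPLACE FUNCTION gettrans(
        PickupCompany  STRING,
        SupplyCountry  INT,
        TxnSource      STRING,
        locId          STRING,
        ExternalSiteId STRING
    )
    RETURNS INT
    RETURN CAST((
        SELECT COUNT(*)
        FROM transactions t                    -- hypothetical source table
        WHERE t.pickup_company    = PickupCompany
          AND t.supply_country    = SupplyCountry
          AND t.txn_source        = TxnSource
          AND t.loc_id            = locId
          AND t.external_site_id  = ExternalSiteId
    ) AS INT)
""")

# Once created, the function can be called like any built-in scalar function:
spark.sql("SELECT gettrans('AB', 1, 'WEB', 'LOC1', 'EXT1') AS txn_count").show()
```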
- 1742 Views
- 2 replies
- 0 kudos
Enable system schemas
Hello all, I'm new to Databricks and have an issue enabling system schemas. When I run the API call to check the system schema status in the metastore, I see that all schemas are in the "Unavailable" state (except "information_schema", which is "ENABLE_COMPLETED"). Is ...
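For anyone hitting the same question, here is a minimal sketch of listing and enabling system schemas through the Unity Catalog REST API. The host, token, and metastore ID are placeholders, and the caller is assumed to be a metastore admin; schemas reported as "Unavailable" generally cannot be enabled until the corresponding system table is available for the workspace's region and tier:

```python
# Sketch only: check and enable Unity Catalog system schemas via the REST API.
import requests

HOST = "https://<workspace-host>"        # placeholder: your workspace URL
TOKEN = "<personal-access-token>"        # placeholder: PAT of a metastore admin
METASTORE_ID = "<metastore-id>"          # placeholder

headers = {"Authorization": f"Bearer {TOKEN}"}

# List system schemas and their states (ENABLE_COMPLETED, AVAILABLE, UNAVAILABLE, ...)
resp = requests.get(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas",
    headers=headers,
)
print(resp.json())

# Enable a specific schema, e.g. "access"
resp = requests.put(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas/access",
    headers=headers,
)
print(resp.status_code, resp.text)
```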
- 4434 Views
- 4 replies
- 1 kudos
Databricks Job Failure + ServiceNow Integration
Hi Team, could you please suggest how to raise a ServiceNow ticket in case of a Databricks job failure? Regards, Phanindra
- 1 kudos
Hi, can this JSON response sent to ServiceNow be edited before being sent? What are the different ways it can be edited?
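On editing the payload: since the call to ServiceNow is just an HTTP request, the JSON body can be assembled and modified freely before it is sent. Below is a minimal sketch of a failure-handler notebook task posting an incident to the ServiceNow Table API; the instance name, credentials, and field values are all placeholders:

```python
# Sketch only: create a ServiceNow incident from a task that runs when a
# Databricks job fails (for example, a task whose "Run if" condition is
# "at least one dependency failed"). Every value below is a placeholder.
import json
import requests

SNOW_INSTANCE = "https://<your-instance>.service-now.com"   # placeholder
SNOW_USER = "<integration-user>"                            # placeholder
SNOW_PASSWORD = "<password-or-token>"                       # placeholder

# Build the payload yourself, so any field (short_description, urgency,
# assignment_group, ...) can be edited before it is sent.
payload = {
    "short_description": "Databricks job failed",
    "description": json.dumps({
        "job_name": "my_nightly_job",     # could come from job/task parameters
        "run_page_url": "<run URL>",
        "error": "<error summary>",
    }),
    "urgency": "2",
}

resp = requests.post(
    f"{SNOW_INSTANCE}/api/now/table/incident",
    auth=(SNOW_USER, SNOW_PASSWORD),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    data=json.dumps(payload),
)
resp.raise_for_status()
print("Created incident:", resp.json()["result"]["number"])
```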
- 3956 Views
- 8 replies
- 2 kudos
Expose Delta table data to Salesforce - OData?
Hi, looking for suggestions to stream on-demand data from Databricks Delta tables to Salesforce. Is OData a good option?
- 2 kudos
Hey, I think this might help: https://www.salesforce.com/uk/news/press-releases/2024/04/25/zero-copy-partner-network/
- 3318 Views
- 7 replies
- 0 kudos
How to get Databricks performance metrics programmatically?
How can I retrieve all Databricks performance metrics on an hourly basis? Is there a recommended method or API available for retrieving performance metrics?
- 0 kudos
The Spark logs are available through cluster logging. This is enabled at the cluster level, where you choose the destination for the logs. Just a heads up: interpreting them at scale is not trivial. I'd recommend having a read through the Overwatch...
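As an illustration, here is a minimal sketch of enabling cluster log delivery when creating a cluster through the Clusters REST API. The host, token, runtime version, node type, and DBFS destination are placeholders:

```python
# Sketch only: create a cluster with log delivery enabled, so Spark driver and
# executor logs are shipped to a chosen DBFS path for downstream analysis.
import requests

HOST = "https://<workspace-host>"        # placeholder
TOKEN = "<personal-access-token>"        # placeholder

cluster_spec = {
    "cluster_name": "metrics-demo",
    "spark_version": "15.4.x-scala2.12",   # placeholder: pick a current runtime
    "node_type_id": "i3.xlarge",           # placeholder: AWS node type
    "num_workers": 2,
    # Deliver cluster logs to DBFS; logs are synced periodically.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
print(resp.json())
```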
- 2371 Views
- 4 replies
- 1 kudos
Auto Loader in file notification mode to get files from S3 on AWS - Error
I configured Auto Loader in file notification mode to get files from S3 on AWS.
spark.readStream \
    .format("cloudFiles") \
    .option("cloudFiles.format", "json") \
    .option("cloudFiles.inferColumnTypes", "true") \
    .option("cloudFiles.schemaLocation", "dbfs:/au...
- 1 kudos
In case anyone else stumbles across this, I was able to fix my issue by setting up an instance profile with the file notification permissions and attaching the instance profile to the job cluster. It wasn't clear from the documentation that the file ...
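To make that concrete, here is a minimal sketch of the stream with file notification mode enabled. The paths and target table are placeholders, and the job cluster is assumed to run with an instance profile that carries the SQS/SNS permissions needed for file notifications:

```python
# Sketch only: Auto Loader in file notification mode reading JSON from S3.
df = (
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.inferColumnTypes", "true")
        .option("cloudFiles.schemaLocation", "dbfs:/autoloader/schema")   # placeholder
        .option("cloudFiles.useNotifications", "true")                    # file notification mode
        .load("s3://my-bucket/landing/")                                  # placeholder
)

(
    df.writeStream
        .option("checkpointLocation", "dbfs:/autoloader/checkpoint")      # placeholder
        .trigger(availableNow=True)
        .toTable("main.default.raw_events")                               # placeholder target table
)
```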
- 4293 Views
- 4 replies
- 3 kudos
[DeltaTable] Usage with Unity Catalog (ParseException)
Hi, I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm having the following exception when trying to use DeltaTable.forName(string name...
- 3 kudos
Thank you for the quick feedback, @saipujari_spark. Indeed, it's working great within a notebook with Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...
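For reference, a minimal sketch of the three-level usage that works on a Unity Catalog enabled runtime; the catalog, schema, table, and update logic are placeholders:

```python
# Sketch only: reference a Unity Catalog table by catalog.schema.table with
# the DeltaTable API, then run a simple conditional update.
from delta.tables import DeltaTable

dt = DeltaTable.forName(spark, "main.default.my_table")   # placeholder three-level name

dt.update(
    condition="status = 'stale'",
    set={"status": "'refreshed'"},
)
```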
- 543 Views
- 1 replies
- 0 kudos
Merging customer and company accounts into a single account
I have two accounts: one is my company account and the other is my personal account in the Databricks Community. I want to merge them into a single account. Kindly let me know how to do it.
- 0 kudos
Hello @mahfooz_iiitian! Please send an email to community@databricks.com with both of your email addresses, specifying which account you’d like to retain. The IT team will assist you with merging the accounts.
- 686 Views
- 2 replies
- 0 kudos
Is anyone using Databricks for Advanced HR Analytics?
We are starting out with Databricks and I would like to use the tools to build out Advanced Analytics for HR.
- 269 Views
- 1 replies
- 0 kudos
Databricks Data Engineer exam got suspended with 8 minutes still left.
Hi @Cert-Team, I hope this message finds you well. Request ID: #00556592. I am writing to seek clarification regarding my recent exam, which was suspended due to a reflection issue caused by my spectacles. During the exam, the proctor paused it and aske...
- 0 kudos
Hello @Dharshan777, we are sorry to hear that your exam was suspended. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours for a resolution. In the meantime, you can review the following documentation: Beh...
- 9794 Views
- 11 replies
- 1 kudos
Cannot create an account to try Community Edition
Hi, whenever I try to sign up for an account, I keep getting the message "an error has occurred. please try again later" when I click on the "get started with databricks community edition" button. Could you please let me know why this could...
- 1 kudos
I got the same problem when I tried to register or log in through the Community Edition link. But when I clicked the "Try Databricks" button in the top right corner of the https://www.databricks.com/ home page, I was able to register and log in successfully j...
- 349 Views
- 2 replies
- 0 kudos
spark_partition_id() - User does not have permission SELECT on anonymous function
I'm trying to verify the partitions assigned to rows. I'm running something like this:
from pyspark.sql.functions import spark_partition_id
df = spark.read.table("some.uc.table").limit(10)
df = df.repartition(2)
df = df.withColumn("partitionid", spar...
- 0 kudos
Hello @jes, I have validated your failure internally and found that there is already an internal request to address this behavior. Are you using a shared access mode cluster? This behavior does not appear to occur when using single access mode...
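For completeness, here is a minimal end-to-end version of the snippet above; the table name is a placeholder, and on shared access mode clusters the withColumn call may still hit the permission error until the fix lands:

```python
# Sketch only: tag each row with the partition it lands in after a repartition.
from pyspark.sql.functions import spark_partition_id

df = spark.read.table("main.default.some_table").limit(10)   # placeholder table
df = df.repartition(2)
df = df.withColumn("partition_id", spark_partition_id())
df.groupBy("partition_id").count().show()
```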
- 473 Views
- 4 replies
- 1 kudos
Connection type 'SALESFORCE' is not enabled. Please enable the connection to use it.
I'm trying to connect to Salesforce in Databricks. I'm following this: https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud#sql-1 and when I run the "Create Catalog..." statement I see this error. How would I enable Salesforc...
- 1 kudos
The reason you're getting this error is that the workspace is not enabled for the LakeFlow Connect preview. Could you please file a ticket with us, as we might require additional details. Please refer to: https://docs.databricks.com/en/resources/s...
- 596 Views
- 1 replies
- 0 kudos
Databricks User Group
Are there any Databricks User Group Meetups in UK?
- 0 kudos
You can find some of the groups in EMEA here: https://community.databricks.com/t5/europe-middle-east-and-africa/ct-p/EMEA
- 488 Views
- 1 replies
- 0 kudos
Resolved! Using Autoloader with merge
Hi everyone, I have been trying to use Auto Loader with foreachBatch so that I can use MERGE INTO in Databricks, but I have been getting the error below. Error: "Found error inside foreachBatch Python process". My code: from delta.tables import ...
- 0 kudos
It seems the columns of your join condition are not found. Are they in the dataframes/table? Also try to put the whole join condition in a single string: "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId"
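Putting the pieces together, here is a minimal sketch of Auto Loader feeding a MERGE through foreachBatch. The paths and target table are placeholders, and the JeHeaderId/JeLineId join columns are taken from the reply above as stand-ins for the poster's actual schema:

```python
# Sketch only: upsert each Auto Loader micro-batch into a Delta table.
from delta.tables import DeltaTable

def upsert_batch(batch_df, batch_id):
    target = DeltaTable.forName(spark, "main.default.je_lines")   # placeholder target
    (
        target.alias("t")
        .merge(
            batch_df.alias("s"),
            "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId",
        )
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

(
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "dbfs:/autoloader/merge/schema")   # placeholder
        .load("s3://my-bucket/je-lines/")                                       # placeholder
        .writeStream
        .foreachBatch(upsert_batch)
        .option("checkpointLocation", "dbfs:/autoloader/merge/checkpoint")      # placeholder
        .trigger(availableNow=True)
        .start()
)
```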
Connect with Databricks Users in Your Area
Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.
If there isn’t a group near you, start one and help create a community that brings people together.
Request a New Group
Labels: AI Summit (4), Azure (2), Azure databricks (2), Bi (1), Certification (1), Certification Voucher (2), Community (7), Community Edition (3), Community Members (1), Community Social (1), Contest (1), Data + AI Summit (1), Data Engineering (1), Databricks Certification (1), Databricks Cluster (1), Databricks Community (8), Databricks community edition (3), Databricks Community Rewards Store (3), Databricks Lakehouse Platform (5), Databricks notebook (1), Databricks Office Hours (1), Databricks Runtime (1), Databricks SQL (4), Databricks-connect (1), DBFS (1), Dear Community (1), Delta (9), Delta Live Tables (1), Documentation (1), Exam (1), Featured Member Interview (1), HIPAA (1), Integration (1), LLM (1), Machine Learning (1), Notebook (1), Onboarding Trainings (1), Python (2), Rest API (10), Rewards Store (2), Serverless (1), Social Group (1), Spark (1), SQL (8), Summit22 (1), Summit23 (5), Training (1), Unity Catalog (3), Version (1), VOUCHER (1), WAVICLE (1), Weekly Release Notes (2), weeklyreleasenotesrecap (2), Workspace (1)