Data Engineering

Forum Posts

BradSheridan
by Valued Contributor
  • 1334 Views
  • 1 replies
  • 0 kudos

Using a UDF in a window function

I have created a UDF using `%sql CREATE OR REPLACE FUNCTION f_timestamp_max()...` and I've confirmed it works with `%sql SELECT f_timestamp_max()`. But when I try to use it in a window function (lead over partition), I get: AnalysisException: Using SQL functi...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, as of now Spark SQL supports three kinds of window functions: ranking functions, analytic functions, and aggregate functions. Please refer to: https://docs.databricks.com/sql/language-manual/sql-ref-window-functions.html#parameters

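For reference, `lead` over a partition returns the next row's value within that partition. A minimal pure-Python sketch of that behavior (hypothetical data; this illustrates the semantics, not Spark's implementation):

```python
from itertools import groupby
from operator import itemgetter

def lead(rows, key, value, default=None):
    """Emulate SQL LEAD(value) OVER (PARTITION BY key), preserving input order."""
    out = []
    for _, group in groupby(sorted(rows, key=itemgetter(key)), key=itemgetter(key)):
        group = list(group)
        for i, row in enumerate(group):
            # The "lead" of a row is the next row's value in the same partition.
            nxt = group[i + 1][value] if i + 1 < len(group) else default
            out.append({**row, "lead_" + value: nxt})
    return out

rows = [
    {"dept": "a", "ts": 1},
    {"dept": "a", "ts": 2},
    {"dept": "b", "ts": 5},
]
result = lead(rows, "dept", "ts")
```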
Haima
by New Contributor
  • 320 Views
  • 0 replies
  • 0 kudos

FileNotFoundError: [Errno 2] /dbfs/fileone.csv

I'm trying to transfer my CSV file from Databricks to SFTP but I'm getting a file-not-found error. Here is my code: file_size = sftp.stat("/dbfs/fileone.csv").st_size; with open("/dbfs/fileone.csv", "rb") as fl: return self.putfo(fl, Destinationpath, file_s...

User16869510359
by Esteemed Contributor
  • 4254 Views
  • 3 replies
  • 0 kudos

Resolved! How many notebooks/jobs can I run in parallel on a Databricks cluster?

Is there a limit on it and is the limit configurable?

Latest Reply
User16869510359
Esteemed Contributor
  • 0 kudos

There is a hard limit of 145 active execution contexts on a Cluster. This is to ensure the cluster is not overloaded with too many parallel threads starving for resources. The limit is not configurable. If there are more than 145 parallel jobs to be ...

2 More Replies
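A common pattern for staying under that limit is to fan notebook runs out through a bounded thread pool. The sketch below uses a stand-in run function (on Databricks this would be `dbutils.notebook.run`; the paths and pool size are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def run_notebook(path):
    # Stand-in for dbutils.notebook.run(path, timeout_seconds=...) on Databricks.
    return f"ran {path}"

paths = [f"/jobs/task_{i}" for i in range(10)]

# Bound concurrency well below the 145-execution-context ceiling.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_notebook, paths))
```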
data_serf
by New Contributor
  • 1906 Views
  • 3 replies
  • 1 kudos

Resolved! How to integrate java 11 code in Databricks

Hi all, we're trying to attach Java libraries which are compiled/packaged using Java 11. After doing some research it looks like even the most recent runtimes use Java 8, which can't run the Java 11 code ("wrong version 55.0, should be 52.0" errors). Is t...

Latest Reply
matthewrj
New Contributor II
  • 1 kudos

I have tried setting JNAME=zulu11-ca-amd64 under Cluster > Advanced options > Spark > Environment variables but it doesn't seem to work. I still get errors indicating Java 8 is the JRE, and in the Spark UI under "Environment" I still see: Java Home: /u...

2 More Replies
齐木木
by New Contributor III
  • 714 Views
  • 1 replies
  • 3 kudos

Resolved! The case class reports an error when running in the notebook

As shown in the figure, the case class and the JSON string are converted through fasterxml.jackson, but an unexpected error occurred while running the code. I think this problem may be related to how the notebook loads classes. Because...

Latest Reply
齐木木
New Contributor III
  • 3 kudos

Code:
var str = "{\"app_type\":\"installed-app\"}"
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)
...

Deepak_Kandpal
by New Contributor III
  • 2293 Views
  • 3 replies
  • 2 kudos

Resolved! Enable credential passthrough Option is not available in new UI for Job Cluster

Hi all, I am trying to add a new workflow which requires credential passthrough, but when I try to create a new job cluster from Workflow -> Jobs -> My Job, the Enable credential passthrough option is not available. Is there any other way t...

Latest Reply
Rostislaw
New Contributor III
  • 2 kudos

Assuming your Excel file is located on ADLS, you can add a service principal to the cluster configuration. See: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-storage#--access-azure-data-lake-storage-gen2-or-blob-stora...

2 More Replies
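The linked service-principal approach boils down to Spark conf entries along the following lines (a sketch of the documented OAuth client-credentials setup; placeholders in angle brackets are to be filled in, and the secret should come from a secret scope rather than plain text):

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<service-credential-key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```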
vamsi0132
by New Contributor II
  • 564 Views
  • 0 replies
  • 1 kudos

BUG in TIME ZONE EST function

Hi, I found a bug in the "from_utc_timestamp" function while converting a UTC timestamp to an EST timestamp. Below is the query: select trim(current_timestamp()) as Current, trim(from_utc_timestamp(current_timestamp(),'EST')) as EST, trim(from...

Raj4
by New Contributor III
  • 1626 Views
  • 7 replies
  • 0 kudos

www.databricks.com

Hi team, I attended the virtual instructor-led training on 23-08-2022 (https://www.databricks.com/p/webinar/databricks-lakehouse-fundamentals-learning-plan). As per the steps mentioned, I have completed all of the steps for getting the voucher, but ...

Latest Reply
amit
New Contributor II
  • 0 kudos

Thanks @Nadia Elsayed for the quick response. I have booked my exam with the supplied coupon without any issue. Thanks, Amit

6 More Replies
sp334
by New Contributor II
  • 6016 Views
  • 5 replies
  • 3 kudos

Resolved! NPIP tunnel setup failed [WaitForNgrokTunnel]

Hello all, we have deployed a new Databricks instance in the Azure cloud. 1) Databricks service attached to a public subnet/private subnet (delegated to Microsoft.Databricks/workspaces). 2) I created a job with cluster runtime (1 worker: Standard_DS3_v2, 7.3 LTS...

Latest Reply
fabienv
New Contributor II
  • 3 kudos

In case others run into this in the future, here is something additional to check: is your account/workspace enabled for the "compliance security profile"? If yes, you should see a little shield icon in the lower left-hand corner of the workspace. Once...

4 More Replies
tompile
by New Contributor III
  • 2190 Views
  • 7 replies
  • 10 kudos

Resolved! Is it possible to make use of pygit2 or GitPython packages to reference git repositories from within databricks?

I am making use of repos in Databricks and am trying to reference the current git branch from within the notebook session. For example: from pygit2 import Repository; repo = Repository('/Workspace/Repos/user@domain/repository'). The code above throws an er...

Latest Reply
niburg123
New Contributor III
  • 10 kudos

You cannot use this as far as I know, but you can put a workaround in a notebook if you are calling code from your repo via a notebook:
repo_path = "/Repos/xyz_repo_path/xyz_repo_name"
repo_path_fs = "/Workspace" + repo_path
repo_branch = "main"
def chec...

6 More Replies
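The truncated workaround above presumably resolves the branch name itself; one way to do that, if the repo's metadata is readable, is to parse the `.git/HEAD` file. A minimal sketch of that idea (the path is hypothetical, and whether `.git` is exposed under `/Workspace` depends on the platform):

```python
def current_branch(repo_path_fs):
    """Read the checked-out branch from .git/HEAD, e.g. 'ref: refs/heads/main'."""
    with open(f"{repo_path_fs}/.git/HEAD") as f:
        head = f.read().strip()
    if head.startswith("ref:"):
        return head.split("refs/heads/", 1)[1]
    return head  # detached HEAD: a bare commit hash
```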
swetha
by New Contributor III
  • 1374 Views
  • 2 replies
  • 1 kudos

Unable to attach a streaming listener to a Spark streaming job: "no streaming listener attached to the spark application" error when accessing the streaming statistics API

Issue: after adding the listener jar file in the cluster init script, the listener is working (from what I see in the stdout/log4j logs). But when I try to hit ('Content-Type: application/json') http://host:port/api/v1/applications/app-id/streaming/st...

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @swetha kadiyala, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

1 More Replies
Sadiq
by New Contributor III
  • 1474 Views
  • 6 replies
  • 4 kudos

Fixed length file from Databricks notebook ( Spark SQL)

Hi, I need help writing data from an Azure Databricks notebook into a fixed-length .txt file. The notebook has 10 lakh (1 million) rows and 86 columns. Can anyone suggest an approach?

Latest Reply
Vidula
Honored Contributor
  • 4 kudos

Hi @sadiq vali, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

5 More Replies
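For the fixed-length question above, the core idea is padding (or truncating) each column to a declared width. A plain-Python sketch of that formatting step (column widths are hypothetical; for a million rows you would apply the same per-row formatting inside Spark, e.g. per partition, rather than collecting to the driver):

```python
def to_fixed_width(row, widths):
    """Pad or truncate each field to its fixed width and concatenate."""
    return "".join(str(v)[:w].ljust(w) for v, w in zip(row, widths))

widths = [10, 5, 8]  # hypothetical widths for three of the 86 columns
line = to_fixed_width(["alice", 42, "NY"], widths)
```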