Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Sachinbt
by New Contributor II
  • 1187 Views
  • 2 replies
  • 2 kudos

Databricks Certification Exam Got Suspended. Need help resolving the issue

Hi Team, my Databricks exam got suspended on 16th April this morning and it is still in the suspended state. I have raised a support request using the link below: https://help.databricks.com/s/contact-us?ReqType=training, but I haven't received the ti...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Sachin Kumara, we are going through a contract renewal with our vendor, Accredible. Once our new contract goes through, you will get your badge this week. Thank you for understanding!

1 More Replies
tytytyc26
by New Contributor II
  • 1792 Views
  • 3 replies
  • 0 kudos

Resolved! Problem with accessing element using Pandas UDF in Image Processing

Hi everyone, I have been stuck on this for a very long time. I am not very familiar with using Spark for image processing. I was trying to resize images that are loaded into a Spark DataFrame. However, it keeps throwing an error that I am not able to access the element...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Yan Chong Tan: The error you are facing is because you are trying to access the attribute "width" of a string object in the resize_image function. Specifically, input_dim is a string, but you are trying to access its width attr...
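As a minimal sketch of the fix described above (passing numeric dimensions instead of a string to the resize step), here is an illustrative pandas UDF that resizes binary image content; the column name, target size, and use of Pillow are assumptions, not code from the thread:

```python
# Sketch only: assumes images were loaded with the binaryFile reader into a column
# named "content", and that Pillow is installed on the cluster.
import io

import pandas as pd
from PIL import Image
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import BinaryType

TARGET_SIZE = (224, 224)  # pass width/height as integers, not as a string

@pandas_udf(BinaryType())
def resize_image(content: pd.Series) -> pd.Series:
    def _resize(raw: bytes) -> bytes:
        img = Image.open(io.BytesIO(raw))
        img = img.resize(TARGET_SIZE)      # Image.resize expects an (int, int) tuple
        buf = io.BytesIO()
        img.save(buf, format="PNG")
        return buf.getvalue()
    return content.apply(_resize)

# df = spark.read.format("binaryFile").load("/path/to/images")   # hypothetical path
# resized = df.withColumn("resized", resize_image("content"))
```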

2 More Replies
andrew0117
by Contributor
  • 3525 Views
  • 4 replies
  • 0 kudos

Resolved! partition on a csv file

When I use SQL code like "create table myTable (column1 string, column2 string) using csv options('delimiter' = ',', 'header' = 'true') location 'pathToCsv'" to create a table from a single CSV file stored in a folder within an Azure Data Lake contai...

Latest Reply
pvignesh92
Honored Contributor
  • 0 kudos

Hi @andrew li, when you specify a path with the LOCATION keyword, Spark will consider that to be an EXTERNAL table. So when you drop the table, your underlying data, if any, will not be cleared. So in your case, as this is an external table, your folder s...
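To make the external-table behaviour concrete, a small PySpark sketch that restates the SQL from the question; the ADLS path is a placeholder, and the point is that DROP TABLE only removes metadata when LOCATION was specified:

```python
# Hypothetical path; because LOCATION is given, Spark registers the table as EXTERNAL.
path = "abfss://container@account.dfs.core.windows.net/data/myCsvFolder"

spark.sql(f"""
    CREATE TABLE myTable (column1 STRING, column2 STRING)
    USING CSV
    OPTIONS ('delimiter' = ',', 'header' = 'true')
    LOCATION '{path}'
""")

# Dropping an external table removes only the metastore entry;
# the CSV files under `path` are left untouched and can be re-registered later.
spark.sql("DROP TABLE myTable")
```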

3 More Replies
oleole
by Contributor
  • 3074 Views
  • 3 replies
  • 3 kudos

Resolved! How to delay a new job run after a job failure

I have a daily job run that occasionally fails with the error: "The spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached." After I get the notification that this scheduled job failed, I manually run ...

Latest Reply
oleole
Contributor
  • 3 kudos

According to this documentation, you can specify the wait time between the "start" of the first run and the retry start time.
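For illustration, a hedged sketch of the task-level retry settings that control this wait, applied through the Jobs 2.1 API; the workspace URL, token, job id, task key, and interval below are placeholders:

```python
# Sketch only: max_retries and min_retry_interval_millis are task-level settings in the
# Jobs 2.1 API; min_retry_interval_millis is measured from the start of the failed run.
import requests

HOST = "https://<your-workspace>.azuredatabricks.net"   # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

payload = {
    "job_id": 123,                                       # hypothetical job id
    "new_settings": {
        "tasks": [
            {
                "task_key": "daily_task",                # hypothetical task key
                "max_retries": 1,
                "min_retry_interval_millis": 15 * 60 * 1000,  # wait ~15 minutes before retrying
                "retry_on_timeout": False,
            }
        ]
    },
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```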

2 More Replies
rshark
by New Contributor II
  • 6407 Views
  • 3 replies
  • 0 kudos

Error when calling SparkR from within a Python notebook

I’ve had success with R magic (R cells in a Python notebook) and running an R script from a Python notebook, up to the point of connecting R to a Spark cluster. In either case, I can’t get a `SparkSession` to initialize. 2-cell (Python) notebook exa...

Latest Reply
Dooley
Valued Contributor
  • 0 kudos

The answer I can give you to make this work is to call the R notebooks from your Python notebook, and save each DataFrame as a Delta table to pass data between the languages. How to call a notebook from another notebook? Here is a link.
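A minimal sketch of that handoff, assuming a hypothetical R notebook path and Delta table name; dbutils.notebook.run triggers the R notebook, and the result comes back through a Delta table rather than through in-memory objects:

```python
# Python notebook side (sketch). The R notebook at this hypothetical path is expected
# to write its result with something like: saveAsTable(result_df, "shared_result")
run_result = dbutils.notebook.run("/Shared/r_transform", timeout_seconds=3600)

# Read back the Delta table the R notebook produced; the table name is illustrative.
df = spark.read.table("shared_result")
display(df)
```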

2 More Replies
Josh_Stafford
by New Contributor II
  • 1437 Views
  • 2 replies
  • 1 kudos

Using dbutils.fs.ls on URI with square brackets results in error

Square brackets in ADLS are accepted, so why can't I list the files in the folder? I have tried escaping the square brackets manually, but then the escaped values are re-escaped from %5B to %255B and %5D to %255D. I get: URISyntaxException: Illegal ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Joshua Stafford: The URISyntaxException you are encountering is likely because square brackets are reserved characters in URIs (Uniform Resource Identifiers) and need to be properly encoded when used in a URL. In this case, it ap...
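To illustrate the double-escaping seen in the error, a small standalone sketch (the path is hypothetical): encoding "[" once yields %5B, and encoding the already-escaped string again turns the "%" itself into %25, producing %255B:

```python
from urllib.parse import quote

raw = "abfss://container@account.dfs.core.windows.net/folder/[2023-04]/"  # hypothetical path

once = quote(raw, safe=":/@")    # brackets become %5B / %5D
twice = quote(once, safe=":/@")  # the % in %5B is re-encoded, giving %255B / %255D

print(once)
print(twice)
```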

1 More Replies
THeodor
by New Contributor II
  • 4206 Views
  • 11 replies
  • 4 kudos

Certificate and Badge not received

I cleared my certification exam, Databricks Certified Data Engineer Associate, on 07 April 2023, but I haven't received my certificate or badge yet. I sent an email to Databricks training and they told me that this problem has been solved... Any hel...

Latest Reply
Nadia1
Honored Contributor
  • 4 kudos

Hello all, we are going through a contract renewal with our vendor, Accredible. Once our new contract goes through, you will get your badge this week. Thank you for understanding.

10 More Replies
akashsharma7119
by Contributor
  • 20850 Views
  • 13 replies
  • 8 kudos

Resolved! Not able to generate Access Token for Service Principal using REST API

I am trying to generate a Databricks token for a service principal (SP). I have created the SP in Azure AD and have used the Databricks REST API to add it as an admin. When using the Databricks REST API "/api/2.0/token-management/on-behalf-of/tokens" ...

Latest Reply
callumwhite
New Contributor III
  • 8 kudos

Hi all, I believe I found a temporary fix for this: generate an AAD token for the service principal in Azure. Follow this guide if you don't know how: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/aad/service-prin-aad-toke...
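A hedged sketch of that first step, the client-credentials request for an AAD token; the tenant, client id, and secret are placeholders, and 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known application ID of the Azure Databricks resource:

```python
import requests

tenant_id = "<tenant-id>"            # placeholder
client_id = "<sp-application-id>"    # placeholder
client_secret = "<sp-client-secret>" # placeholder

resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
aad_token = resp.json()["access_token"]

# The AAD token can then be sent as a Bearer token to the Databricks REST API,
# for example the token-management endpoint mentioned in the question.
```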

12 More Replies
Ligaya
by New Contributor II
  • 23553 Views
  • 3 replies
  • 2 kudos

ValueError: not enough values to unpack (expected 2, got 1)

Code: Writer.jdbc_writer("Economy", economy, conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy']). The problem arises when I try to run the code in the specified Databricks notebook: an error of "ValueError: not enough values to unpack (expected 2, ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Jillinie Park: The error message you are seeing ("ValueError: not enough values to unpack (expected 2, got 1)") occurs when the iterable you are unpacking yields fewer values than the number of variables on the left-hand side. In your case, the error is happening on this line of code: schem...
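A tiny standalone illustration of the error (deliberately unrelated to the Writer.jdbc_writer code in the question): the right-hand side yields one value while two variables are waiting on the left:

```python
pair = "economy".split(",")          # ["economy"], only one element

try:
    schema, table = pair             # ValueError: not enough values to unpack (expected 2, got 1)
except ValueError as err:
    print(err)

# One defensive pattern: supply a default before unpacking.
schema, table = pair if len(pair) == 2 else ("dbo", pair[0])
```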

2 More Replies
killjoy
by New Contributor III
  • 4405 Views
  • 2 replies
  • 0 kudos

Unexpected failure while fetching notebook - What can we do from our side?

Hello! We have some pipelines running in Azure Data Factory that call Databricks notebooks to run data transformations. This morning at 6:21 AM (UTC) we got the error "Unexpected failure while fetching notebook" inside a notebook that calls another one...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Rita Fernandes: Based on the error message you provided, it seems the issue might be related to a version mismatch between the ANTLR tool used for code generation and the current runtime version. Additionally, the error message suggests that...

1 More Replies
d_meaker
by New Contributor II
  • 1457 Views
  • 3 replies
  • 0 kudos

map_keys() returns an empty array in Delta Live Table pipeline.

We are exploding a map type column into multiple columns based on the keys of the map column. Part of this process is to extract the keys of a map type column called json_map as illustrated in the snippet below. The code executes as expected when run...

Latest Reply
d_meaker
New Contributor II
  • 0 kudos

Hi @Suteja Kanuri, thank you for your response and explanation. The code I have shown above is not the exact snippet we are using. Please find the exact snippet below. We are dynamically extracting the keys of the map and then using getItem() to mak...
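A minimal sketch of that pattern outside DLT, with illustrative data: collect the distinct keys of the json_map column (named as in the post), then build one column per key with getItem():

```python
from pyspark.sql import functions as F

df = spark.createDataFrame(
    [(1, {"a": "x", "b": "y"}), (2, {"a": "z"})],
    "id INT, json_map MAP<STRING,STRING>",
)

# Gather every key that appears anywhere in the map column.
keys = [r["k"] for r in
        df.select(F.explode(F.map_keys("json_map")).alias("k")).distinct().collect()]

# One new column per key; rows missing a key get null.
wide = df.select("id", *[F.col("json_map").getItem(k).alias(k) for k in keys])
wide.show()
```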

2 More Replies
Neerajkirola
by New Contributor
  • 866 Views
  • 0 replies
  • 0 kudos

Types of RAM: An In-Depth Overview

Random Access Memory (RAM) is an essential component of any computer system, responsible for temporarily storing data that the CPU (Central Processing Unit) needs to access quickly. It allows for faster data retrieva...

burhanudinera20
by New Contributor II
  • 8076 Views
  • 3 replies
  • 0 kudos

Cannot import name 'Test' from partially initialized module 'databricks_test_helper'

I installed with the command 'pip install databricks_test_helper', but I get ImportError messages when I try running this code on Databricks: from databricks_test_helper import *; expected = set([(s, 'double') for s in ('AP', 'AT', 'PE'...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Burhanudin Badiuzaman: The error message suggests that there may be a circular import happening within the databricks_test_helper module, which is preventing the Test class from being properly imported. One possible solution is to import the Test cl...
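A brief sketch of that suggestion: import the name directly instead of using a wildcard import. Whether this sidesteps the partial-initialization error depends on how databricks_test_helper is structured, so treat it as an assumption to verify:

```python
# Import only the class that is actually needed, rather than `from ... import *`.
from databricks_test_helper import Test

expected = set([(s, 'double') for s in ('AP', 'AT', 'PE')])  # shortened from the question
# Assertions using the Test helper would follow here (its exact API is not shown in the thread).
```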

2 More Replies
rsamant07
by New Contributor III
  • 3963 Views
  • 11 replies
  • 2 kudos

Resolved! DBT Job Type Authenticating to Azure Devops for git_source

We are trying to execute Databricks jobs with the dbt task type, but they are failing to authenticate to Git. The problem is that the job is created using a service principal, but the service principal doesn't seem to have access to the repo. A few questions we have: 1) can we giv...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest p...

10 More Replies
sensanjoy
by Contributor
  • 3650 Views
  • 7 replies
  • 3 kudos

Authenticate Databricks REST API and access delta tables from external web service.

Hi all, we have a requirement to access Delta tables from an external web service (web UI). Presently we have tested it through a JDBC connection, authenticated using a PAT, e.g. jdbc:spark://[DATABRICKS_HOST]:443/default;transportMode=http;ssl=1;httpPath...
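Alongside the JDBC approach described above, a hedged sketch of the same PAT-authenticated access from Python using the databricks-sql-connector package (pip install databricks-sql-connector); the hostname, HTTP path, token, and table name are placeholders:

```python
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",          # placeholder
    access_token="<personal-access-token>",                  # placeholder
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT * FROM my_catalog.my_schema.my_delta_table LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```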

Latest Reply
sensanjoy
Contributor
  • 3 kudos

Hi @Suteja Kanuri, could you please help me with the above queries?

6 More Replies