Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

THeodor
by New Contributor II
  • 4254 Views
  • 11 replies
  • 4 kudos

Certificate and Badge not received

I cleared my certification exam, Databricks Certified Data Engineer Associate, on 07 April 2023, but I haven't received the certificate or badge yet. I sent an email to Databricks training and they told me that this problem has been solved...Any hel...

Latest Reply
Nadia1
Honored Contributor
  • 4 kudos

Hello all, We are going through a contract renewal with our vendor, Accredible. Once our new contract goes through, you will get your badge this week. Thank you for understanding.

10 More Replies
akashsharma7119
by Contributor
  • 21761 Views
  • 13 replies
  • 8 kudos

Resolved! Not able to generate Access Token for Service Principal using rest API

I am trying to generate a Databricks token for a service principal (SP). I have created the SP in Azure AD and have used the Databricks REST API to add it as an admin. When using the Databricks REST API "/api/2.0/token-management/on-behalf-of/tokens" ...

Latest Reply
callumwhite
New Contributor III
  • 8 kudos

Hi all, I believe I found a temporary fix for this: generate an AAD token for the service principal in Azure. Follow this guide if you don't know how: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/aad/service-prin-aad-toke...

12 More Replies
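The workaround above can be sketched end-to-end. A minimal sketch, not an official recipe: it assumes an Azure AD application (service principal) with a client secret, uses the well-known Azure Databricks resource application ID for the token scope, and the tenant/client values are placeholders.

```python
import json
import urllib.parse
import urllib.request

# Well-known application ID of the Azure Databricks resource in Azure AD.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def aad_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the client-credentials token request for the service principal."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    }
    return url, form

def get_aad_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """POST the form and return a bearer token usable against Databricks REST APIs."""
    url, form = aad_token_request(tenant_id, client_id, client_secret)
    body = urllib.parse.urlencode(form).encode()
    with urllib.request.urlopen(url, body) as resp:
        return json.load(resp)["access_token"]
```

The returned token is then sent as `Authorization: Bearer <token>` to the workspace REST API, e.g. the token-management endpoint mentioned in the question.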
Ligaya
by New Contributor II
  • 23872 Views
  • 3 replies
  • 2 kudos

ValueError: not enough values to unpack (expected 2, got 1)

Code: Writer.jdbc_writer("Economy", economy, conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy']) The problem arises when I try to run the code in the specified Databricks notebook: an error of "ValueError: not enough values to unpack (expected 2, ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Jillinie Park: The error message you are seeing ("ValueError: not enough values to unpack (expected 2, got 1)") occurs when the iterable on the right-hand side yields fewer values than the number of variables you are unpacking into. In your case, the error is happening on this line of code: schem...

2 More Replies
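The failure mode in this thread can be reproduced in isolation. A minimal sketch (the `name:type` string is hypothetical, since the original snippet is truncated):

```python
line = "price"  # no ":" separator present

# Unpacking expects two values, but split() returns only one:
try:
    name, dtype = line.split(":")
except ValueError as exc:
    print(exc)  # not enough values to unpack (expected 2, got 1)

# str.partition always returns exactly three parts, so it cannot raise here:
name, _, dtype = line.partition(":")
print(name, repr(dtype))  # dtype is '' when the separator is missing
```

Guarding the unpack (or using `partition`) is usually enough; the real fix is making sure every input row actually contains the separator the code assumes.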
killjoy
by New Contributor III
  • 4440 Views
  • 2 replies
  • 0 kudos

Unexpected failure while fetching notebook - What can we do from our side?

Hello! We have some pipelines running in Azure Data Factory that call Databricks notebooks to run data transformations. This morning at 6:21 AM (UTC) we got an error "Unexpected failure while fetching notebook" inside a notebook that calls another one...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Rita Fernandes: Based on the error message you provided, it seems the issue might be related to a version mismatch between the ANTLR tool used for code generation and the current runtime version. Additionally, the error message suggests that...

1 More Replies
d_meaker
by New Contributor II
  • 1467 Views
  • 3 replies
  • 0 kudos

map_keys() returns an empty array in Delta Live Table pipeline.

We are exploding a map type column into multiple columns based on the keys of the map column. Part of this process is to extract the keys of a map type column called json_map as illustrated in the snippet below. The code executes as expected when run...

Latest Reply
d_meaker
New Contributor II
  • 0 kudos

Hi @Suteja Kanuri, thank you for your response and explanation. The code I have shown above is not the exact snippet we are using. Please find the exact snippet below. We are dynamically extracting the keys of the map and then using getItem() to mak...

2 More Replies
Neerajkirola
by New Contributor
  • 877 Views
  • 0 replies
  • 0 kudos

Types of RAM: An In-Depth Overview

Random Access Memory (RAM) is an essential component of any computer system, responsible for temporarily storing data that the CPU (Central Processing Unit) needs to access quickly. It allows for faster data retrieva...

burhanudinera20
by New Contributor II
  • 8179 Views
  • 3 replies
  • 0 kudos

Cannot import name 'Test' from partially initialized module 'databricks_test_helper'

I ran the install with the command pip install databricks_test_helper, and next I get ImportError messages when I try running this code on Databricks: from databricks_test_helper import * expected = set([(s, 'double') for s in ('AP', 'AT', 'PE'...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Burhanudin Badiuzaman: The error message suggests that there may be a circular import happening within the databricks_test_helper module, which is preventing the Test class from being properly imported. One possible solution is to import the Test cl...

2 More Replies
rsamant07
by New Contributor III
  • 4001 Views
  • 11 replies
  • 2 kudos

Resolved! DBT Job Type Authenticating to Azure Devops for git_source

We are trying to execute Databricks jobs with the dbt task type, but they fail to authenticate to Git. The problem is that the job is created using a service principal, and the service principal doesn't seem to have access to the repo. A few questions we have: 1) can we giv...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may not be available for every issue. I suggest p...

10 More Replies
sensanjoy
by Contributor
  • 3693 Views
  • 7 replies
  • 3 kudos

Authenticate Databricks REST API and access delta tables from external web service.

Hi all, we have a requirement to access Delta tables from an external web service (web UI). Presently we have tested it through a JDBC connection, authenticated using a PAT, e.g. jdbc:spark://[DATABRICKS_HOST]:443/default;transportMode=http;ssl=1;httpPath...

Latest Reply
sensanjoy
Contributor
  • 3 kudos

Hi @Suteja Kanuri, could you please help me with the above queries?

6 More Replies
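For this kind of external access, an alternative to raw JDBC is the databricks-sql-connector Python package. A hedged sketch, not a recommendation from the thread: the hostname, HTTP path, and token below are placeholders, and the package must be installed separately (`pip install databricks-sql-connector`).

```python
def sql_endpoint_args(host: str, http_path: str, token: str) -> dict:
    """Keyword arguments for databricks.sql.connect()."""
    return {
        "server_hostname": host,
        "http_path": http_path,
        "access_token": token,
    }

def fetch_rows(host: str, http_path: str, token: str, query: str):
    """Run a query against a Databricks SQL warehouse and return all rows."""
    from databricks import sql  # provided by the databricks-sql-connector package

    with sql.connect(**sql_endpoint_args(host, http_path, token)) as conn:
        with conn.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()
```

The PAT goes in `access_token`; for service-principal access, an AAD token for the Databricks resource can be passed in the same field.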
Anonymous
by Not applicable
  • 6513 Views
  • 0 replies
  • 0 kudos

As companies grow and evolve, a Chief Technology Officer (CTO) becomes crucial in shaping the organization's technical direction and driving innovation.

As companies grow and evolve, a Chief Technology Officer (CTO) becomes crucial in shaping the organization's technical direction and driving innovation. When filling this critical leadership position, companies decide either to promote an existi...

Pien
by New Contributor II
  • 8634 Views
  • 5 replies
  • 0 kudos

Resolved! Getting date out of year and week

Hi all, I'm trying to get a date out of the columns year and week, but the week format is not recognized. df_loaded = df_loaded.withColumn("week_year", F.concat(F.lit("3"), F.col('Week'), F.col('Jaar'))) df_loaded = df_loaded.withColumn("date", F.to_date(F...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Pien Derkx, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...

4 More Replies
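As background for this thread: Spark 3's default datetime parser rejects week-based patterns (Y, w), which is one common reason a year+week string "is not recognized". Python's standard library can do the conversion directly; a small sketch, assuming ISO week numbering:

```python
from datetime import date

# Monday (day 1) of ISO week 14 in ISO year 2023:
d = date.fromisocalendar(2023, 14, 1)
print(d)  # 2023-04-03
```

Wrapped in a PySpark UDF, the same call converts year/week columns to a proper date column; if the data uses a non-ISO week convention, the mapping needs adjusting.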
QuicKick
by New Contributor
  • 3005 Views
  • 2 replies
  • 0 kudos

How do I search for all the columns/field names starting with "XYZ"

I would like to do a big search on all field/column names that contain "XYZ". I tried the SQL below but it's giving me an error. SELECT table_name, column_name FROM information_schema.columns WHERE column_name LIKE '%<account>%' ORDER BY table_name, column_na...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Ian Fox, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...

1 More Replies
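The query in the question looks right in intent; in the pasted text the clauses have simply run together without whitespace. A sketch of a helper that builds the search SQL (it assumes a Unity Catalog workspace, where system.information_schema.columns spans all catalogs; the helper name is made up for illustration):

```python
def column_search_sql(prefix: str) -> str:
    """SQL listing every column whose name starts with `prefix`, case-insensitively."""
    literal = prefix.lower().replace("'", "''")  # escape single quotes for the SQL literal
    return (
        "SELECT table_catalog, table_schema, table_name, column_name\n"
        "FROM system.information_schema.columns\n"
        f"WHERE lower(column_name) LIKE '{literal}%'\n"
        "ORDER BY table_name, column_name"
    )

print(column_search_sql("xyz"))
```

Run it via spark.sql(column_search_sql("xyz")) or paste it into a SQL editor; for "contains" rather than "starts with", put a % before the literal as well.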
kaileena
by New Contributor
  • 1099 Views
  • 2 replies
  • 0 kudos

cannot install RMySQL "there is no package called ‘RMySQL’

Cannot install RMySQL on Databricks. I tried: install.packages("RMySQL") and got the error: Installing package into ‘/local_disk0/.ephemeral_nfs/envs/rEnv-c677bc4c-e6a3-40df-a5ab-bfd5d277e0c0’ (as ‘lib’ is unspecified) Warning: unable to access index for ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @miru miro, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Replies
Merchiv
by New Contributor III
  • 2419 Views
  • 4 replies
  • 0 kudos

Difference between Databricks and local pyspark split.

I have noticed some inconsistent behavior between calling the 'split' function on Databricks and on my local installation. Running it in a Databricks notebook gives: spark.sql("SELECT split('abc', ''), size(split('abc',''))").show() So the string is split...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Ivo Merchiers: The behavior you are seeing is likely due to differences in the underlying version of Apache Spark between your local installation and Databricks. split() is a function provided by Spark's SQL functions, and different versions of Spa...

3 More Replies
arw1070
by New Contributor II
  • 1834 Views
  • 3 replies
  • 0 kudos

Databricks extension is not configuring in VScode

I am trying to install and work with the Databricks VS Code extension. I installed it a few weeks ago and it initially worked, but I mistyped some of the configuration, so I tried to restart; since then it has not worked. Whenever I install the exten...

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@Anna Wuest: I have tried this and am not seeing any issues. Which version of VS Code are you using? Can you please update to the latest Visual Studio Code (version 1.77.1), install the latest Databricks plugin version, and test? If you are using Windows --> pleas...

2 More Replies