ValueError: not enough values to unpack (expected 2, got 1)

Ligaya
New Contributor II

Code:

Writer.jdbc_writer("Economy",economy,conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy'])

The problem arises when I try to run this code in the specified Databricks notebook; it fails with "ValueError: not enough values to unpack (expected 2, got 1)".

Here's the full error message:

ValueError: not enough values to unpack (expected 2, got 1)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<command-75945229> in <cell line: 1>()
----> 1 Writer.jdbc_writer("Economy",economy,conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy'])
      2 
      3 
 
<command-75945229> in jdbc_writer(table_name, df, conf, debug, modified_by)
     15       conf = conf.to_dict()
     16 
---> 17     schema, table = table_name.split('.')
     18     schema = schema[1:-1] if schema[0] == "[" else schema
     19     table = table[1:-1] if table[0] == "[" else table

And when I click through to the cell from the traceback, this is the code:

class Writer:
  @staticmethod
  def jdbc_writer(table_name: str,
                  df: SparkDataFrame,
                  conf: Union[dict, SqlConnect],
                  debug=False,
                  modified_by=None,
                 ) -> None:

I have searched for solutions to this particular problem but have not been able to find one, and your help would really benefit me.

3 REPLIES

User16752242622
Valued Contributor

Hello, thank you for reaching out to us.

This looks like a general error message. Can you please share the runtime version of the cluster you are running the notebook on? You can find this detail under the cluster configuration.

Also, have you checked this article?

https://stackoverflow.com/questions/52108914/python-how-to-fix-valueerror-not-enough-values-to-unpac...

Ligaya
New Contributor II

Hi, this is the runtime version:

11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)

Anonymous
Not applicable

@Jillinie Park:

The error message you are seeing ("ValueError: not enough values to unpack (expected 2, got 1)") occurs when you try to unpack an iterable that yields fewer values than the number of variables on the left-hand side of the assignment. In your case, the error is happening on this line of code:

schema, table = table_name.split('.')

Here, you are trying to unpack the result of the split() method into two variables (schema and table), but split('.') is returning only one value, which means table_name does not contain a dot (.) character. To fix this error, you can check the value of table_name before calling split() to make sure it contains a dot. If it doesn't, you can handle the problem accordingly (e.g. raise an exception or return an error message).
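For reference, here is a minimal sketch of the failure, using the literal "Economy" from your call above (any table name without a dot behaves the same way):

parts = "Economy".split('.')
print(parts)                               # ['Economy'] -- only one element

schema, table = "dbo.Economy".split('.')   # works: schema='dbo', table='Economy'
schema, table = "Economy".split('.')       # ValueError: not enough values to unpack (expected 2, got 1)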

Here's an example of how you could modify the jdbc_writer() method to handle this error:

from typing import Union

from pyspark.sql import DataFrame as SparkDataFrame  # assumed alias; SqlConnect is your project's own config class

class Writer:
  @staticmethod
  def jdbc_writer(table_name: str,
                  df: SparkDataFrame,
                  conf: Union[dict, SqlConnect],
                  debug=False,
                  modified_by=None) -> None:

    # Fail fast with a clear message if the table name is not schema-qualified
    if '.' not in table_name:
        raise ValueError(f"Invalid table name '{table_name}'. Table name should be in the format 'schema.table'.")

    if isinstance(conf, SqlConnect):
        conf = conf.to_dict()

    schema, table = table_name.split('.')
    schema = schema[1:-1] if schema[0] == "[" else schema  # strip surrounding [brackets] if present
    table = table[1:-1] if table[0] == "[" else table

    # rest of the code goes here

In this modified version of the jdbc_writer() method, we first check if the table_name argument contains a dot (.) character. If it doesn't, we raise a ValueError with an appropriate error message. Otherwise, we proceed with the rest of the method as before.
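With that check in place, the call from your notebook just needs a schema-qualified table name. A sketch, assuming the table lives in the dbo schema on your SQL Server (substitute whatever schema it actually uses):

Writer.jdbc_writer("dbo.Economy", economy, conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy'])

Bracketed identifiers also work, since jdbc_writer strips the surrounding brackets:

Writer.jdbc_writer("[dbo].[Economy]", economy, conf=CONF.MSSQL.to_dict(), modified_by=JOB_ID['Economy'])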
