How to deal with a column name containing a .(dot) in a PySpark DataFrame?
12-24-2019 04:14 AM
- We are streaming JSON data from a Kafka source, but some of the keys contain a .(dot) in their names.
- Streaming JSON data:
df1 = df.selectExpr("CAST(value AS STRING)")
{"pNum":"A14","from":"telecom","payload":{"TARGET":"1","COUNTRY":"India","EMAIL.1":"test@test.com","PHONE.1":"1122334455"}}
- In the JSON above, two keys (EMAIL.1, PHONE.1) contain a .(dot) in their names.
- We are extracting the fields with get_json_object as shown below, but the EMAIL and PHONE values come back null:
df2 = df1.select(
    get_json_object(df1["value"], '$.pNum').alias('pNum'),
    get_json_object(df1["value"], '$.from').alias('from'),
    get_json_object(df1["value"], '$.payload.TARGET').alias('TARGET'),
    get_json_object(df1["value"], '$.payload.COUNTRY').alias('COUNTRY'),
    get_json_object(df1["value"], '$.payload.EMAIL.1').alias('EMAIL'),
    get_json_object(df1["value"], '$.payload.PHONE.1').alias('PHONE'))
How can we handle column names like this?
- Labels: JSON, Spark--dataframe, Streaming spark
12-30-2019 03:27 AM
Hi @Mithu Wagh, you can use backticks to enclose the column name:
df.select("`col0.1`")

