Data Engineering
I am trying to read nested data from a JSON file and put it into a streaming table using DLT

zero234
New Contributor III

So I have nested data with 200+ columns, and I have extracted this data into a JSON file. When I use the code below to read the JSON files, any columns that have no values at all in the data are not included in the inferred schema.

from pyspark.sql import SparkSession

# Create a SparkSession (spark.sql.jsonGenerator.ignoreNullFields only affects writing JSON, not reading)
spark = (
    SparkSession.builder.master("local[1]")
    .config("spark.sql.jsonGenerator.ignoreNullFields", "false")
    .getOrCreate()
)

# Read the multiline JSON; for JSON sources the schema is always inferred unless one is supplied
df = spark.read.option("multiline", "true").json("file.json")
df.printSchema()

I can define a schema and read the file with it, which I guess would solve the issue, but I wanted to know if there is an alternate approach.
Also, can anyone help me with how to write this nested data to a streaming table in the bronze layer?
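One alternative to relying on inference, if you want the all-null columns to show up: declare the schema explicitly (or infer it once from a representative file, save df.schema.json(), and rebuild it with StructType.fromJson). A minimal sketch, with placeholder field names standing in for the real 200+ nested columns:

from pyspark.sql.types import StructType, StructField, StringType, LongType

# Placeholder schema: replace the fields below with your actual nested layout
explicit_schema = StructType([
    StructField("id", LongType(), True),
    StructField("details", StructType([                 # nested struct column
        StructField("name", StringType(), True),
        StructField("status", StringType(), True),      # kept even if it is always null in the file
    ]), True),
])

df = (
    spark.read
    .schema(explicit_schema)                            # no inference, so empty columns are preserved
    .option("multiline", "true")
    .json("file.json")
)
df.printSchema()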

1 REPLY

zero234
New Contributor III

Replying to my question above:
We cannot use schema inference on a streaming table; we need to specify the schema externally.
Can anyone please suggest a way to write data in nested form to a streaming table, and whether this is possible?
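Not a confirmed answer, just a sketch of one common pattern: in a DLT pipeline you can ingest the JSON with Auto Loader and pass the explicitly declared nested schema, so the struct columns land as-is in the bronze streaming table (table name, path, and fields below are placeholders):

import dlt
from pyspark.sql.types import StructType, StructField, StringType, LongType

# Placeholder nested schema; reuse the one built for the batch read (or rebuild it from a saved schema JSON)
bronze_schema = StructType([
    StructField("id", LongType(), True),
    StructField("details", StructType([
        StructField("name", StringType(), True),
        StructField("status", StringType(), True),
    ]), True),
])

@dlt.table(name="bronze_events", comment="Raw nested JSON ingested as-is")
def bronze_events():
    return (
        spark.readStream
        .format("cloudFiles")                        # Auto Loader
        .option("cloudFiles.format", "json")
        .option("multiLine", "true")
        .schema(bronze_schema)                       # explicit schema required for the streaming read
        .load("/path/to/json/landing/")              # placeholder landing path
    )

Auto Loader can also infer and evolve the schema itself via cloudFiles.schemaLocation if you prefer not to hand-maintain 200+ fields; the nested structs can then be flattened in a downstream silver table.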
