Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

issibra
by New Contributor III
  • 607 Views
  • 1 replies
  • 1 kudos

ReadStream & writeStream at gold layer level

Hello, I have seen readStream and writeStream used at the gold layer in many places. Is it correct to use readStream and writeStream for the gold layer, given that a gold table is not valid for streaming? Is there some logic for when to use readStream/writeStr...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Ibrahim ISSOUANI Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

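One common pattern behind the question above is to stream from a silver table into an aggregated gold table. A minimal sketch follows; the table names (silver_orders, gold_daily_revenue), column names, and checkpoint path are hypothetical, and spark is assumed to be the active SparkSession on a Databricks cluster. Unwatermarked streaming aggregations cannot append, which is why this sketch uses outputMode("complete"):

```python
# Sketch: a gold-layer aggregation fed by readStream from the silver layer.
# Table names, columns, and the checkpoint path are hypothetical.

def silver_to_gold(checkpoint_path="/tmp/checkpoints/gold_daily_revenue"):
    """Continuously aggregate silver rows into a gold table.

    A streaming aggregation without a watermark cannot use append mode,
    so this sketch writes with outputMode("complete"), rewriting the
    (small) gold aggregate on each trigger.
    """
    from pyspark.sql import functions as F  # resolved at call time on the cluster

    return (spark.readStream
            .table("silver_orders")                       # stream from the silver Delta table
            .groupBy(F.col("order_date"))
            .agg(F.sum("amount").alias("daily_revenue"))
            .writeStream
            .outputMode("complete")                       # required for unwatermarked aggregations
            .option("checkpointLocation", checkpoint_path)
            .trigger(availableNow=True)                   # process available data, then stop
            .toTable("gold_daily_revenue"))
```

With trigger(availableNow=True) this behaves like an incremental batch job, which is often the pragmatic answer for gold tables: the streaming reader tracks progress via the checkpoint, but the table is rewritten rather than treated as a streaming source downstream.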
Dave_Nithio
by Contributor
  • 4755 Views
  • 2 replies
  • 4 kudos

Resolved! Delta Live Table Schema Error

I'm using Delta Live Tables to load a set of csv files in a directory. I am pre-defining the schema to avoid issues with schema inference. This works with autoloader on a regular delta table, but is failing for Delta Live Tables. Below is an example ...

Latest Reply
shagun
New Contributor III
  • 4 kudos

I was facing a similar issue when loading JSON files through Auto Loader for Delta Live Tables. I was able to fix it with this option: .option("cloudFiles.inferColumnTypes", "True"). From the docs: "For formats that don’t encode data types (JSON and CSV), Auto Load...

1 More Replies
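The two approaches discussed in this thread can be sketched side by side: pre-defining the schema (the original poster's intent) versus letting Auto Loader infer column types (the accepted fix). The column names and source path below are hypothetical, and spark is assumed to be the active session; in a real DLT notebook the function would carry the @dlt.table decorator:

```python
# Sketch: an explicit schema for Auto Loader, expressed as a DDL string.
# Column names and types are hypothetical placeholders.
csv_schema = ", ".join(f"{name} {dtype}" for name, dtype in [
    ("id", "BIGINT"),
    ("event_ts", "TIMESTAMP"),
    ("amount", "DOUBLE"),
])

# In a DLT notebook this function would be decorated with @dlt.table:
def raw_events():
    return (spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "csv")
            # Alternative, per the accepted fix in the reply above: drop
            # .schema(...) and set .option("cloudFiles.inferColumnTypes", "true")
            .schema(csv_schema)                 # explicit schema, no inference surprises
            .load("/Volumes/landing/events/"))  # hypothetical source path
```

A DDL string is usually the lightest way to pre-define a schema; a pyspark.sql.types.StructType works equally well if the schema is built programmatically.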
LearnerShahid
by New Contributor II
  • 3279 Views
  • 6 replies
  • 4 kudos

Resolved! Lesson 6.1 of Data Engineering. Error when reading stream - java.lang.UnsupportedOperationException: com.databricks.backend.daemon.data.client.DBFSV1.resolvePathOnPhysicalStorage(path: Path)

The function below executes fine:

def autoload_to_table(data_source, source_format, table_name, checkpoint_directory):
    query = (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", source_format)
        .option("cloudFile...

I have verified that source data exists.
Latest Reply
Anonymous
Not applicable
  • 4 kudos

Auto Loader is not supported on Community Edition.

5 More Replies
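Given the answer above, a workable fallback on Community Edition is a one-shot batch load with spark.read, which produces the same table as a single Auto Loader pass but without incremental file tracking or a checkpoint. A minimal sketch, with hypothetical names and spark assumed to be the active SparkSession:

```python
# Sketch: batch fallback for environments where Auto Loader (cloudFiles)
# is unavailable, e.g. Databricks Community Edition. Names are hypothetical.

def load_to_table(data_source, source_format, table_name):
    """One-shot batch load: same end result as a single Auto Loader pass,
    but with no incremental file discovery and no checkpoint directory."""
    (spark.read
        .format(source_format)
        .option("header", "true")   # assumption: CSV-style sources carry headers
        .load(data_source)
        .write
        .mode("append")             # re-running will re-append the same files
        .saveAsTable(table_name))
```

The trade-off is in the comment: rerunning the batch version re-reads every file, whereas Auto Loader's checkpoint would have skipped files it had already ingested.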
William_Scardua
by Valued Contributor
  • 2059 Views
  • 4 replies
  • 2 kudos

Resolved! Error/Exception when a read websocket with readStream

Hi guys, how are you? Can you help me? That's my situation: when I try to read a websocket with readStream I receive an unknown-host exception, java.net.UnknownHostException. That's my code: wssocket = spark\ .readStream\ .forma...

Latest Reply
Deepak_Bhutada
Contributor III
  • 2 kudos

It will definitely create a streaming object, so don't go by the wssocket.isStreaming = True piece. It will create the streaming object without any issue because of lazy evaluation. Now, coming to the issue: please put the IP directly; sometimes the sla...

3 More Replies
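The exception in this thread, java.net.UnknownHostException, corresponds to a DNS lookup failure, which is why the reply suggests putting the IP directly. The same failure can be reproduced and checked from plain Python with the standard library before ever starting the stream; the function name below is a hypothetical helper:

```python
import socket

def resolve_or_none(host):
    """Return the IPv4 address for host, or None if DNS resolution fails.

    socket.gaierror here is the Python-side analogue of Java's
    UnknownHostException: the hostname cannot be resolved, so either fix
    DNS or hardcode the IP address, as the reply above suggests.
    """
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None
```

Checking the endpoint with this helper before calling readStream turns an opaque mid-stream Java exception into an explicit, early failure on the driver.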