Fixed-length file from Databricks notebook (Spark SQL)
08-29-2022 10:15 AM
Hi,
I need help writing data from an Azure Databricks notebook into a fixed-length .txt file.
The notebook has 10 lakh (1 million) rows and 86 columns. Can anyone suggest an approach?
- Labels:
- Databricks notebook
- Spark SQL
08-30-2022 10:13 PM
Hi Sadiq, could you please refer to the link below and check if it helps?
https://k21academy.com/microsoft-azure/data-engineer/reading-and-writing-data-in-databricks/
09-03-2022 10:34 PM
This is not the resolution.
09-03-2022 10:38 PM
Hi @sadiq vali, could you please elaborate a little more on what you are expecting when writing data into a fixed-length .txt file?
09-03-2022 10:45 PM
I have data in a Databricks notebook in the form of rows and columns.
Now I want to export the data from the notebook to a .txt file with fixed-length fields,
where each column has a defined length.
Suppose column1 is defined with length 10; it should then occupy only the first 10 characters of each line.
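One common approach (a sketch, not an official Databricks feature) is to pad or truncate each column to its declared width, concatenate the padded values into a single string column, and write that column out as plain text. The column widths and sample values below are illustrative. The core padding logic, shown here in plain Python, mirrors what `rpad` and `concat` from `pyspark.sql.functions` would do per column:

```python
# Sketch: format one row of column values into a fixed-width record.
# Each value is converted to a string, truncated if it exceeds its
# declared width, and left-justified (space-padded) to fill the width.
# The widths below are illustrative, not from the original post.

COLUMN_WIDTHS = [10, 5, 20]  # e.g. column1 occupies the first 10 characters


def to_fixed_width(values, widths):
    """Pad/truncate each value to its width and concatenate into one record."""
    parts = []
    for value, width in zip(values, widths):
        text = "" if value is None else str(value)
        parts.append(text[:width].ljust(width))
    return "".join(parts)


record = to_fixed_width(["sadiq", 42, "Hyderabad"], COLUMN_WIDTHS)
print(repr(record))
```

In a Databricks notebook the same idea can be expressed column-wise with `rpad(col("column1").cast("string"), 10, " ")`, combining the 86 padded columns with `concat(...)` into one column, and writing it via `df.select(...).write.text("/mnt/out/fixed_width")` (the output path here is hypothetical). Writing as `text` rather than `csv` avoids any delimiter or quoting being added between the fields.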
09-15-2022 03:47 AM
Hi @sadiq vali
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!