Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

pra18
by New Contributor II
  • 820 Views
  • 2 replies
  • 0 kudos

Handling Binary Files Larger than 2GB in Apache Spark

I'm trying to process large binary files (>2GB) in Apache Spark, but I'm running into the following error. The file format is .mf4 (Measurement Data Format): org.apache.spark.SparkException: The length of ... is 14749763360, which exceeds the max length ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @pra18, you can split the binary files using the split command and then load the resulting chunks, like this: ret = os.system("split -b 4020000 -a 4 -d large_data.dat large_data.dat_split_")
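To flesh that out, here is a minimal sketch of the split-then-load approach, assuming the .mf4 file sits on a POSIX-style path visible to the driver (for example a /dbfs mount or a Volume) and that downstream parsing can work on byte-range chunks; the paths and chunk size below are placeholders, not values from the original post:

import subprocess

src = "/dbfs/tmp/large_data.mf4"            # hypothetical source path
prefix = "/dbfs/tmp/large_data.mf4.part_"   # hypothetical output prefix

# Split into ~1 GB pieces, comfortably under Spark's 2 GB limit for a single binary record.
subprocess.run(["split", "-b", "1000000000", "-a", "4", "-d", src, prefix], check=True)

# Read the pieces back with the binaryFile source; each chunk becomes one row with a content column.
chunks_df = (
    spark.read.format("binaryFile")
        .load("dbfs:/tmp/large_data.mf4.part_*")
)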

1 More Reply
kivaniutenko
by New Contributor
  • 284 Views
  • 0 replies
  • 0 kudos

HTML Formatting Issue in Databricks Alerts

Hello everyone, I have recently encountered an issue with HTML formatting in custom templates for Databricks Alerts. Previously, the formatting worked correctly, but now the alerts display raw HTML instead of properly rendered content. For example, an ...

skd217
by New Contributor
  • 970 Views
  • 3 replies
  • 0 kudos

Is there any way to connect to Polaris catalog from Unity Catalog?

Hi Databricks community, I'd like to access data managed by Polaris catalog through Unity Catalog so I can manage all data in one place. Is there any way to connect? (I could access the data with an all-purpose cluster without Unity Catalog.)

Latest Reply
chandu402240
New Contributor II
  • 0 kudos

Can you provide info on how you access Polaris catalog data from a Databricks cluster (without UC)? Any blog?

2 More Replies
shan-databricks
by New Contributor II
  • 625 Views
  • 2 replies
  • 0 kudos

Databricks Workflow Orchestration

I have 50 tables, and the count will increase gradually, so I want to create a single workflow to orchestrate the job and run it table by table. Is there an option to do this in Databricks Workflows?

Latest Reply
Edthehead
Contributor III
  • 0 kudos

Break up these 50 tables logically or functionally and place them in their own workflows. A good strategy would be to group dependent tables in the same workflow. Then use a master workflow to trigger each child workflow, so it will be like a...
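As a rough illustration of that master/child pattern, here is a minimal sketch assuming the Databricks SDK for Python and a pre-existing child job built around a notebook task that accepts a table_name parameter; the job ID and table names are hypothetical:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up workspace credentials from the environment

CHILD_JOB_ID = 123456789                       # hypothetical ID of the per-table load job
tables = ["sales", "customers", "orders"]      # one logical group of related tables

# Trigger one child run per table; the child notebook reads table_name as a parameter.
for table in tables:
    w.jobs.run_now(job_id=CHILD_JOB_ID, notebook_params={"table_name": table})
    print(f"Triggered child job for table {table}")

In a pure Workflows setup the same idea can be expressed with Run Job tasks (or a For each task) inside the master job instead of SDK calls.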

1 More Reply
subhas_1729
by New Contributor II
  • 847 Views
  • 1 reply
  • 0 kudos

Dashboard

Hi, I want to design a dashboard that will show some variables from the Spark UI. Is it possible to access Spark UI variables from my Spark program?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @subhas_1729, You can achieve this by leveraging Spark's monitoring and instrumentation APIs. Spark provides metrics that can be accessed through the SparkListener interface as well as the REST API. The SparkListener interface allows you to receiv...
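As a small illustration of the REST API route, here is a sketch that polls the Spark UI's monitoring endpoints from the driver; the host/port and the fields printed are assumptions based on Spark's documented /api/v1 endpoints, so adjust them to your environment:

import requests

SPARK_UI = "http://localhost:4040/api/v1"   # the Spark UI usually serves this on the driver

# List applications known to this UI, then pull per-stage metrics for the first one.
apps = requests.get(f"{SPARK_UI}/applications").json()
app_id = apps[0]["id"]

stages = requests.get(f"{SPARK_UI}/applications/{app_id}/stages").json()
for stage in stages:
    print(stage["stageId"], stage["status"], stage.get("executorRunTime"))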

dbhavesh
by New Contributor II
  • 681 Views
  • 3 replies
  • 1 kudos

How to Apply row_num in DLT

Hi all, how do we use row_number in DLT, or what is the alternative to the row_number function in DLT? We are looking for the same functionality that row_number provides. Thanks in advance.

Latest Reply
Takuya-Omi
Valued Contributor III
  • 1 kudos

@dbhavesh I apologize for the lack of explanation. The ROW_NUMBER function requires ordering over the entire dataset, making it a non-time-based window function. When applied to streaming data, it results in the "NON_TIME_WINDOW_NOT_SUPPORTED_IN_STREA...
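One common workaround, sketched below, is to apply ROW_NUMBER in a batch (materialized-view style) DLT step rather than on the streaming table itself; the table names and the partition/order columns here are hypothetical placeholders, not the poster's actual pipeline:

import dlt
from pyspark.sql import functions as F
from pyspark.sql.window import Window

@dlt.table(name="orders_ranked")
def orders_ranked():
    # Batch read of an upstream DLT dataset, so the non-time-based window is allowed.
    w = Window.partitionBy("customer_id").orderBy(F.col("event_ts").desc())
    return dlt.read("orders_clean").withColumn("row_num", F.row_number().over(w))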

2 More Replies
sachin_kanchan
by New Contributor III
  • 1158 Views
  • 6 replies
  • 0 kudos

Unable to log in into Community Edition

So I just registered for the Databricks Community Edition and received an email for verification. When I click the link, I'm redirected to this website (image attached) where I am asked to input my email. And when I do that, it sends me a verification c...

db_fail.png
Latest Reply
sachin_kanchan
New Contributor III
  • 0 kudos

What a disappointment this has been

5 More Replies
prasidataengine
by New Contributor II
  • 898 Views
  • 2 replies
  • 0 kudos

Issue when connecting to a Databricks 15.4 cluster without Unity Catalog using Databricks Connect

Hi, I have a shared cluster created on Databricks which uses the 15.4 runtime. I don't want to enable Unity Catalog for this cluster. Previously I used Python 3.9.13 to connect to an 11.3 cluster using Databricks Connect 11.3. Now my company has restr...

Data Engineering
Databricks
databricks-connect
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @prasidataengine, for DBR 13.3 LTS and above you must have Unity Catalog enabled to be able to use databricks-connect: you need a Databricks account and workspace that have Unity Catalog enabled. See Set up and manage Unity Catalog and Enable a wo...
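For reference, once Unity Catalog is enabled, a minimal Databricks Connect (14.x/15.x style) session looks roughly like the sketch below; the host, token, and cluster ID are placeholders:

from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder
        .remote(
            host="https://<workspace-url>",     # placeholder
            token="<personal-access-token>",    # placeholder
            cluster_id="<cluster-id>",          # placeholder
        )
        .getOrCreate()
)

print(spark.range(5).collect())   # quick connectivity check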

1 More Reply
vidya_kothavale
by Contributor
  • 456 Views
  • 2 replies
  • 0 kudos

MongoDB Streaming Not Receiving Records in Databricks

Batch read (spark.read.format("mongodb")) works fine. Streaming read (spark.readStream.format("mongodb")) runs but receives no records.
Batch read (works):
df = spark.read.format("mongodb") \
    .option("database", database) \
    .option("spark.mongodb.read.conne...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @vidya_kothavale, MongoDB requires the use of change streams to enable streaming. Change streams allow applications to access real-time data changes without polling the database. Ensure that your MongoDB instance is configured to support change...
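For context, a streaming read against the 10.x MongoDB Spark connector typically looks like the sketch below; the option names follow the connector docs as I recall them and the memory sink / query name are only for inspection, so verify them against your connector version. Also note that change streams only work against a replica set or sharded cluster, not a standalone mongod.

stream_df = (
    spark.readStream.format("mongodb")
        .option("spark.mongodb.connection.uri", connection_uri)   # assumed variables from the post
        .option("spark.mongodb.database", database)
        .option("spark.mongodb.collection", collection)
        # Emit only the changed document rather than the full change-stream event envelope.
        .option("spark.mongodb.change.stream.publish.full.document.only", "true")
        .load()
)

query = (
    stream_df.writeStream
        .format("memory")
        .queryName("mongo_stream_check")   # hypothetical name for inspection
        .trigger(processingTime="10 seconds")
        .start()
)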

1 More Replies
Dianagarces8
by New Contributor
  • 230 Views
  • 1 reply
  • 0 kudos

The lifetime of files in DBFS is NOT tied to the lifetime of our cluster

What happens so that the lifetime of files in DBFS is NOT tied to the lifetime of our cluster?

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Files in DBFS are typically not linked to a cluster or its lifetime. There are tmp directories in DBFS, so perhaps you are looking at those, but FileStore, for example, can definitely be used. However, I suggest not using DBFS but some data lake (S3/ADLS).

MarkV
by New Contributor III
  • 1320 Views
  • 6 replies
  • 0 kudos

DLT, Automatic Schema Evolution and Type Widening

I'm attempting to run a DLT pipeline that uses automatic schema evolution against tables that have type widening enabled. I have code in this notebook that is a list of tables to create/update, along with the schema for those tables. This list and spar...

Latest Reply
Sidhant07
Databricks Employee
  • 0 kudos

Alternatively, you can try using the INSERT INTO statement directly:
def load_snapshot_tables(source_system_name, source_schema_name, table_name, spark_schema, select_expression):
    snapshot_load_df = (
        spark.readStream
            .format("clou...
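As a generic illustration of driving INSERT INTO from a stream (not the poster's exact pipeline), a foreachBatch sink like the sketch below is one way to do it; the target table, checkpoint path, and source DataFrame are hypothetical:

def insert_batch(batch_df, batch_id):
    # Register the micro-batch and append it to the target table with plain SQL.
    batch_df.createOrReplaceTempView("incoming_batch")
    batch_df.sparkSession.sql(
        "INSERT INTO main.raw.snapshot_table SELECT * FROM incoming_batch"
    )

query = (
    source_stream_df.writeStream            # assumed streaming DataFrame (e.g. from Auto Loader)
        .foreachBatch(insert_batch)
        .option("checkpointLocation", "/Volumes/main/raw/_checkpoints/snapshot_table")
        .start()
)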

5 More Replies
AbishekP
by New Contributor
  • 434 Views
  • 1 reply
  • 0 kudos

Unable to run selected lines in Databricks

I'm using the SQL language in Databricks. Basically I'm a tester, and I'm trying to test the data load on tables by writing various queries. I'm unable to select a particular query and run it; the Ctrl+Shift+Enter shortcut is not working. Currently I need to open...

Latest Reply
Edthehead
Contributor III
  • 0 kudos

You cannot do this from notebooks, but you can do it via the SQL editor as shown below.

