How to perform Inner join using withcolumn

pramalin
New Contributor
 
3 REPLIES

daniel_sahal
Esteemed Contributor

@prudhvi ramalingam

Here is an example: https://stackoverflow.com/a/61029482

Nhan_Nguyen
Valued Contributor

@prudhvi ramalingam you could refer to this link: https://sparkbyexamples.com/spark/spark-sql-join-on-multiple-columns/

shan_chandra
Honored Contributor III

@prudhvi ramalingam - Please refer to the below example code.

import org.apache.spark.sql.functions.expr
import spark.implicits._ // needed for toDF outside spark-shell/Databricks notebooks

// Each person row carries an array of Spark contribution-level ids
val person = Seq(
    (0, "Bill Chambers", 0, Seq(100)),
    (1, "Matei Zaharia", 1, Seq(500, 250, 100)),
    (2, "Michael Armbrust", 1, Seq(250, 100)))
  .toDF("id", "name", "graduate_program", "spark_status")
 
val graduateProgram = Seq(
    (0, "Masters", "School of Information", "UC Berkeley"),
    (2, "Masters", "EECS", "UC Berkeley"),
    (1, "Ph.D.", "EECS", "UC Berkeley"))
  .toDF("id", "degree", "department", "school")
 
val sparkStatus = Seq(
    (500, "Vice President"),
    (250, "PMC Member"),
    (100, "Contributor"))
  .toDF("id", "status")
 
// Join (inner by default): keep rows where the person's spark_status
// array contains the status id
person
  .withColumnRenamed("id", "personId") // avoid two ambiguous "id" columns
  .join(sparkStatus, expr("array_contains(spark_status, id)"))
  .show()
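To tie this back to the question title: withColumn/withColumnRenamed only derive or rename columns; the inner join itself is done with .join, where you can pass "inner" as the join type explicitly. A minimal self-contained sketch (the local SparkSession and the reduced column set are illustrative assumptions, not from the thread):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Build a local session; in a Databricks notebook `spark` already exists
val spark = SparkSession.builder().master("local[1]").appName("inner-join-example").getOrCreate()
import spark.implicits._

val person = Seq(
    (0, "Bill Chambers", 0),
    (1, "Matei Zaharia", 1),
    (2, "Michael Armbrust", 1))
  .toDF("id", "name", "graduate_program")

val graduateProgram = Seq(
    (0, "Masters"),
    (1, "Ph.D."))
  .toDF("id", "degree")

// withColumnRenamed prepares the columns; .join with joinType "inner"
// performs the actual inner join on the program id
val joined = person
  .withColumnRenamed("id", "personId") // avoid an ambiguous "id" after the join
  .join(graduateProgram, col("graduate_program") === col("id"), "inner")

joined.show()
```

Passing "inner" is optional (it is Spark's default join type), but writing it out makes the intent explicit when you later switch to "left" or "left_anti".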
