04-08-2016 09:57 AM
Hi,
My requirement is to create a Spark in-memory table (not caching a Hive table in memory), insert data into it, and finally write it back to a Hive table.
The idea is to avoid disk IO while writing into the target Hive table. There are a lot of insert statements, but I want to write back to the Hive table only after all of them have run.
Could you please let me know if that is possible, or suggest a better solution.
Thanks & Regards,
~Sri
04-08-2016 10:02 AM
1) Use sc.parallelize to create the table.
2) Register it as just a temporary table.
3) Keep adding insert statements into this table. Note that Spark SQL supports inserting from other tables, so you might need to create additional temporary tables to insert into the first one.
This table should not be written out to disk until you run a "saveAsTable" or similar command.
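The steps above can be sketched as follows. This is a hedged sketch, assuming a Spark 1.x shell (the version in use in this thread) with `sc` and a Hive-backed `sqlContext` already in scope, and hypothetical table/column names; it will not run outside a Spark cluster.

```scala
import sqlContext.implicits._

// 1) Build an in-memory DataFrame from local data with sc.parallelize.
val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")

// 2) Register it as a temporary table (Spark 1.x API).
df.registerTempTable("staging")

// 3) Run further SQL against "staging"; nothing touches disk yet.
val result = sqlContext.sql("SELECT * FROM staging")

// Write to a Hive table only once, at the very end.
result.write.saveAsTable("target_hive_table")
```

Until the final `saveAsTable`, everything stays in memory as DataFrame transformations.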
-Vida
04-11-2016 08:27 PM
Hi Vida,
Sorry for the late reply.
I tried creating two similar temp tables in Spark based on a Hive table; one had data and the other was empty.
When I try to insert into the empty table I get the error below:
org.apache.spark.sql.AnalysisException: Inserting into an RDD-based table is not allowed.;
Please correct me if there are any issues with this approach. I did exactly what you suggested, except that I created the temp tables from a Hive table.
Regards,
~Sri
04-12-2016 06:31 AM
Got it - how about using unionAll? I believe this code snippet does what you'd want:
from pyspark.sql import Row

array = [Row(value=1), Row(value=2), Row(value=3)]
df = sqlContext.createDataFrame(sc.parallelize(array))

array2 = [Row(value=4), Row(value=5), Row(value=6)]
df2 = sqlContext.createDataFrame(sc.parallelize(array2))

two_tables = df.unionAll(df2)
two_tables.collect()

>> Out[17]: [Row(value=1), Row(value=2), Row(value=3), Row(value=4), Row(value=5), Row(value=6)]
04-12-2016 07:21 AM
Vida,
Thank you very much for your help
That works well, but the problem is that I have to insert data from multiple queries. I need to declare a collection of DataFrames to hold the result of each query, and at the end union all the DataFrames and insert into a Hive table.
I tried to create a collection of DataFrames in Scala; I am new to Scala and still struggling.
Please let me know the syntax for declaring a collection/array of DataFrames.
Regards,
~Sri
04-12-2016 08:49 AM
Hi Sri,
It's probably worth your time going through some DataFrame tutorials. Here's a good one from us on the basics of DataFrames. This material should help you get a sense for how you might create a collection of DataFrames and learn a bit more of the Scala nuances!
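For the specific question of holding many per-query results in one collection and combining them once at the end, the usual Scala idiom is a Seq plus reduce. The sketch below is a hedged stand-in with hypothetical data: plain Seq[Row] values play the role of DataFrames so it runs without a Spark cluster; with real Spark 1.x DataFrames the combine step would be `dfs.reduce(_ unionAll _)`, followed by a single write to the Hive table.

```scala
// Plain Seq[Row] values stand in for Spark DataFrames here; the
// collection-and-reduce pattern is the same either way.
case class Row(value: Int)

// One element per query result (hypothetical data).
val dfs: Seq[Seq[Row]] = Seq(
  Seq(Row(1), Row(2), Row(3)),
  Seq(Row(4), Row(5)),
  Seq(Row(6))
)

// Combine all partial results in one pass, mirroring unionAll on DataFrames.
val combined: Seq[Row] = dfs.reduce(_ ++ _)

println(combined.map(_.value))  // prints List(1, 2, 3, 4, 5, 6)
```

With this shape there is one Hive write at the very end, which matches the original goal of avoiding disk IO during the intermediate inserts.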