02-01-2022 03:19 AM
For timeseries feature tables, an inner join is performed when the training set is created. For the other types of feature tables, a left join is performed, so NaN values can show up in the training set. Could the inner join be exposed as a parameter of the create_training_set() method?
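For reference, a minimal sketch of the call being discussed, assuming the databricks.feature_store client; labels_df and the label name "rating" are placeholders for illustration, and the lookup keys mirror the column names used later in this thread:

from databricks.feature_store import FeatureStoreClient, FeatureLookup

fs = FeatureStoreClient()

feature_lookups = [
    # Timeseries feature table: point-in-time lookup on the timestamp key.
    FeatureLookup(
        table_name="recommender_system.customer_features",
        lookup_key="cid",
        timestamp_lookup_key="transaction_dt",
    ),
    # Regular feature table: looked up by key only.
    FeatureLookup(
        table_name="recommender_system.product_features",
        lookup_key="product_id",
    ),
]

# For non-timeseries tables the lookup behaves like a left join, so rows in
# labels_df whose keys have no match end up with NaN feature values.
training_set = fs.create_training_set(
    df=labels_df,
    feature_lookups=feature_lookups,
    label="rating",
)
training_df = training_set.load_df()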
Accepted Solutions
02-01-2022 12:32 PM
create_training_set performs a left join. It is just a simple function that selects data from the Spark SQL tables used by the Feature Store, so you can write your own code with an inner join instead:
# Feature Store tables are regular tables, so they can be queried directly.
customer_features_df = spark.sql("SELECT * FROM recommender_system.customer_features")
product_features_df = spark.sql("SELECT * FROM recommender_system.product_features")

# training_df holds the raw training examples (labels and lookup keys).
# The inner joins drop any row whose keys have no match in the feature tables.
training_with_features_df = training_df.join(
    customer_features_df,
    on=[training_df.cid == customer_features_df.customer_id,
        training_df.transaction_dt == customer_features_df.dt],
    how="inner",
).join(
    product_features_df,
    on="product_id",
    how="inner",
)

02-01-2022 08:57 AM
Hello, @Thibault Daoulas! My name is Piper, and I'm one of the community moderators. It's nice to meet you, and welcome! Thank you for your question!
We'll give the community some time to respond, and then we will come back if we need to. 🙂
02-02-2022 12:11 AM
Thank you Hubert, that's a good alternative. I just thought I'd stick to the API as much as possible, but this solves it.
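If staying with the API is preferred, one possible workaround (a sketch, not a built-in parameter) is to keep create_training_set and then drop the rows whose looked-up features came back null, which approximates the inner-join behaviour; the feature column names below are placeholders:

# training_df as returned by training_set.load_df() above.
# Dropping rows where any looked-up feature is null mimics an inner join.
feature_cols = ["customer_feature_1", "product_feature_1"]
training_df_inner = training_df.dropna(subset=feature_cols)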

