02-08-2023 02:37 AM
Hi everyone,
Would it be possible to change the default storage path of the feature store, during creation and/or after creation? If you could also provide the Python script for that, I would appreciate it. The current default path is:
"dbfs/user/hive/warehouse/<feature database name>/<feature table name>”
With many thanks in advance! 🙂
02-09-2023 11:53 PM
I found the solution, but one question remains unanswered: how do I change the directory after the feature store table has been created? To change the default path, all that is needed is to set the path argument in create_table:
from databricks import feature_store

fs = feature_store.FeatureStoreClient()
fs.create_table(name='', primary_keys=[], ..., path="user given path")
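For context, here is a minimal sketch of creating a feature table at a custom location. The table name, primary key, and DBFS path below are placeholders, and df is assumed to be an existing Spark DataFrame with a unique "id" column:

from databricks import feature_store

fs = feature_store.FeatureStoreClient()

# Assumptions: "my_db.my_features" and the DBFS path are placeholders,
# and df is a Spark DataFrame containing an "id" column.
fs.create_table(
    name="my_db.my_features",
    primary_keys=["id"],
    df=df,                                  # schema is inferred and the initial data is written
    path="dbfs:/custom/path/my_features",   # custom storage location instead of the default warehouse dir
    description="Feature table stored at a user-given path",
)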
02-10-2023 10:15 AM
The config you're looking for is "spark.sql.warehouse.dir"
Here is the documentation
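As a sketch (the path is a placeholder): spark.sql.warehouse.dir has to be set before the Spark session is created, so on Databricks it is usually configured in the cluster's Spark config rather than in a notebook. For example:

from pyspark.sql import SparkSession

# Assumption: "dbfs:/custom/warehouse" is a placeholder location.
# This only takes effect for sessions created after the config is set.
spark = (
    SparkSession.builder
    .config("spark.sql.warehouse.dir", "dbfs:/custom/warehouse")
    .enableHiveSupport()
    .getOrCreate()
)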
@Saeid Hedayati is correct that you can set the path when creating the feature store. It's an optional parameter listed here.
Once the table is created, you can't change the path, because the table's data files already live at that location. You could potentially clone the table to a new location, but a full migration would require moving all of the files.
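If you go down the clone route, here is a hedged sketch using a Delta deep clone. The table names and target path are placeholders, and the Feature Store metadata would still reference the original table, so this only moves the data:

# Assumption: placeholder table names and target path. DEEP CLONE copies the
# data files to the new location; you would still need to re-register or
# re-create the feature table so the Feature Store points at the clone.
spark.sql("""
    CREATE TABLE my_db.my_features_clone
    DEEP CLONE my_db.my_features
    LOCATION 'dbfs:/new/path/my_features_clone'
""")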
02-17-2023 07:29 AM
@Joseph Kambourakis thank you for your response.
02-17-2023 07:54 AM
No problem. Glad I could help!
04-08-2023 12:40 AM
Hi @Saeid Hedayati
Hope everything is going great.
Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we can help you.
Cheers!