by
Yashir
• New Contributor III
- 1882 Views
- 5 replies
- 4 kudos
If not, then I believe it would be beneficial: the feature tables contain engineered features, so it's a good idea to document their calculation logic for the benefit of other data scientists. Also, even non-engineered features are many times no...
Latest Reply
I also would like to see support added for feature description get/set methods.
4 More Replies
- 1082 Views
- 1 reply
- 4 kudos
Hi, I tried to deploy a Feature Store packaged model into a Delta Live Table using mlflow.pyfunc.spark_udf in Azure Databricks. This model was built by Databricks AutoML with a joined Feature Table inside it. And I'm trying to make predictions using the fol...
Latest Reply
Hi @Chengcheng Guo, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
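For context on the question above: mlflow.pyfunc.spark_udf wraps a model so it can be applied across a DataFrame's columns (e.g. predict_udf(*df.columns)). The core wrapping pattern can be sketched in plain Python — the names below (make_column_udf, the stand-in model) are illustrative, not the MLflow API:

```python
def make_column_udf(predict):
    """Wrap a model's predict(features) so it accepts one value per feature
    column, mimicking how a spark_udf is applied as predict_udf(*df.columns)."""
    def udf(*column_values):
        # Collect the per-column scalars into one feature vector per row.
        return predict(list(column_values))
    return udf

# Stand-in for a real model's predict method.
stand_in_model = lambda features: sum(features)
udf = make_column_udf(stand_in_model)
assert udf(1.0, 2.0, 3.0) == 6.0
```

In Spark the same idea runs distributed: the UDF receives one value per feature column for each row and returns that row's prediction.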
- 3495 Views
- 2 replies
- 1 kudos
Hi, I'm trying to process a small dataset (less than 300 MB) composed of five queries that run with Spark. The end result of those queries is parsed using Python and merged into a data frame. Then I try to write this to a Delta Lake table using featu...
Latest Reply
Hello, we have recently found that it's my user in particular that causes the memory issue. Two other users in my organization can run the same notebook without problems, but my user consistently consumes all available RAM and crashes the cluster... a...
1 More Replies
by
mrcity
• New Contributor II
- 1132 Views
- 2 replies
- 1 kudos
I've got data stored in feature tables, plus in a data lake. The feature tables are expected to lag the data lake by at least a little bit. I want to filter data coming out of the feature store by querying the data lake for lookup keys out of my inde...
Latest Reply
Hi @Stephen Wylie, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...
1 More Replies
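The filtering described in the question above is essentially a semi-join: keep only feature rows whose lookup key appears in a key set pulled from the data lake. A minimal pure-Python sketch of that idea (hypothetical names; in Spark this would typically be a join or an isin filter on the key column):

```python
def filter_by_keys(feature_rows, valid_keys, key="user_id"):
    """Keep only feature rows whose lookup key is in the data lake's key set."""
    return [row for row in feature_rows if row[key] in valid_keys]

features = [{"user_id": 1, "f": 0.2}, {"user_id": 2, "f": 0.9}]
fresh_keys = {2}  # keys confirmed present/current in the data lake
assert filter_by_keys(features, fresh_keys) == [{"user_id": 2, "f": 0.9}]
```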
by
lewit
• New Contributor II
- 919 Views
- 2 replies
- 1 kudos
Rather than joining features from different tables, I just wanted to use a single feature store table and select some of its features, but still log the model in the feature store. The problem I am facing is that I do not know how to create the train...
Latest Reply
Hi, could you please refer to https://docs.databricks.com/machine-learning/feature-store/train-models-with-feature-store.html#create-a-trainingset-using-the-same-feature-multiple-times and let us know if this helps.
1 More Replies
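Selecting some features from a single table, as the question asks, amounts to picking named columns per lookup key — in the Feature Store API this is what a FeatureLookup's feature_names argument does when building the training set. The selection itself can be illustrated in plain Python (hypothetical helper, not the Databricks API):

```python
def lookup_features(feature_table, key_value, feature_names):
    """Pick a subset of named features for one lookup key.

    feature_table: dict mapping lookup key -> full feature row (a dict).
    """
    row = feature_table[key_value]
    return {name: row[name] for name in feature_names}

table = {7: {"age": 31, "clicks": 12, "spend": 4.5}}
# Only the requested features are included in the result.
assert lookup_features(table, 7, ["age", "spend"]) == {"age": 31, "spend": 4.5}
```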
- 1140 Views
- 3 replies
- 2 kudos
I'm getting this message with the following code:

from databricks import feature_store

fs = feature_store.FeatureStoreClient()
fs.create_table(
    name='feature_store.user_login',
    primary_keys=['user_id'],
    df=df_x,
    description='user l...
Latest Reply
Yes, it's a nice thing to do. You can report it here: https://community.databricks.com/s/topic/0TO3f000000CnKrGAK/bug-report, and if it's more urgent or blocking for you, you can also open a ticket with the help center: https://docs.databricks.com/resou...
2 More Replies
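Whatever the exact message in the thread above, one requirement of create_table worth checking is that the primary_keys columns uniquely identify each row of the DataFrame — duplicate keys are a common cause of feature table write errors. A pure-Python sketch of that uniqueness check (hypothetical helper, not part of the Feature Store API):

```python
def keys_are_unique(rows, key_cols):
    """Return True if the combination of key_cols is unique across all rows."""
    seen = set()
    for row in rows:
        key = tuple(row[c] for c in key_cols)
        if key in seen:
            return False  # duplicate primary key found
        seen.add(key)
    return True

rows = [{"user_id": 1, "login": "a"}, {"user_id": 2, "login": "b"}]
assert keys_are_unique(rows, ["user_id"])
assert not keys_are_unique(rows + [{"user_id": 1, "login": "c"}], ["user_id"])
```

In Spark, an equivalent check is comparing df.count() against df.select(*key_cols).distinct().count() before calling create_table.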
- 1214 Views
- 2 replies
- 5 kudos
Hi! Can anyone please point me to documentation that can help me set up the integration between Databricks and AWS without the QuickStart default CloudFormation template? I would want to use my own CFT rather than the default due to security ...
Latest Reply
Pat
Honored Contributor III
Hi @Atul S, I think Terraform is the recommended way to go for Databricks deployment. I mean, it's also supported now by Databricks support. I haven't looked much at the CloudFormation setup, because we decided to go with Terraform in the comp...
1 More Replies
by
Nath
• New Contributor II
- 1381 Views
- 3 replies
- 2 kudos
I access the Databricks Feature Store outside Databricks with databricks-connect from my IDE, PyCharm. The problem occurs only outside Databricks, not with a notebook inside Databricks. I use the FeatureLookup mechanism to pull data from Feature Store tables in my cus...
Latest Reply
Also, please refer to the KB below for additional resolution steps: https://learn.microsoft.com/en-us/azure/databricks/kb/dev-tools/dbconnect-protoserializer-stackoverflow
2 More Replies
by
Alex_G
• New Contributor II
- 1197 Views
- 3 replies
- 5 kudos
Hello! I am attempting to move some machine learning code from a Databricks notebook into an MLflow git repository. I am utilizing the Databricks Feature Store to load features that have been processed. Currently I cannot get the databricks library to ...
Latest Reply
Hi @Alex Graff, just a friendly follow-up. Do you still need help, or did @Sean Owen's response help you find the solution? Please let us know.
2 More Replies
- 721 Views
- 1 reply
- 0 kudos
And do I have any control over where and how it's saved?
Latest Reply
The offline store is backed by Delta tables. In AWS we support Amazon Aurora (MySQL-compatible) and Amazon RDS MySQL, and in Azure we support Azure Database for MySQL and Azure SQL Database as online stores: https://docs.microsoft.com/en-us/azure/d...