Hi Databricks Community,
I'm currently trying to create a feature table under the Hive metastore in the source workspace and migrate it later to Unity Catalog. Below is the code I am using:
from pyspark.sql.types import StructType, StructField, StringType, IntegerType
from databricks.feature_store import FeatureStoreClient
def compute_customer_features(data):
    # NOTE: the incoming `data` argument is ignored here; sample rows are
    # hardcoded for testing. Replace this with the real feature computation.
    schema = StructType([
        StructField("customer_id", StringType(), True),
        StructField("feature1", IntegerType(), True),
        StructField("feature2", IntegerType(), True)
    ])
    rows = [("cust1", 10, 100), ("cust2", 20, 200)]
    return spark.createDataFrame(rows, schema)
# df1 is a placeholder here; replace it with your actual source DataFrame.
df1 = None
customer_features_df1 = compute_customer_features(df1)
fs = FeatureStoreClient()
# Create the database if it does not exist
spark.sql("CREATE DATABASE IF NOT EXISTS recommender_system")
customer_feature_table = fs.create_table(
    name='recommender_system.customer_features_01',
    primary_keys=['customer_id'],
    schema=customer_features_df1.schema,
    description='Customer features'
)
fs.write_table(
    name='recommender_system.customer_features_01',
    df=customer_features_df1,
    mode='overwrite'
)
display(customer_features_df1)

But I am getting the error below:
"
Exception: {'error_code': 'NOT_FOUND', 'message': 'Workspace Feature Store has been deprecated in the current workspace. Databricks recommends using Feature Engineering in Unity Catalog. Please reach out to system admin if you have any more questions.'}
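For context, the deprecation message points at the Feature Engineering in Unity Catalog client (`databricks-feature-engineering` package), which replaces `FeatureStoreClient` and expects a three-level Unity Catalog table name. A minimal sketch of the equivalent calls is below; the catalog name `main` is an assumption for illustration, so substitute your own catalog and schema:

```python
# Sketch only: requires the databricks-feature-engineering package and a
# Unity Catalog-enabled Databricks workspace to actually run.
from databricks.feature_engineering import FeatureEngineeringClient

fe = FeatureEngineeringClient()

# Three-level name: <catalog>.<schema>.<table>. "main" is a hypothetical catalog.
table_name = "main.recommender_system.customer_features_01"

fe.create_table(
    name=table_name,
    primary_keys=["customer_id"],
    schema=customer_features_df1.schema,
    description="Customer features",
)

fe.write_table(
    name=table_name,
    df=customer_features_df1,
    mode="merge",
)
```

I understand this is the recommended path going forward, but my question below is specifically about whether the Hive metastore route is still possible.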
So can you please help me understand this issue, and explain how I can create a feature table under the Hive metastore?
Thank you