I have this script:
from databricks import sql
import os
import pandas as pd
databricksToken = os.environ.get('DATABRICKS_TOKEN')
connection = sql.connect(server_hostname="",
                         http_path="",
                         access_token=databricksToken)
def databricks_write(name, racf, action):
    global connection  # reuse the module-level connection
    cursor = connection.cursor()
    # Build a one-row DataFrame with the values to insert plus a timestamp
    df = pd.DataFrame({'operatorName': name, 'racf': racf, 'action': action, 'datetime': [pd.Timestamp.now()]})
    columns = ', '.join(df.columns)
    print(columns)
    values = ', '.join([f"'{value}'" for value in df.values[0]])
    print(values)
    query = f"INSERT INTO okta_log.log ({columns}) VALUES ({values})"
    cursor.execute(query)
    connection.commit()
    cursor.close()
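I call the function roughly like this (the values here are just placeholders for illustration; the real ones come from the application):

# placeholder arguments, not the actual data
databricks_write('Jane Doe', 'ABC123', 'login')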
This does successfully add a row to the table, and I can see subsequent rows getting added each time I run the function. But when I reload the table and run the SELECT * again, I no longer see my rows of data. It seems the rows persist only for a short time and are not permanent.
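For reference, the check I run afterwards is essentially this (a sketch of the SELECT * I mentioned, using a fresh cursor on the same connection):

def databricks_read():
    cursor = connection.cursor()
    cursor.execute("SELECT * FROM okta_log.log")  # same table the insert targets
    rows = cursor.fetchall()
    cursor.close()
    return rows

print(databricks_read())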
How do I fix this?