from pyspark.sql import functions as F

# Read a pipe-delimited CSV with a header row
df = spark.read.option("sep", "|").option("header", "true").csv("/tmp/file.csv")

# For each project, collect its employee numbers into a single list
display(df.groupBy("projectNo").agg(F.collect_list("EmployeeNo").alias("employees")))
Hi Sravan, Apache Ranger is commonly used for fine-grained access control. From your description, it sounds like you might be able to leverage Databricks audit logs, which let you see user-level actions: https://docs.databricks.com/administ...
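If your workspace has system tables enabled, one way to explore those audit logs is to query the system.access.audit table. The sketch below is a minimal example, not a definitive recipe: the column names (event_time, user_identity.email, service_name, action_name) are assumptions based on the system-tables audit schema and may differ in your environment.

from pyspark.sql import functions as F

# Sketch: surface the most recent user-level actions from the audit system table
# (assumes system tables are enabled; column names may vary by workspace)
audit = spark.table("system.access.audit")
display(
    audit.select(
        "event_time",
        F.col("user_identity.email").alias("user"),  # struct field: who performed the action
        "service_name",
        "action_name",
    )
    .orderBy(F.col("event_time").desc())  # newest events first
    .limit(100)
)

Ordering by event_time descending keeps the most recent actions at the top, which is usually what you want when investigating a specific user's activity.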