Hello @December,
You should contact your account team if you need more information on this feature, as it is not yet GA.
Another option would be to use the Iceberg REST interface: https://medium.com/@flyws1993/oss-trino-to-read-tables-from-databr...
Yes, this feature is still in Private Preview. Please contact your account manager and they will be able to enable it for you to test. You just need to share the workspace IDs you want this feature enabled on.
Regards,
Mathieu
Hello @thecodecache,
Have a look at the SQLGlot project: https://github.com/tobymao/sqlglot?tab=readme-ov-file#faq
It can easily transpile SQL to Spark SQL, like this:
import sqlglot
from pyspark.sql import SparkSession
# Initialize Spark session
spar...
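In case the snippet above gets cut off, here is a minimal end-to-end sketch. The input statement and its source dialect (T-SQL here) are just examples; set read= to whatever dialect you are converting from:

import sqlglot
from pyspark.sql import SparkSession

# Initialize Spark session (on Databricks, `spark` already exists)
spark = SparkSession.builder.getOrCreate()

# Transpile a T-SQL statement into Spark SQL
source_sql = "SELECT TOP 10 name FROM employees"
spark_sql = sqlglot.transpile(source_sql, read="tsql", write="spark")[0]
print(spark_sql)  # -> SELECT name FROM employees LIMIT 10

# Execute the transpiled statement
spark.sql(spark_sql).show()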
Hello @halox6000,
You could temporarily redirect console output to an in-memory buffer (effectively discarding it) for these write operations.
Try this out:
@contextlib.contextmanager
def silence_dbutils():
    with contextlib.redirect_stdout(io.StringIO()):
        yield

# Usage in...
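For reference, a complete version with the required imports might look like this. The dbutils.fs.put call is only a placeholder for whatever write operation you want to silence, and it assumes you are in a Databricks notebook where dbutils is available:

import contextlib
import io

@contextlib.contextmanager
def silence_dbutils():
    # Send anything printed to stdout into an in-memory buffer for the duration of the block
    with contextlib.redirect_stdout(io.StringIO()):
        yield

# Any console output produced inside the block is swallowed
with silence_dbutils():
    dbutils.fs.put("/tmp/example.txt", "some content", overwrite=True)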
Hello @Nathant93,
You could use dbutils.fs.ls and iterate over all the directories found to accomplish this task.
Something like this:
def find_empty_dirs(path):
    directories = dbutils.fs.ls(path)
    for directory in directories:
        if directo...
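Since the snippet above is cut off, here is a fuller sketch of the same idea, recursing into sub-directories and collecting the empty ones (the starting path is just an example):

def find_empty_dirs(path):
    # Recursively collect directories under `path` that contain no files or sub-directories
    empty_dirs = []
    for entry in dbutils.fs.ls(path):
        if entry.isDir():
            children = dbutils.fs.ls(entry.path)
            if not children:
                empty_dirs.append(entry.path)
            else:
                empty_dirs.extend(find_empty_dirs(entry.path))
    return empty_dirs

# Example usage (the path is illustrative)
print(find_empty_dirs("dbfs:/mnt/my-container/data/"))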