I have a process on Databricks where I need to upload a CSV file manually every day.
I would like to know if there is a way to import this data (as a pandas DataFrame in Python, for example) without having to upload the file manually through the UI every day.
You can use our DBFS Put REST API endpoint. Note that this uploads the data, but you would probably still need a Databricks job to load the data into a table.
You could also use the REST APIs or a Python client for the cloud storage provider you are using.
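As a sketch of the DBFS Put approach: the endpoint is `POST /api/2.0/dbfs/put` and expects the file contents base64-encoded. The workspace URL, token, and paths below are placeholders you would replace with your own; note that single-call uploads are limited to 1 MB, so larger files need the DBFS streaming endpoints (create/add-block/close) instead.

```python
import base64
import json
import urllib.request

# Placeholders -- substitute your own workspace URL and personal access token.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"
API_TOKEN = "<personal-access-token>"


def build_put_payload(dbfs_path, file_bytes, overwrite=True):
    """Build the JSON body for the DBFS Put API (/api/2.0/dbfs/put).

    The API requires the file contents to be base64-encoded.
    """
    return {
        "path": dbfs_path,
        "contents": base64.b64encode(file_bytes).decode("ascii"),
        "overwrite": overwrite,
    }


def upload_to_dbfs(local_path, dbfs_path):
    """Upload a local file to DBFS via the Put endpoint (files up to 1 MB)."""
    with open(local_path, "rb") as f:
        payload = build_put_payload(dbfs_path, f.read())
    req = urllib.request.Request(
        f"{DATABRICKS_HOST}/api/2.0/dbfs/put",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

You could schedule this script (cron, Azure Data Factory, etc.) on the machine where the CSV lands, so nothing needs to be uploaded through the UI by hand.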
If you're using Azure Databricks, you can also try Auto Loader, which incrementally ingests new files into a Spark DataFrame (which you can then convert to a pandas DataFrame). Be aware that you'll need to set up your workspace with the necessary permissions on the storage location.
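A minimal Auto Loader sketch, assuming new CSVs land in a watched directory; it only runs inside a Databricks notebook or job where `spark` is provided, and the paths and table name here are placeholders:

```python
# Databricks-only sketch: `spark` is injected by the Databricks runtime.
df = (
    spark.readStream
         .format("cloudFiles")                                # Auto Loader source
         .option("cloudFiles.format", "csv")                  # incoming files are CSV
         .option("cloudFiles.schemaLocation", "/mnt/schemas/daily_csv")
         .option("header", "true")
         .load("/mnt/landing/daily_csv/")                     # directory to watch
)

# Write new files incrementally to a Delta table; the checkpoint
# tracks which files have already been processed.
(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/daily_csv")
   .trigger(availableNow=True)                                # ingest new files, then stop
   .toTable("daily_csv_bronze"))
```

Once the data is in a table, `spark.table("daily_csv_bronze").toPandas()` gives you the pandas DataFrame you asked about, with no manual UI upload involved.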