Hello, and yes, you can set up and configure a Databricks Workflows job and its tasks using the Databricks CLI or the Jobs API from Python. Here are some resources and steps to guide you:
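For example, here is a minimal sketch of creating a single-task job through the Jobs API 2.1 from Python. The workspace URL, access token, script path, and cluster id are placeholders you would replace with your own values:

```python
import requests

# Placeholder workspace URL and personal access token (replace with your own).
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# Define a job with one Python-script task (Jobs API 2.1 payload).
job_spec = {
    "name": "example-python-job",
    "tasks": [
        {
            "task_key": "run_script",
            "spark_python_task": {
                "python_file": "/Workspace/Users/me@example.com/etl/main.py",
                "source": "WORKSPACE",
            },
            # Assumed existing cluster id; a new_cluster spec could be used instead.
            "existing_cluster_id": "<cluster-id>",
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job with id:", resp.json()["job_id"])
```

The same JSON payload can be saved to a file and passed to the CLI, for example with `databricks jobs create --json @job.json`.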
Task type options: You can add different task types to your Databricks job. For a Python script task, in the Source drop-down menu select where the script is stored: Workspace for a script in your workspace, DBFS for a script stored on DBFS, or Git provider for a script in a Git repository. In the Path textbox, enter the path to the Python script.
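The same choice exists in the API: the source field of spark_python_task controls where the script is read from. Below is a hedged sketch of a Git-sourced task; the repository URL, branch, and file path are illustrative only:

```python
# Job spec reading the script from a Git repository instead of the workspace.
# The git_source block sits at the job level; all values below are illustrative.
git_job_spec = {
    "name": "example-git-python-job",
    "git_source": {
        "git_url": "https://github.com/my-org/my-repo",  # hypothetical repository
        "git_provider": "gitHub",
        "git_branch": "main",
    },
    "tasks": [
        {
            "task_key": "run_script_from_git",
            "spark_python_task": {
                "python_file": "jobs/main.py",  # path relative to the repo root
                "source": "GIT",
            },
            "existing_cluster_id": "<cluster-id>",  # placeholder cluster id
        }
    ],
}
```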
Run jobs interactively, continuously, or using job triggers: You can run your jobs interactively from the Jobs UI, API, or CLI. You can also create a schedule to run your job periodically or run your job when new files arrive in an external location.
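As a sketch of those options from Python, a run can be started on demand with the run-now endpoint, and a schedule or file-arrival trigger block can be included in the job spec when you create or update the job. The job id, cron expression, and storage URL below are placeholders:

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Start an existing job on demand (Jobs API 2.1 run-now).
requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers=headers,
    json={"job_id": 123},  # placeholder job id
).raise_for_status()

# Blocks that can be added to the job spec for periodic or file-arrival runs:
schedule_block = {
    "quartz_cron_expression": "0 0 6 * * ?",  # every day at 06:00
    "timezone_id": "UTC",
}
file_arrival_trigger = {
    "file_arrival": {
        # Placeholder external location; runs start when new files land here.
        "url": "s3://my-bucket/landing/",
    }
}
```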