Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Trying to use displayHTML from within a Python module raises a Python exception: `NameError: name 'displayHTML' is not defined`, and I've found no way around this. It seems to be something at the UI layer, not a Python function that can be refere...
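One workaround is to have the calling notebook hand its own `displayHTML` to the module instead of letting the module look it up. A minimal sketch, assuming a hypothetical module `lib/html_helpers.py` and a hypothetical function name:

```python
# lib/html_helpers.py -- hypothetical module name.
# displayHTML only exists in the notebook's global namespace, so the module
# takes it as an argument instead of referencing it directly.
def render_title(display_html_fn, title):
    """Render a small HTML heading via the displayHTML callable passed in."""
    display_html_fn(f"<h1>{title}</h1>")


# In the calling notebook, where displayHTML is defined as a global:
#   from lib.html_helpers import render_title
#   render_title(displayHTML, "Daily load summary")
```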
Holy Guacamole Batman! It finally works!!!! Wow, thanks @ptweir, that's awesome! I can go back and update my doc (and code) so it just uses Databricks the same way as Jupyter now, and it'll work by default. It's great they fixed it, shame they never told ...
I want to import a Python function stored at the following file path: `<repo>/lib/lib_helpers.py`. I want to import the function from any file in my repo, for instance from these: `<repo>/notebooks/etl/bronze/dlt_bronze_elt`, `<repo>/workers/job_worker`. It ...
OK, I figured it out. If you just make it a proper Python package by adding an empty `__init__.py`, Databricks will load it on start, and then you can simply import it.
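A minimal sketch of that layout, using the paths from the question above (the helper function name is hypothetical):

```python
# Repo layout:
#
#   <repo>/
#     lib/
#       __init__.py          # empty file -- makes `lib` an importable package
#       lib_helpers.py       # defines clean_column_names() (hypothetical name)
#     notebooks/etl/bronze/dlt_bronze_elt
#     workers/job_worker
#
# From a notebook or file anywhere in the repo (the repo root should be on
# sys.path for code running inside the repo), the import is then simply:
from lib.lib_helpers import clean_column_names
```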
Not sure if I'm missing something here, but running a task outside of a Python function runs much, much quicker than executing the same task inside a function. Is there something I'm missing about how Spark handles functions? 1) def task(x): y = dostuf...
We have a small use case where we need to query the SQL database, from a Python function, with 1 million values (dynamically returned from Python code) in the condition, e.g. `select * from <table> where id in (1, 2, 23, 33, ..., 1M)`. I feel this is a very bad approach. Is ther...
You can also create a temporary view from the output of the Python code (one id = one row) and then inner join the view to the table. IMO this will improve the readability of your code.
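A minimal sketch of that approach, assuming a source table named `events` with an `id` column (both names are hypothetical):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The id list produced by your Python code (1M values in the real case).
wanted = [1, 2, 23, 33]

# One id per row, registered as a temporary view.
ids_df = spark.createDataFrame([(i,) for i in wanted], ["id"])
ids_df.createOrReplaceTempView("wanted_ids")

# Inner join the view to the table instead of building a huge IN (...) list.
result = spark.sql("""
    SELECT e.*
    FROM events AS e
    INNER JOIN wanted_ids AS w
      ON e.id = w.id
""")
result.show()
```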
Hi all! Before we used Databricks Repos, we used the %run magic to run various utility Python functions from one notebook inside other notebooks, e.g. for reading from a JDBC connection. We now plan to switch to Repos to utilize the fantastic CI/CD pos...
That's... odd. I was sure I had tried that, but now it works somehow. I guess it must be because this time I did it with double quotation marks. Thanks anyway! Works like a charm.
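For anyone making the same switch, here is a minimal sketch of what such a JDBC utility can look like as an importable repo module instead of a %run notebook; the module path, table name, and connection details are hypothetical placeholders:

```python
# utils/jdbc_helpers.py -- hypothetical module inside the repo.
from pyspark.sql import DataFrame, SparkSession


def read_jdbc_table(spark: SparkSession, url: str, table: str,
                    user: str, password: str) -> DataFrame:
    """Read a table over JDBC and return it as a DataFrame."""
    return (
        spark.read.format("jdbc")
        .option("url", url)
        .option("dbtable", table)
        .option("user", user)
        .option("password", password)
        .load()
    )


# In a consuming notebook, instead of %run-ing a utility notebook:
#   from utils.jdbc_helpers import read_jdbc_table
#   df = read_jdbc_table(spark, jdbc_url, "public.orders", user, pwd)
```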
Databricks Repos allows you to sync your work in Databricks with a remote Git repository. This makes it easier to implement development best practices. Databricks supports integrations with GitHub, Bitbucket, and GitLab. Using Repos you can bring you...