- 1251 Views
- 0 replies
- 0 kudos
Dario Schiraldi is a Deutsche Bank executive known for his strong leadership in the financial and banking sector. Dario Schiraldi brings 20 years of leadership experience to major worldwide organizations, where his expertise extends into both market acqui...
- 1800 Views
- 0 replies
- 1 kudos
SAS to PY is an AI/ML-based accelerator designed for "SAS to Python or PySpark" code migration. This accelerator is engineered to convert legacy proprietary SAS code to the more flexible, open-source Python or PySpark environment with 95% automatica...
- 798 Views
- 1 replies
- 0 kudos
Hi everyone, I am Dario Schiraldi, CEO of Travel Works, and I am reaching out to the community for some insights. We are in the process of integrating Databricks with AWS for a new project, and I would love to hear from anyone who has experience with t...
Latest Reply
Hello Dario, good to meet you. You can connect with your Databricks account manager. Azure also provides first-party partner assistance for Databricks, so you can check Azure services as well. Thank you.
- 5793 Views
- 4 replies
- 0 kudos
Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Linting is also enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...
Latest Reply
How did you solve the type-error checks on `pyspark.sql`? Doesn't mypy generate the missing stubs for that one?
3 More Replies
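One common workaround for missing-stub errors on `pyspark.*` is to tell mypy to skip import resolution for that package. This is a minimal sketch, assuming the project keeps its mypy settings in `pyproject.toml` (not confirmed by the thread); note that PySpark 3.0+ ships inline type annotations, so upgrading may remove the need for this override entirely.

```toml
# pyproject.toml -- hedged sketch: silence "missing library stubs" errors
# that mypy raises for pyspark when no stub package is installed.
[[tool.mypy.overrides]]
module = "pyspark.*"
ignore_missing_imports = true
```

For Pylance specifically, pointing `python.analysis.typeCheckingMode` at a less strict level for third-party code, or installing a PySpark version that includes a `py.typed` marker, are the usual alternatives.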
- 2883 Views
- 1 replies
- 1 kudos
I have a Databricks pipeline that pulls data from AWS, which takes ~90 minutes. After this, I need to refresh a series of Power BI dataflows (~45 mins) and then datasets (~45 mins). I want to trigger the Power BI refresh automatically from Databricks ...
Latest Reply
Hi @chandataeng, the current Power BI task available in Databricks Workflows will wait for the refresh process to return its final status (whether it succeeded or failed). But you can start the refresh process with an asynchronous REST API call. The ref...
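The asynchronous approach described above can be sketched as a fire-and-forget call to the Power BI REST API's "Refresh Dataflow" endpoint. This is a minimal sketch, not the reply author's exact code: `group_id`, `dataflow_id`, and `token` are placeholder parameters, and acquiring the Azure AD bearer token (e.g. via a service principal) is not shown.

```python
import json
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_url(group_id: str, dataflow_id: str) -> str:
    # "Dataflows - Refresh Dataflow" endpoint of the Power BI REST API
    return f"{POWER_BI_API}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"

def trigger_dataflow_refresh(group_id: str, dataflow_id: str, token: str) -> int:
    """Queue the dataflow refresh and return immediately with the HTTP
    status code (202 Accepted means the refresh was queued)."""
    req = urllib.request.Request(
        build_refresh_url(group_id, dataflow_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Because the call returns as soon as the refresh is queued, the Databricks job is not blocked for the ~45-minute refresh; polling the refresh status afterwards would need a separate GET request.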
- 4538 Views
- 2 replies
- 0 kudos
We started migrating databases from hive_metastore into Unity Catalog back in October 2024, and I've noticed that periodically the Catalog UI will not show columns or a data preview for some of the migrated tables, but not all of them. After some di...
Latest Reply
This is definitely a bug related to older instances of Azure Databricks that were upgraded to use the Unity platform. After going back and forth with MS support for 2+ months, we made the decision to just spin up a new instance of Azure Databricks and co...
1 More Replies
- 8607 Views
- 8 replies
- 0 kudos
I was recently given a credential file to access shared data via delta sharing. I am following the documentation from https://docs.databricks.com/en/data-sharing/read-data-open.html. The documentation wants the contents of the credential file in a fo...
Latest Reply
Hi, the most feasible way would be to convert the contents of your key file to Base64 and set only the Spark config as below: credentials <base 64 encoded code>
7 More Replies
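The Base64 conversion suggested in the reply can be done with Python's standard library. A minimal sketch, assuming the Delta Sharing credential (profile) file is the JSON file referenced in the linked docs; the function name is my own, not from the thread.

```python
import base64
import json
import pathlib

def profile_to_base64(path: str) -> str:
    """Read a Delta Sharing credential (profile) file and return its
    contents Base64-encoded, ready to paste into the Spark config value."""
    raw = pathlib.Path(path).read_bytes()
    json.loads(raw)  # sanity check: the profile file must be valid JSON
    return base64.b64encode(raw).decode("ascii")
```

The returned string is what would go in place of `<base 64 encoded code>` in the Spark config described above.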