Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
This video has a good overview of Databricks features, both existing and incoming: https://youtu.be/N9f_6Aoxeqg?si=LBkEJKvCY1IFCYZm Good for both newcomers and experienced users of the platform, as recent changes have added features. If anyone has a Keyboard s...
Thanks @szymon_dybczak - got those shortcuts now and am trying to memorize some of them. Maybe I'll try to get these on a sheet of paper so I can keep it by my desk.
Hi there, I have a simple question. Not sure if Databricks supports this, but I'm wondering if there's a way to store the results of a SQL cell in a Spark DataFrame? Or vice versa, is there a way to take a SQL query in Python (saved as a string variab...
Hi @ChristianRRL, the results of a SQL cell are automatically made available as a Python DataFrame via the _sqldf variable. You can read more about it here. For the second part, I'm not sure why you would need it when you can simply run the query like: spark.s...
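A minimal sketch of both directions, assuming a Databricks notebook environment where `_sqldf` and `spark` are defined (the catalog/table names are placeholders):

```python
# Hypothetical example for a Databricks notebook; `_sqldf` and `spark`
# only exist there, so the lines that use them are commented out.

# Direction 1: after running a %sql cell, its result is exposed as a
# PySpark DataFrame named _sqldf:
# df = _sqldf

# Direction 2: run a SQL query stored in a Python string variable.
query = "SELECT id, name FROM my_catalog.my_schema.customers WHERE id >= 1"
# df = spark.sql(query)    # returns a PySpark DataFrame
# pdf = df.toPandas()      # optionally convert to pandas for local work
```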
Hey everyone, hope you're all doing great! A friend of mine recently went through a data engineering interview at Walmart and shared his experience with me. I thought it would be really useful to pass along what he encountered. The interview had some ...
Hi team, can you share best practices for designing Auto Loader data processing? We have data from 30 countries arriving in various files. Currently, we are thinking of using a root folder per country, with subfolders for the individual ...
Hi @Phani1, the folder structure you're planning makes sense to me. Since you've mentioned that there will be thousands of files, the best practice is to use Auto Loader with file notification mode. Also, you can read about Databricks r...
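A rough sketch of what that could look like, assuming a Databricks notebook with `spark` available; the bucket names, schema location, and format are placeholders for illustration:

```python
# Hypothetical Auto Loader configuration matching the layout above
# (country root folder with per-source subfolders). File notification
# mode scales better than directory listing when thousands of files arrive.
autoloader_options = {
    "cloudFiles.format": "json",                # assumed source format
    "cloudFiles.useNotifications": "true",      # file notification mode
    "cloudFiles.schemaLocation": "s3://my-bucket/_schemas/source_a",
}

# In a Databricks notebook:
# df = (spark.readStream.format("cloudFiles")
#       .options(**autoloader_options)
#       .load("s3://my-landing-bucket/country=US/source_a/"))
# df.writeStream.option("checkpointLocation", "...").toTable("...")
```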
I am using the Unity Catalog Cluster. I have a requirement to read the files placed by the source team in a specific location (landing) in S3. I am already using a metastore pointing to a different bucket. Do I need to use an external location pointi...
Did anyone find a solution to this? I am also facing challenges reading files from S3 using boto3 with a Unity-enabled cluster; I created the S3 external location and granted sufficient access. Any help on this? Same path and data acce...
Hello, I'm trying to connect to our Databricks instance using the VS Code extension. However, when following this guide we cannot get the configuration to proceed past the point where it asks for our instance URL. When I am creating profile [DEFAULT]host...
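For reference, the extension reads connection profiles from `~/.databrickscfg`. A minimal profile might look like the following (the host and token values are placeholders; the host must include the `https://` scheme):

```ini
[DEFAULT]
host  = https://adb-1234567890123456.7.azuredatabricks.net
token = dapiXXXXXXXXXXXXXXXX
```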
Hi there, I have a few alerts for a query where, if the column is set to >= 1, it will send an email notification to me and 2 others. The 2 individuals are already added as destinations. The query works because the results are populating that column wi...
I'm facing the same issue. I don't see anything missing, but still no emails even when the alert's status is triggered. @DangEdmond did you find a solution?
Hi, I was using Databricks and it was working fine. I'm using Community Edition. Suddenly, it logged me out. I clicked on "forgot password" and changed the password, but when I tried to log in again, it keeps redirecting me to the login page without any erro...
Hi @Cert-Team, my Databricks certification test has been suspended. I didn't even turn my head or look away from the screen, but it got suspended. Can you assist here? Request no. #00520224 ref:!00D610JGc4.!500Vp0A9m4Y:ref
I'm running a scheduled workflow with a dbt task on Azure Databricks. We want to export the dbt output from the dbt task to a storage container for our Slim CI setup and data observability. The issue is that the Databricks API (/api/2.1/jobs/runs/ge...
I am running dbt as a Databricks job. It saves all documentation (manifest.json, run_results.json, etc.) under "Download Artifacts" on the job run. I am not able to find a way to read those in code, transform them, and save them on Databricks. I tried the Jobs API. The arti...
Hello everyone, we are currently facing performance issues when using Databricks as a transactional system with our .NET application via the Simba ODBC driver. Specifically, queries take 30 seconds to 1 minute to insert data into 8 to 10 tables,...
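Databricks SQL warehouses are optimized for analytical, set-based workloads rather than per-row OLTP inserts, so one common mitigation is batching many rows into a single multi-row INSERT before sending it over ODBC. A hypothetical sketch of the batching step (table and column names are placeholders; production code should use parameter binding rather than string interpolation to avoid SQL injection):

```python
# Hypothetical sketch: collapse per-row INSERTs into one multi-row
# statement to cut round trips over the ODBC connection.
def build_batch_insert(table, columns, rows):
    """Build a single multi-row INSERT statement from a list of row tuples."""
    cols = ", ".join(columns)
    values = ", ".join(
        "(" + ", ".join(repr(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({cols}) VALUES {values}"

sql = build_batch_insert("events", ["id", "name"], [(1, "a"), (2, "b")])
# One statement now carries both rows instead of two separate INSERTs.
```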