Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Here's your Data + AI Summit 2024 Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization's productivity with analytics, dashboards, and insights.
Keynote: Data Warehouse presente...
Hi, I think I have a similar issue to the one in this post, but the answer isn't detailed enough for me. I have a list defined in my first task, which contains the items I want to iterate through: [1,2,3,4]. When I use it as Inputs to the For Each frami...
Thank you both very much, I've nailed it. I have accepted Walter_C's answer as the solution because Step 2 was what I was missing. Thanks to MariuszK as well for your contribution.
For the Databricks SQL connector for Python, the list of fields returned by Cursor.columns() is listed here (e.g. TABLE_CAT, TABLE_SCHEM, TABLE_NAME, COLUMN_NAME). Could someone please share an exhaustive list of fields (including a short description ...
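Whatever the full field list turns out to be, each metadata row can be inspected generically by pairing it with the names in `cursor.description`. Below is a minimal sketch assuming any PEP 249-style cursor; `FakeCursor` is a stand-in so the example runs without a live warehouse, and its field names mirror the ones mentioned above.

```python
# Sketch: map rows from a DB-API metadata call (like Cursor.columns() in
# the Databricks SQL connector) to dicts keyed by the field names that
# cursor.description reports. FakeCursor is a placeholder for a real cursor.

def rows_as_dicts(cursor):
    """Pair each fetched row with the column names from cursor.description."""
    names = [d[0] for d in cursor.description]  # d[0] is the field name (PEP 249)
    return [dict(zip(names, row)) for row in cursor.fetchall()]

class FakeCursor:
    # Field names mirror a few that Cursor.columns() returns.
    description = [("TABLE_CAT",), ("TABLE_SCHEM",), ("TABLE_NAME",), ("COLUMN_NAME",)]

    def fetchall(self):
        # Hypothetical sample row for illustration only.
        return [("main", "default", "orders", "order_id")]

for rec in rows_as_dicts(FakeCursor()):
    print(rec)
```

With a real connection you would call `cursor.columns()` first and then pass the cursor to `rows_as_dicts`, which also prints any fields beyond the four shown here.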
Hi everyone, I hope you're all doing well. I'm experiencing some challenges with Databricks SQL, and I wanted to reach out to see if others have encountered similar issues or have suggestions for troubleshooting. Below is a summary of the problems I'm ...
Hi @Walter_C, Thank you for your input and support regarding the challenges I've been experiencing with Databricks SQL. I followed up with support, and they confirmed that these are known issues currently under review. Here's a summary of the response:...
Using workflows, is there a way to obtain the task name from within a task? For example: I have a workflow with a notebook task. From within that notebook task I would like to retrieve the task name so I can use it for a variety of purposes. Currently, we're re...
Hi @EWhitley, Would {{task.name}} help in getting the current task name? https://docs.databricks.com/en/workflows/jobs/parameter-value-references.html (Pass context about job runs into job t...
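One way to consume that reference (a sketch assuming the Jobs API task shape; the path and parameter name are placeholders) is to pass it as a notebook base parameter, so the reference is resolved at run time:

```json
{
  "task_key": "my_notebook_task",
  "notebook_task": {
    "notebook_path": "/Workspace/path/to/notebook",
    "base_parameters": {
      "task_name": "{{task.name}}"
    }
  }
}
```

Inside the notebook, `dbutils.widgets.get("task_name")` would then return the current task's name.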
We are experiencing the following issues. Description: I encountered an issue while executing a Spark SQL query in Databricks, and it seems to be related to the query optimization phase. The error message suggests an internal bug within Spark or the Sp...
Update: response from the Databricks team. Symptoms: internal error during Spark SQL phase optimization. Cause: the Databricks PG engineering team confirmed that this is indeed a bug in CASE WHEN optimization, and they are working on a fix for this issue. Resolut...
I have a Python script workflow with 2 tasks: Task A and Task B. When Task A has data, it is shared with Task B via createOrReplaceGlobalTempView with no issues. The goal is: when A has no data, skip Task B and also set the workflow status to "Skip...
To achieve the goal of setting the workflow status to "Skipped" when Task A has no data, you can use the "Run if" conditional task type in Databricks Jobs. This allows you to specify conditions for later tasks based on the outcome of other tasks. ht...
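One shape this can take (a sketch assuming the Jobs API `condition_task` and task values; task keys and paths are placeholders): Task A records whether it found data as a task value, a condition task checks it, and Task B depends on the condition's "true" outcome, so it shows as skipped otherwise.

```json
{
  "tasks": [
    {
      "task_key": "task_a",
      "notebook_task": { "notebook_path": "/Workspace/task_a" }
    },
    {
      "task_key": "check_has_data",
      "depends_on": [ { "task_key": "task_a" } ],
      "condition_task": {
        "op": "EQUAL_TO",
        "left": "{{tasks.task_a.values.has_data}}",
        "right": "true"
      }
    },
    {
      "task_key": "task_b",
      "depends_on": [ { "task_key": "check_has_data", "outcome": "true" } ],
      "notebook_task": { "notebook_path": "/Workspace/task_b" }
    }
  ]
}
```

Task A would set the flag with `dbutils.jobs.taskValues.set(key="has_data", value="true")` when it produces data.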
I have some legacy software that only runs on Windows, but that can be driven via Python. Is it possible to set up compute resources that run Databricks Container Service on a Windows base image, so that I can then add this legacy software and work w...
Unfortunately, this is not possible; the requirements state that you must use an Ubuntu-based image: https://docs.databricks.com/en/compute/custom-containers.html#option-2-build-your-own-docker-base
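For reference, a custom container therefore looks something like the minimal Dockerfile sketch below (assuming the `databricksruntime/standard` base image from the linked docs; the tag and the extra package are placeholders for your own Linux-compatible dependencies):

```dockerfile
# Databricks Container Services requires an Ubuntu-based image;
# databricksruntime/standard is one of the documented bases.
# Pin the tag to match your Databricks Runtime version (placeholder below).
FROM databricksruntime/standard:latest

# Placeholder: layer your own Linux-compatible dependencies on top.
RUN /databricks/python3/bin/pip install --no-cache-dir requests
```

Windows-only software would have to run elsewhere (e.g. a separate Windows host called from Databricks), since only Linux containers are supported.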
Hello everyone, Recently my team integrated an AWS Redshift database as a foreign catalog in Azure Databricks. We can successfully run SELECT queries and create regular views on top of the foreign catalog tables. However, when attempting to create a ma...
Hi @Dnirmania,
Materialized views in SQL often use serverless Delta Live Tables pipelines, which might be causing the connection timeout due to IP whitelisting restrictions. Serverless compute might not be able to connect to the federated source if t...
Hello all, I'm quite new to the Databricks world, and currently in the process of analyzing a migration from on-premise Oracle (with a lot of SQL, PL/SQL, custom things, etc.) to Databricks. Let me try to illustrate my situation in Oracle (summary): Let's ...
I have to migrate data from Azure Synapse Analytics to Databricks. Could anyone share the different approaches to migrating the data, and of those, which is the best approach to use?
Hello all, I want to install Oracle Instant Client to be able to use python-oracledb in Thick mode, because one of our databases is old and cannot be reached in Thin mode. I have tried the solution from this post, but it doesn't help me. It seems that ...
Hi all, I'd like to do some benchmarking, and I need to turn off caching on my SQL warehouse. However, whatever I try, I still see quite a high level of caching after running my queries (>60%). I tried to turn off my warehouse, but it automatically wake...
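For benchmarking, one documented knob is the `use_cached_result` setting, which disables reuse of query results for the session (a sketch; the table name is a placeholder, and you should verify the setting against your DBSQL version):

```sql
-- Disable reuse of cached query results for this session (Databricks SQL).
SET use_cached_result = false;

-- Then run the benchmark query; identical re-runs should no longer be
-- served from the query result cache.
SELECT count(*) FROM my_catalog.my_schema.my_table;
```

Note that the warehouse's local disk/IO cache is separate from the result cache and still warms up across queries until the warehouse restarts, which can explain high cache percentages even with result reuse disabled.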
We set up the SQL warehouse IAM role in the settings; this is applied to all warehouses. How do I create SQL warehouses with multiple IAM roles to maintain access control?
Unfortunately, there is no way to restrict the access the compute itself has; restrictions are enforced via user permissions. The only option here is to submit a feature request through https://docs.databricks.com/en/resources/ideas.html#ide...
Hello Databricks Community, I have a hard time understanding how Databricks SQL is different from Microsoft SQL Server. Also, why does Databricks provide Spark SQL? If you can direct me to a well-written webpage or document, it would be of immense help! Thanks,
Databricks SQL and Spark SQL are built for distributed big data analytics. Databricks SQL is great for business intelligence tools and uses Delta Lake for efficient data storage. Spark SQL works with Spark's programming features for data processing. U...
Hello there, Wasn't sure if this was just an error on my part, but I'm using a Databricks Pro SQL warehouse and Unity Catalog to pull some data from my tables. I'm having an issue where, whenever I try to use a wildcard operator with my LIKE claus...