- 1124 Views
- 4 replies
- 1 kudos
Currently, in a Unity Catalog-enabled workspace, users with "Workspace access" can create workflows/jobs; there is no access control available to restrict users from creating them. Use case: In production there is no need for users, data enginee...
Latest Reply
We have the "allow unrestricted cluster creation" box deselected for all groups, yet users can still create jobs in production, so we are looking for a way to disable this. I cannot believe this isn't an option. Did anyone find a solution for this?
- 1375 Views
- 2 replies
- 0 kudos
I'm delving into the challenges of ETL transformations, particularly moving from traditional platforms like Informatica to Databricks. Given the complexity of legacy ETLs, I'm curious about the approaches others have taken to integrate these with Dat...
Latest Reply
@Kaniz Do you know if Informatica Cloud Modernization can convert mappings into Delta Live Tables? Do we have to use Informatica Cloud for this, or can we use it as a one-time migration and maintain the artifacts in Databricks? Alternatively, we are l...
- 429 Views
- 2 replies
- 0 kudos
For AWS Databricks, I have configured 1 worker and 1 driver node with the same node type. In the AWS console, all details are the same for the two instances; only the instance IDs differ. How do I identify which instance ID is for the worker and which one is for the d...
Latest Reply
Hi @Kaniz, thanks for your response. There was no instance pool ID configured for the cluster, so how will I be able to differentiate them? Could you suggest an alternative way to find the driver instance ID and worker instance ID in the AWS console?
- 821 Views
- 2 replies
- 1 kudos
Hi, I have started using Databricks recently, and I'm not able to find the right solution in the documentation. I have linked multiple repos in my Databricks workspace in the Repos folder, and I wanted to update the repos with the remote Azure DevOps reposit...
Latest Reply
Hi @Ha2001, good day!
The Databricks API has a limit of 10 requests per second for the combined /repos/* endpoints in a workspace. You can check the documentation below for the API limit:
https://docs.databricks.com/en/resources/limits.html#:~:text=Git%20fold...
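Since the /repos/* endpoints share one combined rate budget, scripted repo updates can hit HTTP 429 when looping over many repos. A minimal retry sketch; the `RateLimited` exception and `with_backoff` helper are illustrative helpers, not part of any Databricks SDK:

```python
import time

class RateLimited(Exception):
    """Raised by the caller when the API responds with HTTP 429."""

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Invoke `call`, retrying with exponential backoff while it raises RateLimited."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimited:
            if attempt == max_retries - 1:
                raise  # budget exhausted; surface the rate-limit error
            time.sleep(base_delay * (2 ** attempt))

# Usage sketch: wrap each update call (e.g. PATCH /api/2.0/repos/{repo_id}),
# raising RateLimited whenever the HTTP response status is 429.
```

This keeps the retry policy in one place, so a loop over many linked repos degrades gracefully instead of failing on the first 429.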
- 1718 Views
- 4 replies
- 1 kudos
Hi! I am experiencing something that I cannot find in the documentation: in Databricks, using Databricks Runtime 13.x, when I start a streaming query (using the .start method), it creates a new query, and while it is running I can execute other code in...
Latest Reply
You can use the help portal: https://help.databricks.com/s/
- 335 Views
- 1 reply
- 1 kudos
I'm trying to understand how Databricks computes model drift when a baseline table is available. What I understood from the documentation is that Databricks processes both the primary and the baseline tables according to the specified granularities in th...
Latest Reply
Hi @MohsenJ, Let’s delve into how Databricks handles model drift calculation when the baseline table lacks a timestamp column.
Baseline Table without Timestamp:
When your baseline table doesn’t have a timestamp column, Databricks employs best-eff...
by AniP • New Contributor II
- 191 Views
- 1 reply
- 0 kudos
Hi, I created my first workspace successfully using the AWS Quickstart method. Now I want to create one more workspace, but the Quickstart method is not asking for an IAM role and bucket name, and the CloudFormation stack is failing.
Latest Reply
Hi @AniP, It seems like you’re encountering some issues while creating another workspace using the AWS Quick Start method.
Let’s troubleshoot this step by step:
IAM Role:
By default, an Amazon CloudFormation stack runs with the permissions of th...
- 274 Views
- 1 reply
- 0 kudos
I've lost count of how many different domains and accounts Databricks requires me to use for their services. Every domain requires its own account username, password, etc., and nothing is synced. I can't even keep track of which email addre...
Latest Reply
Plus:
- customer-academy.databricks.com
- accounts.cloud.databricks.com
- databricks.my.site.com
- 217 Views
- 1 reply
- 2 kudos
I have a workflow in Databricks with an AutoML pipeline in it. I want to deploy that pipeline in production, but I want to use the shared cluster in production. Since AutoML is not compatible with shared clusters, what can be the workaround? (Is it ...
Latest Reply
Hi @HiraNisar, deploying an AutoML pipeline in production while utilizing a shared cluster can be a bit tricky, but there are some workarounds you can consider:
Dedicated Cluster for AutoML:
Create a dedicated single-user cluster specifically for...
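To make the dedicated-cluster workaround concrete, a job can pin just the AutoML task to its own single-user ML cluster while other tasks keep the shared cluster. A hedged sketch of the relevant fragment of a Jobs API task definition; the task key, node type, and runtime version below are placeholders, not recommendations:

```python
# Illustrative Jobs API task fragment: the AutoML task runs on a dedicated
# single-user cluster with an ML runtime. All values are placeholders.
automl_task = {
    "task_key": "automl_train",
    "new_cluster": {
        "spark_version": "13.3.x-cpu-ml-scala2.12",  # ML runtime, which AutoML requires
        "node_type_id": "i3.xlarge",
        "num_workers": 1,
        "data_security_mode": "SINGLE_USER",         # AutoML does not run on shared clusters
    },
}
print(automl_task["new_cluster"]["data_security_mode"])
```

The rest of the workflow's tasks can omit `new_cluster` and reference the shared cluster instead, so only the AutoML step pays for the dedicated compute.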
- 249 Views
- 1 reply
- 0 kudos
Does anyone have a voucher code they do not intend to use and are willing to share? I recently lost my job, so it's a hard time. I really want to utilize my learning and take the exam. Thank you.
Latest Reply
Hi @databricks0601, thank you for reaching out to the community. We understand that times are tough, and we want to support you in any way we can. To expedite your request for a voucher code, please list your concerns on our ticketing porta...
- 157 Views
- 1 reply
- 0 kudos
Hi Community, I am trying to set up the LZO codec in my DBFS using https://docs.databricks.com/en/_extras/notebooks/source/init-lzo-compressed-files.html, but I am facing the error: Cloning into 'hadoop-lzo'... The JAVA_HOME environment variable is not defined c...
Latest Reply
Hi @yatharth, It appears that you’re encountering an issue related to the LZO codec while working with Databricks and Hadoop.
Let’s address this step by step:
JAVA_HOME Environment Variable:
The error message indicates that the JAVA_HOME environ...
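Before kicking off the hadoop-lzo build, it can help to confirm from the notebook whether JAVA_HOME is set, and derive a candidate value if it isn't. A small sketch; the `resolve_java_home` helper is illustrative, and the path derivation assumes a standard JDK layout (`<jdk>/bin/java`):

```python
import os
import shutil

def resolve_java_home():
    """Return JAVA_HOME if set; otherwise derive it from the java binary on PATH."""
    configured = os.environ.get("JAVA_HOME")
    if configured:
        return configured
    java = shutil.which("java")
    if java is None:
        return None  # no JDK found: install one or export JAVA_HOME in an init script
    real_path = os.path.realpath(java)                  # e.g. /usr/lib/jvm/<jdk>/bin/java
    return os.path.dirname(os.path.dirname(real_path))  # strip the trailing /bin/java

print("JAVA_HOME:", resolve_java_home() or "not found")
```

If this prints a valid JDK directory, exporting it as JAVA_HOME in the init script (before the git clone and build steps) should resolve the error.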
- 343 Views
- 1 reply
- 0 kudos
Hi. I'm struggling to add a description by script to the comment column within the catalog view in Databricks, particularly for foreign/external tables sourced from Azure SQL. I have no issue doing that for Delta tables. Also for information schema columns f...
Latest Reply
Hi @Leon_K, Adding comments to columns within the catalog view in Databricks can be quite useful for documentation and understanding your data. While it’s straightforward for Delta tables, handling foreign/external tables sourced from Azure SQL requi...
- 360 Views
- 1 reply
- 0 kudos
Hello, I have a table in my catalog, and I want to have it as a pandas or Spark DataFrame. I was using this code to do that before, but I don't know what has happened recently that the code is not working anymore. from pyspark.sql import SparkSession
spark = S...
Latest Reply
Hi @Pbr, to work around this, you can create a temporary view using SQL in a separate cell (e.g., a %sql cell) and then reference that view from your Python or Scala code.
Here’s how you can achieve this:
First, create a temporary view for your tab...
- 940 Views
- 2 replies
- 0 kudos
I am attempting to create a Databrick Repo in a workspace via Terraform. I would like the Repo and the associated Git Credential to be associated with a Service Principal. In my initial run, the Terraform provider is associated with the user defined ...
Latest Reply
Hi @Kinger, the Databricks Terraform provider does not support creating a git credential associated with a Service Principal (SP) and associating the SP with the Repo creation. However, you can create a Service Principal and associate it with a git ...
- 808 Views
- 6 replies
- 0 kudos
I'm running a scheduled workflow with a dbt task on Azure Databricks. We want to export the dbt-output from the dbt task to a storage container for our Slim CI setup and data observability. The issue is that the Databricks API (/api/2.1/jobs/runs/ge...
Latest Reply
We're having the same issue: I get the output from a task but not the dbt_output. We're running 13.3 LTS and dbt 1.7.11.