Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best prac...
I have a job with some tasks. Some of the tasks are pipeline_task's and some are notebook_task's. When I run the job with "Run now with different parameters" and enter a new key-value, I see that the key-value is available in the notebook_task's with dbut...
As per the docs, it seems the pipeline task type currently does not support passing parameters: https://docs.databricks.com/en/jobs/create-run-jobs.html#pass-parameters-to-a-databricks-job-task. You could create a notebook task that runs before your pipeli...
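A minimal sketch of that workaround, assuming a hypothetical job parameter `my_param` and a hypothetical config table the downstream pipeline reads (both names are illustrative, not from the thread):

```python
# In a notebook task, a job parameter surfaces as a widget.
my_param = dbutils.widgets.get("my_param")

# One possible hand-off: persist the value somewhere the pipeline can read it,
# e.g. a small configuration table (catalog/schema names are placeholders).
spark.sql(
    f"CREATE OR REPLACE TABLE my_catalog.config.job_run_params AS "
    f"SELECT '{my_param}' AS my_param"
)
```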
Hi - I would like to hide a function definition from users. What is the best way to do this in Unity Catalog?
The list of available ACLs for alerts suggests there's a way to set "can run" and "can manage" permissions, but neither the REST API nor the Databricks CLI shows options for setting permissions. Is there a way to set ACLs on alerts? If so, how? Alert ...
The API call needed to set the ACLs for an alert and the other object types (alerts | dashboards | data_sources | queries) is: https://docs.databricks.com/api/workspace/dbsqlpermissions/set
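A hedged sketch of calling that DBSQL permissions endpoint; host, token, alert ID, user e-mail, and group name are placeholders, and the exact path and payload should be verified against the linked API reference:

```python
import requests

host = "https://<workspace-url>"          # placeholder
token = "<personal-access-token>"         # placeholder
alert_id = "<alert-id>"                   # placeholder

resp = requests.post(
    f"{host}/api/2.0/preview/sql/permissions/alerts/{alert_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "access_control_list": [
            {"user_name": "analyst@example.com", "permission_level": "CAN_RUN"},
            {"group_name": "bi-admins", "permission_level": "CAN_MANAGE"},
        ]
    },
)
resp.raise_for_status()
print(resp.json())
```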
Hi all, we are parameterizing environment-specific catalog names (like `mycatalog_dev` vs. `mycatalog_prd`) in Lakeview dashboard queries like this: SELECT * FROM IDENTIFIER(:catalog_name || '.myschema.mytable'), which works fine in most cases. We have o...
I've had quite a bit of fun with UC and view permissions. I don't think this is specific to using the IDENTIFIER() function, but I suspect it's related to UC permissions. What you'll need to ensure: the user or group who owns the view on catalog_b h...
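For reference, a small sketch of the same parameterized pattern run directly with spark.sql's named-parameter support (available on recent runtimes); the catalog value and table names are placeholders:

```python
# Parameterized IDENTIFIER() lookup, mirroring the dashboard query above.
df = spark.sql(
    "SELECT * FROM IDENTIFIER(:catalog_name || '.myschema.mytable')",
    args={"catalog_name": "mycatalog_dev"},
)
display(df)
```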
Hi all, I want to access on-prem Oracle Database data from Python notebooks. However, the install of the jar (ojdbc8.jar) results in an error, which occurs while the cluster is starting up. The error message: "Library installation attempted on the dr...
The error message suggests that the jar file located at abfss:/jars/ojdbc8.jar has an invalid authority. This could be due to a number of reasons, such as an incorrect file path, insufficient permissions, or network restrictions. Here are a few steps you...
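Once the driver jar installs cleanly, reading an on-prem Oracle table via the generic Spark JDBC source looks roughly like the sketch below; host, service name, schema/table, and secret scope are placeholders:

```python
# Hedged sketch: Oracle read over JDBC, assuming ojdbc8.jar is on the cluster.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host.example.com:1521/ORCLPDB1")
    .option("dbtable", "MYSCHEMA.MYTABLE")
    .option("user", "my_user")
    .option("password", dbutils.secrets.get("my-scope", "oracle-password"))
    .option("driver", "oracle.jdbc.driver.OracleDriver")
    .load()
)
display(df)
```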
Hi, I was able to deploy an endpoint using legacy serving (it's the only option we have to deploy endpoints in DB). Now I am having trouble querying the endpoint itself. When I try to query it, I get the following error: Here is the code I am using ...
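Since the poster's code is truncated, here is only a hedged sketch of how a legacy model serving endpoint is typically queried over REST; the URL pattern, model name/version, token, and record fields are all placeholders to adapt:

```python
import requests

host = "https://<workspace-url>"               # placeholder
token = "<personal-access-token>"              # placeholder
url = f"{host}/model/<model-name>/<version>/invocations"

# Input shape depends on the model's signature; dataframe_records is one common form.
payload = {"dataframe_records": [{"feature_a": 1.0, "feature_b": "x"}]}

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```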
Hello, I receive a very weird error when attempting to connect my workflow tasks to a remote Git repo on Azure DevOps. As per the documentation: "For a Git repository, the path relative to the repository root." So I directly use the name of the notebook file...
Ah, I have had that same error before when cloning from Git. I'm guessing you got the repo URL by hitting the "Clone" button in ADO and copy/pasting it into Databricks. One thing I've done in the past: in the Repository URL, remove the "org@" pa...
Hello, we have a need to uninstall older versions of whl files from a personal cluster via the Databricks CLI - could you please provide the exact command to be used here? We tried many commands found in the documentation, but none of them worked to do the act...
Hi @Visakh_Vijayan, did you try to use databricks libraries uninstall? It's crafted exactly for this purpose: databricks libraries uninstall --json YOUR_JSON_WITH_REQUEST_BODY. Also, when you uninstall a library from a cluster, the library is removed onl...
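A hedged sketch of the request body that command (or an equivalent call to the Libraries API) expects; the cluster ID and wheel path are placeholders, and the library is only fully removed after a cluster restart:

```python
import requests

host = "https://<workspace-url>"          # placeholder
token = "<personal-access-token>"         # placeholder

# Same JSON shape can be saved to a file and passed to `databricks libraries uninstall --json`.
body = {
    "cluster_id": "<cluster-id>",
    "libraries": [
        {"whl": "dbfs:/FileStore/jars/old_package-1.0.0-py3-none-any.whl"}
    ],
}

resp = requests.post(
    f"{host}/api/2.0/libraries/uninstall",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()
```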
Hello, I am trying to write a simple upsert statement following the steps in the tutorials. Here is what my code looks like: from pyspark.sql import functions as F / def upsert_source_one(self): df_source = spark.readStream.format("delta").table(self.so...
I figured out the error is hiding an underlying issue with the code, which you can get to if you deploy the bundle (if you are using asset bundles) and run from a notebook in a browser. So the issue is more about the debugger not being able to stop on...
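For context, a hedged sketch of the usual streaming upsert pattern (foreachBatch + MERGE); table names, join key, and checkpoint location are placeholders, not the poster's actual code:

```python
from delta.tables import DeltaTable

def upsert_to_target(micro_batch_df, batch_id):
    # Merge each micro-batch into the target Delta table on a key column.
    target = DeltaTable.forName(spark, "my_catalog.myschema.target_table")
    (
        target.alias("t")
        .merge(micro_batch_df.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

(
    spark.readStream.format("delta")
    .table("my_catalog.myschema.source_table")
    .writeStream
    .foreachBatch(upsert_to_target)
    .option("checkpointLocation", "/tmp/checkpoints/upsert_demo")
    .trigger(availableNow=True)
    .start()
)
```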
Hi, I'm testing the latest version of the Databricks Runtime but I'm getting errors doing a simple dropDuplicates. Using the following code: data = spark.read.table("some_table") data.dropDuplicates(subset=['SOME_COLUMN']).count() I'm getting this error....
Wanted to add to this thread. Seeing the same issue. This appears to be a recent problem.
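One hedged workaround sketch: on some recent runtimes the keyword form has been reported to fail, while passing the column list positionally still works; the table and column names below are the placeholders from the post above:

```python
# Positional argument instead of subset=... keyword.
data = spark.read.table("some_table")
deduped_count = data.dropDuplicates(["SOME_COLUMN"]).count()
print(deduped_count)
```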
Hi, I'm trying to set up a CI/CD pipeline for Delta Live Tables jobs using Databricks Asset Bundles. I have a problem with the path to the notebook in the pipeline. According to this example: https://docs.databricks.com/en/delta-live-tables/tutorial-bundles.html the YAML file sho...
I had this error once. You need to specify the extension of your file. If you set the notebook to be Python, then it must end with .py; likewise .sql if you used SQL: libraries: - notebook: path: ${workspace.file_path}/datab...
Hi all, just wanted to raise a question regarding Databricks workbooks and viewing the results in the cells. For the example provided in the screenshot, I want to view the results of an Excel formula that has been applied to a cell in our workbooks. Fo...
@Kurtis_R, do you want to display the value of 45, or the formula of how 45 is achieved?
As primary/foreign key constraints are now supported/available in Databricks, how are foreign key constraints handled in a DLT pipeline? I.e., if a foreign key constraint is violated, is the record logged as a data quality issue and still added to the ...
Hi @Mario_D, I don't think primary/foreign key constraints are supported by DLT. At least I can't find anything in the documentation. But you can obtain the same result using DLT expectations: Manage data quality with Delta Live Tables - Azure Databricks | Microsoft L...
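A minimal sketch of emulating a foreign-key-style check with a DLT expectation; the table and column names are illustrative, not from the original thread:

```python
import dlt

@dlt.table(name="orders_clean")
@dlt.expect_or_drop("has_customer_fk", "customer_id IS NOT NULL")
def orders_clean():
    # Rows violating the expectation are dropped and recorded in the event log
    # as data quality metrics; expect() alone would keep them while still logging.
    return spark.read.table("my_catalog.raw.orders")
```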
Hi all, just wanted to ask if there has been an announcement regarding changes to the Databricks REST API? I've had weird experiences using the REST API, specifically the Query History API. Last Aug 29, 2024, I created a script to pull a GET request in t...
Yeah, totally agree with you. It should be documented/mentioned somewhere. Or maybe the API should be versioned, so if they introduce a new version, it won't break existing workflows. Anyway, thanks for sharing!
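For reference, a hedged sketch of pulling query history via the SQL Query History endpoint; host and token are placeholders, and since response shapes have shifted before, the fields and filter parameters should be checked against the current API reference:

```python
import requests

host = "https://<workspace-url>"          # placeholder
token = "<personal-access-token>"         # placeholder

resp = requests.get(
    f"{host}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {token}"},
    params={"max_results": 100},
)
resp.raise_for_status()

# Iterate over returned queries (field names assumed from the public docs).
for q in resp.json().get("res", []):
    print(q.get("query_id"), q.get("status"))
```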
Hello, I am trying to deal with something I thought would be straightforward, but am hitting some walls. Basically the original user associated with my Databricks account has left my organisation and, when trying to remove / disable this user, I am m...
Just to follow up: I can administer and disable OTHER account admins and do the needful in every other case, but this one user account is particularly privileged.