Hello Team, I've encountered an issue while attempting to read a CSV data file into a pandas DataFrame by uploading it into DBFS in the Community Edition of Databricks. Below is the error I encountered along with the code snippet I used: import pandas...
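A common cause of this error is that pandas cannot open `dbfs:/` URIs directly; on Databricks, files uploaded to DBFS are usually reached through the local FUSE mount under `/dbfs/...`. A minimal, self-contained sketch of the pattern (the DBFS path in the comment is an assumption about where the file was uploaded; here a temporary local CSV stands in for it):

```python
import os
import tempfile
import pandas as pd

# On Databricks, a file uploaded to dbfs:/FileStore/data.csv is typically
# readable by pandas via the FUSE mount path "/dbfs/FileStore/data.csv".
# Passing the dbfs:/ URI to pandas directly is a common cause of
# FileNotFoundError. Stand-in below: write a small CSV locally and read it
# the same way pandas would read the mounted file.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "data.csv")
    with open(path, "w") as f:
        f.write("id,name\n1,alice\n2,bob\n")
    # On Databricks this would be: pd.read_csv("/dbfs/FileStore/data.csv")
    df = pd.read_csv(path)

print(df.shape)
```

Note that DBFS FUSE access can be limited on Community Edition, so if `/dbfs/...` is not visible there, reading via Spark (`spark.read.csv("dbfs:/...")`) and converting with `.toPandas()` is an alternative.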
I'm looking to find a way to export notebook dashboards as HTML files. We will be scheduling the notebook via Workflows, so I'm not sure if we'd be looking at exporting something from the workflow via the API or if there's a better way to do this. I'm also c...
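One option worth exploring is the Workspace Export REST endpoint, which can return a notebook (with its rendered output) as HTML. A sketch of constructing that call; the host, token, and notebook path are placeholders, and the request itself is not executed here:

```python
from urllib.parse import urlencode

# Sketch of the Workspace Export API call (GET /api/2.0/workspace/export).
# host and notebook_path are assumptions; substitute your own values.
host = "https://<workspace-host>"
notebook_path = "/Users/me/my_notebook"  # hypothetical notebook path
params = urlencode({
    "path": notebook_path,
    "format": "HTML",          # export rendered HTML rather than source
    "direct_download": "true",  # return the file body directly
})
url = f"{host}/api/2.0/workspace/export?{params}"
# With requests (not executed here):
#   requests.get(url, headers={"Authorization": f"Bearer {token}"})
print(url)
```

For a scheduled Workflows run specifically, the Jobs API also has a runs-export endpoint (`/api/2.0/jobs/runs/export`) that can export a run's notebook views, which may map more naturally onto a per-run dashboard export; check the current API docs for which fits your case.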
Not able to see linkage/lineage details for Delta Live Tables and MVs. Getting pipeline error: use live.table_name instead of catalog.schema.table_name
Hi, I am migrating from dbx to Databricks Asset Bundles (DAB) a deployment setup where I have specific parameters per environment. This was working well with dbx, and I am now trying to define those parameters by defining targets (3 targets: dev, uat, p...
Something must have changed in the meantime on the Databricks side. I have only updated the Databricks CLI to 016, and now, using a git/branch under each target to deploy this setup, where feature-dab is the branch I want the job to pull sources from, I see t...
Hi Team, my Databricks exam got suspended on 28th March 2024 and it is still in the suspended state. I have raised a support request using the link below: https://help.databricks.com/s/contact-us?ReqType=training and my request ID is #00454905. I would real...
Hi Team, I have created a DevOps pipeline for Databricks deployment to different environments, which succeeded, but recently I implemented the PEPs on Databricks and the DevOps pipeline is now failing with the error below. Error: JSONDecodeError: Exp...
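A `JSONDecodeError: Expecting value...` at this point usually means the API response body was empty or not JSON at all, for example an HTML error page returned by a proxy or private endpoint in place of the Databricks API. A small defensive-parsing sketch (the helper name is hypothetical) that surfaces the raw body, which makes this kind of pipeline failure much easier to diagnose:

```python
import json

def parse_api_response(body: str):
    """Parse an API response body, surfacing non-JSON payloads clearly.

    Hypothetical helper for debugging: if the body is not valid JSON
    (e.g. an HTML error page from a proxy/private endpoint), raise a
    ValueError that shows the start of the offending payload.
    """
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        raise ValueError(
            f"API did not return JSON; body started with: {body[:80]!r}"
        )

# A well-formed response parses normally:
print(parse_api_response('{"state": "SUCCESS"}'))
```

Logging the raw body this way often reveals whether the private-endpoint change is routing API calls to a login page or gateway error instead of the Databricks REST API.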
Problem Statement: We are currently utilizing customer-managed keys for Databricks compute encryption at the workspace level. As part of our key rotation strategy, we find ourselves needing to bring down the entire compute/clusters to update storage ...
Maybe you can use Azure Key Vault to store customer-managed keys: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#--create-an-azure-key-vault-backed-secret-scope
I have access to the Databricks cloud UI. I use notebooks for my experiments (not MLflow). I install libraries from notebooks using this command: !pip install LIBRARY_NAME. I am unable to install tkinter and use it in a notebook. Is there a wa...
I want to get whether Photon was used for a job or not. The API lets me get this for maybe 40% of jobs through the runtime_engine field, but for the majority of jobs it is unspecified. How do I get whether Photon was used in those cases? The docs mention ...
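When `runtime_engine` is absent from the Jobs API response, one heuristic (an assumption on my part, not an official field) is to inspect the cluster's `spark_version` string, since Photon runtimes embed "photon" in it (e.g. `13.3.x-photon-scala2.12`). A sketch over an illustrative payload:

```python
def used_photon(cluster_spec: dict) -> bool:
    """Best-effort check for Photon from a Jobs/Clusters API cluster spec.

    Prefer the explicit runtime_engine field when present; otherwise fall
    back to looking for "photon" in the spark_version string (a heuristic,
    not a documented contract).
    """
    engine = cluster_spec.get("runtime_engine")
    if engine is not None:
        return engine == "PHOTON"
    return "photon" in cluster_spec.get("spark_version", "").lower()

# Illustrative cluster specs (hypothetical payload fragments):
print(used_photon({"spark_version": "13.3.x-photon-scala2.12"}))  # photon runtime
print(used_photon({"runtime_engine": "STANDARD",
                   "spark_version": "13.3.x-scala2.12"}))          # explicit field wins
```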
Hi, I am new to Databricks and need some input. I am trying to create a Delta external table in Databricks using an existing path which contains CSV files. What I observed is that the code below will create an EXTERNAL table, but the provider is CSV. ------------------------...
@tajinder123 - can you please modify the syntax as below to create it as a Delta table:
CREATE TABLE employee123
USING DELTA
LOCATION '/path/to/existing/delta/files';
I'm using Databricks Community Edition for learning purposes, and whenever I'm running a notebook I'm getting: Exception when creating execution context: java.util.concurrent.TimeoutException: Timed out after 15 seconds databricks. I have deleted the cluster ...
Hello, Databricks Team, my students are reporting that none of them are able to use DBCE, and all are running into this same error when they spin up an instance with defaults (DBR 12.2 LTS). Some have reported seeing this error since last night (3/26 ET). Coul...
When I attach a notebook to my cluster and run a cell, the notebook is detached. The cell execution states: Waiting for compute to be ready. Then the following message is shown: Notebook detached. Exception execution context: java.net.SocketTimeoutException: Conn...
Whenever I try validating a pipeline that already runs in production without any issue, it throws the following error: BAD_REQUEST: Failed to load notebook '/Repos/(...).sql'. Only SQL and Python notebooks are supported currently.
Hi @vieiradsousa,
Verify that the path to your notebook is correctly specified. The error message mentions '/Repos/(…).sql', which seems to be a placeholder. Make sure the actual path points to the correct location of your notebook file.
We have a cluster running on 13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12). We want to test with a different type of cluster (14.3 LTS, which includes Apache Spark 3.5.0, Scala 2.12), and all of a sudden we get errors complaining about casting a Big...
I have logged the issue with Microsoft last week and they confirmed it is a Databricks bug. A fix is supposedly being rolled out at the moment across Databricks regions. As anticipated, we have engaged the Databricks core team to further investigate ...