Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

EDDatabricks
by Contributor
  • 844 Views
  • 1 reply
  • 0 kudos

Lakeview dashboard dynamically change filter values

Greetings everyone, We are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of "Select * from test table where timestamp between :Days start and :Days end". There is also a filter applie...

Community Discussions
dashboard
filters
Lakeview
parameters
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @EDDatabricks, Thank you for sharing your requirements and observations regarding the Lakeview dashboards. Let’s address each of your points: Dynamic Filters Based on Current Timestamp: To achieve dynamic filtering based on the current timesta...

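The truncated reply above concerns dynamic filtering on the current timestamp. A minimal pure-Python sketch of computing such a window (the "last N days" interpretation and the function name are assumptions, matching the poster's :Days start / :Days end parameters):

```python
from datetime import datetime, timedelta, timezone

def last_n_days_window(n_days, now=None):
    """Return (start, end) timestamps covering the last n_days up to now."""
    end = now or datetime.now(timezone.utc)
    start = end - timedelta(days=n_days)
    return start, end

# Example: a 7-day window ending at a fixed instant.
start, end = last_n_days_window(7, now=datetime(2024, 4, 11, tzinfo=timezone.utc))
print(start.isoformat(), end.isoformat())
# -> 2024-04-04T00:00:00+00:00 2024-04-11T00:00:00+00:00
```

Values computed this way could then be passed as the dashboard's parameter values; how the dashboard wires them in depends on the Lakeview parameter setup.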
Miguel_Grafana
by New Contributor
  • 323 Views
  • 1 reply
  • 0 kudos

Azure Oauth Passthrough with the Go Driver

Can anyone point me towards some resources for achieving this? I already have the token. Trying with: dbsql.WithAccessToken(settings.Token) But I'm getting the following error: Unable to load OAuth Config: request error after 1 attempt(s): unexpected HT...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Miguel_Grafana, It seems you’re encountering an error related to OAuth configuration when using the dbsql.WithAccessToken method with your token. Let’s troubleshoot this issue. Here are a couple of things you can check: Grant Type Parameter: ...

Kasen
by New Contributor II
  • 835 Views
  • 1 reply
  • 0 kudos

Resolved! Unable to grant catalog access to service principal

Hi everyone, I created a service principal called TestServicePrincipal. I tried to grant catalog access to the service principal, but the error mentioned that it could not find a principal with name TestServicePrincipal. If I grant the access to s...

Community Discussions
grant access
service principals
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Kasen, It seems you’re encountering an issue where your newly created service principal isn’t immediately discoverable. Let’s explore some possible reasons and solutions: Delayed Propagation: Sometimes, there can be a delay in the propagation...

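A common cause of "could not find principal" errors is granting by display name: Unity Catalog grants reference a service principal by its application ID (a GUID), not a name like TestServicePrincipal. A hedged sketch; the GUID and catalog name below are placeholders:

```python
# Hedged sketch: Unity Catalog identifies a service principal by its
# application ID (a GUID from the account/workspace admin console),
# not its display name. GUID and catalog name are placeholders.
grant_sql = """
GRANT USE CATALOG ON CATALOG main
TO `12345678-abcd-ef01-2345-6789abcdef01`
"""

# In a Databricks notebook or the SQL editor you would run:
# spark.sql(grant_sql)
print(grant_sql.strip())
```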
otara_geni
by New Contributor
  • 381 Views
  • 1 reply
  • 0 kudos

How to Resolve ConnectTimeoutError When Registering Models with MLflow

Hello everyone, I'm trying to register a model with MLflow in Databricks, but encountering an error with the following command: model_version = mlflow.register_model(f"runs:/{run_id}/random_forest_model", model_name) The error message is as follows:...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @otara_geni, The ConnectTimeoutError you’re encountering when registering a model with MLflow in Databricks is related to a timeout issue while connecting to the specified endpoint URL. The error message indicates that there’s a timeout when co...

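For client-side connection timeouts, MLflow exposes HTTP knobs through environment variables that can be raised before retrying the registration. A hedged sketch; the timeout/retry values are illustrative, and the commented call is the poster's own:

```python
import os

# Hedged sketch: raise MLflow's client-side HTTP timeout (seconds) and retry
# budget before registering the model. Values are illustrative only.
os.environ["MLFLOW_HTTP_REQUEST_TIMEOUT"] = "300"
os.environ["MLFLOW_HTTP_REQUEST_MAX_RETRIES"] = "8"

# Then the poster's call, unchanged (requires an MLflow/Databricks session):
# model_version = mlflow.register_model(f"runs:/{run_id}/random_forest_model", model_name)
print(os.environ["MLFLOW_HTTP_REQUEST_TIMEOUT"])
```

If the timeout persists with generous values, the cause is more likely network-level (proxy, private link, firewall) than the client settings.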
benitoski
by New Contributor II
  • 1164 Views
  • 1 reply
  • 1 kudos

Resolved! Workspace FileNotFoundException

I have a model created with CatBoost and exported in ONNX format in my workspace, and I want to download that model to my local machine. I tried to use the Export option in the three-dot menu to the right of the model, but the model is larger than 10 MB ...

Latest Reply
feiyun0112
Contributor III
  • 1 kudos

You need to put the file in FileStore: https://docs.databricks.com/en/dbfs/filestore.html#save-a-file-to-filestore

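The FileStore pattern the reply links to: copy the file under dbfs:/FileStore/, then download it from the workspace's /files/ URL in a browser. A hedged sketch; the source path, destination path, and workspace URL are placeholders, and dbutils exists only inside a Databricks notebook:

```python
# Hedged sketch: anything under dbfs:/FileStore/ is browser-downloadable at
# https://<workspace-url>/files/<path-under-FileStore>. Paths are placeholders.
src = "file:/Workspace/Users/someone@example.com/model.onnx"
dst = "dbfs:/FileStore/exports/model.onnx"

# Inside a Databricks notebook (dbutils is provided by the runtime):
# dbutils.fs.cp(src, dst)

download_url = "https://<workspace-url>/files/" + dst.removeprefix("dbfs:/FileStore/")
print(download_url)
# -> https://<workspace-url>/files/exports/model.onnx
```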
Benedetta
by New Contributor III
  • 987 Views
  • 2 replies
  • 1 kudos

What happened to the ephemeral notebook links????? and the job ids????

Hey Databricks, why did you remove the ephemeral notebook links and job IDs from the parallel runs? This has created a huge gap for us. We can no longer view the ephemeral notebooks, and the job IDs are also missing from the output. Waccha doing?...

Latest Reply
Benedetta
New Contributor III
  • 1 kudos

Hi Kaniz, It's funny you mention these things - we are doing some of those - the problem now is that the JobId is obscured from the output, meaning we can't tell which ephemeral notebook goes with which JobId. It looks like the ephemeral notebook ...

FelipeRegis
by New Contributor
  • 1049 Views
  • 1 reply
  • 0 kudos

Not able to access data registered in Unity Catalog using Simba ODBC driver

Hi folks, I'm working on a project with Databricks using Unity Catalog and a connection to SSIS (SQL Server Integration Services). My team is trying to access data registered in Unity Catalog using Simba ODBC driver version 2.8.0.1002. They mentioned ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @FelipeRegis, It seems you’re encountering issues with accessing data registered in Unity Catalog using the Simba ODBC driver. Let’s explore some possible solutions: Delta Lake Native Connector: Consider using Delta Lake’s native Delta JDBC/OD...

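For the Simba driver against Unity Catalog, the DSN typically needs the HTTP path of a UC-enabled SQL warehouse plus Catalog/Schema keys. A hedged odbc.ini sketch; host, HTTP path, token, catalog, and schema are placeholders, and the key names should be verified against the Simba 2.8 installation guide:

```ini
[Databricks-UC]
Driver=Simba Spark ODBC Driver
Host=adb-1234567890123456.7.azuredatabricks.net
Port=443
SSL=1
ThriftTransport=2
HTTPPath=/sql/1.0/warehouses/abcdef1234567890
AuthMech=3
UID=token
PWD=<personal-access-token>
Catalog=main
Schema=default
```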
Benedetta
by New Contributor III
  • 1504 Views
  • 1 reply
  • 0 kudos

What happened to the JobIds in the parallel runs (again)????

Hey Databricks, why did you take away the job IDs from the parallel runs? We use those to identify which output goes with which run. Please put them back. Benedetta

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Benedetta,  Thank you for reaching out. I understand your concern regarding the jobids in parallel runs. I will look into this matter and get back to you with more information as soon as possible.

dm7
by New Contributor II
  • 744 Views
  • 1 reply
  • 0 kudos

DLT CDC/SCD - Taking the latest ID per day

Hi, I'm creating a DLT pipeline which uses DLT CDC to implement SCD Type 1 to take the latest record using a datetime column, which works with no issues:

@dlt.view
def users():
    return spark.readStream.table("source_table")

dlt.create_streaming_table(...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @dm7, Thank you for providing the details of your DLT pipeline and the desired outcome! It looks like you’re trying to implement a Slowly Changing Dimension (SCD) Type 2 behaviour where you want to capture historical changes over time. Let’s br...

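The rule the poster is after ("latest ID per day") is: within each (key, calendar day), keep only the row with the greatest timestamp. In DLT terms that roughly maps to apply_changes keyed on the id plus the day and sequenced by the timestamp; the column names here are hypothetical. A pure-Python sketch of the dedup rule itself:

```python
# Pure-Python sketch of "latest record per key per day": within each
# (id, date) pair, keep the row with the greatest timestamp.
# Column names are hypothetical.
from datetime import datetime

rows = [
    {"id": 1, "ts": datetime(2024, 5, 1, 9),  "value": "a"},
    {"id": 1, "ts": datetime(2024, 5, 1, 17), "value": "b"},  # later same day -> wins
    {"id": 1, "ts": datetime(2024, 5, 2, 8),  "value": "c"},
]

latest = {}
for row in rows:
    key = (row["id"], row["ts"].date())
    if key not in latest or row["ts"] > latest[key]["ts"]:
        latest[key] = row

result = sorted(latest.values(), key=lambda r: r["ts"])
print([r["value"] for r in result])  # -> ['b', 'c']
```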
BenCCC
by New Contributor
  • 1076 Views
  • 1 reply
  • 0 kudos

Installing R packages in a custom Docker container for compute

Hi, I'm trying to create a custom Docker image with some R packages pre-installed. However, when I try to use it in a notebook, it can't seem to find the installed packages. The build runs fine.

FROM databricksruntime/rbase:14.3-LTS
## update system li...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @BenCCC, Here are a few things you can check: Package Installation in Dockerfile: In your Dockerfile, you’re using the RUN R -e 'install.packages(...)' command to install R packages. While this approach works, there are alternative methods th...

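One frequent cause of "package not found" in custom images: install.packages puts the packages in a library path the notebook's R session doesn't scan. A hedged Dockerfile sketch; the package names, system libraries, and lib path are assumptions to compare against the output of .libPaths() in a notebook running on the base image:

```dockerfile
FROM databricksruntime/rbase:14.3-LTS

# System libraries commonly needed to compile R packages from source.
RUN apt-get update && apt-get install -y \
    libcurl4-openssl-dev libssl-dev libxml2-dev

# Install into an explicit site library; verify this path appears in
# .libPaths() inside a notebook on this image (assumption).
RUN R -e "install.packages(c('dplyr', 'ggplot2'), \
           repos='https://cran.r-project.org', \
           lib='/usr/lib/R/site-library')"
```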
groch_adam
by New Contributor
  • 340 Views
  • 1 reply
  • 0 kudos

Usage of SparkMetric_CL, SparkListenerEvent_CL and SparkLoggingEvent_CL

I am wondering if I can retrieve any information from Azure Log Analytics custom tables (already set up) for Azure Databricks. I would like to retrieve information about query and data performance for a SQL Warehouse cluster. I am not sure if I can get it fro...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @groch_adam, Retrieving information from Azure Log Analytics custom tables for Azure Databricks is possible.   Let me guide you through the process. Azure Databricks Monitoring Library: To send application logs and metrics from Azure Databric...

liormayn
by New Contributor III
  • 554 Views
  • 1 reply
  • 3 kudos

Error while encoding: java.lang.RuntimeException: org.apache.spark.sql.catalyst.util.GenericArrayDa

Hello :) We are trying to run an existing flow, which currently works on EMR, on Databricks. We use LTS 10.4, and when loading the data we get the following error: at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @liormayn, It seems you’re encountering an issue related to the schema of your data when running your existing workflow on Databricks. Let’s explore some potential solutions: Parquet Decimal Columns Issue: The error message you’re seeing might...

afdadfadsfadsf
by New Contributor
  • 523 Views
  • 1 reply
  • 0 kudos

Create Databricks model serving endpoint in Azure DevOps yaml

Hello, I need to create and destroy a model endpoint as part of CI/CD. I tried mlflow deployments create-endpoint, giving databricks as --target; however, it errors saying that --endpoint is not a known argument, when clearly --endpoint is required....

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @afdadfadsfadsf, Creating and managing model endpoints as part of your CI/CD pipeline is essential for deploying machine learning models. I can provide some guidance on how to set up a CI/CD pipeline using YAML in Azure DevOps. You can adapt th...

scottbisaillon
by New Contributor
  • 652 Views
  • 1 reply
  • 0 kudos

Databricks Running Jobs and Terraform

What happens to a currently running job when a workspace is deployed again using Terraform? Are the jobs paused/resumed, or are they left unaffected without any downtime? Searching for this specific scenario doesn't seem to come up with anything and...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @scottbisaillon, When deploying a workspace again using Terraform, the behaviour regarding currently running jobs depends on the specific Terraform version and the platform you are using.   Let’s explore the details: Terraform Cloud (form...

TinasheChinyati
by New Contributor
  • 1587 Views
  • 1 reply
  • 0 kudos

Stream to stream join NullPointerException

I have a DLT pipeline running in continuous mode. I have a stream-to-stream join which runs for the first 5 hours but then fails with a NullPointerException. I need assistance to know what I need to do to handle this. My code is structured as below:

@dl...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @TinasheChinyati, It looks like you’re encountering a Null Pointer Exception in your DLT pipeline when performing a stream-to-stream join. Let’s break down the issue and explore potential solutions: The error message indicates that the query te...

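Long-running stream-stream joins need watermarks on both sides and a time-bounded join condition so Spark can prune old state; unbounded state is a common way such joins degrade and then fail hours into a continuous run. A PySpark-style sketch only, not runnable outside a Spark/DLT session, with hypothetical table and column names:

```python
# Sketch only (hypothetical names; requires an active Spark session).
from pyspark.sql.functions import expr

left = (spark.readStream.table("stream_a")
        .withWatermark("event_ts", "2 hours").alias("a"))
right = (spark.readStream.table("stream_b")
         .withWatermark("event_ts", "2 hours").alias("b"))

# Bound the join in event time so Spark can discard old state.
joined = left.join(
    right,
    expr("""
        a.key = b.key AND
        b.event_ts BETWEEN a.event_ts - INTERVAL 1 HOUR
                       AND a.event_ts + INTERVAL 1 HOUR
    """),
)
```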