- 2399 Views
- 3 replies
- 3 kudos
Resolved! Metric Tables not created automatically
I was trying the new lakehouse monitoring feature for one of my unity tables, and when I create a monitoring dashboard for my table, the 2 metric tables({output_schema}.{table_name}_profile_metrics and {output_schema}.{table_name}_drift_metrics) are ...
Any success/workaround on this topic? I have the same issue
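For reference, the monitor's output tables follow a fixed naming scheme, so you can check for them directly. A minimal sketch (the schema and table names here are hypothetical placeholders):

```python
# Hypothetical output schema and monitored table name
output_schema = "main.monitoring"
table_name = "my_table"

# Lakehouse monitoring writes two metric tables with these suffixes
profile_table = f"{output_schema}.{table_name}_profile_metrics"
drift_table = f"{output_schema}.{table_name}_drift_metrics"

# In a notebook you could then probe for them, e.g.:
# spark.sql(f"SELECT * FROM {profile_table} LIMIT 1")
print(profile_table)
print(drift_table)
```

If the tables are missing, also check that the monitor's refresh has actually completed rather than failed silently.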
- 1449 Views
- 2 replies
- 1 kudos
Data Engineer Associate Exam Got Suspended. Raised Ticket. No Response
Hi Team, This is the second time I am trying to post a discussion in the community; the first time it was marked as SPAM. It has been a terrible experience so far, with my Databricks Data Engineer Associate exam getting suspended by the proctor for reasons...
@RohitB91 the support team is awaiting your response for a date / time you are available to resume your exam. Your registration confirmation email should contain this detail: Preparation for taking your Online proctored Exam Understand the computer r...
- 2560 Views
- 1 replies
- 0 kudos
Convert pkl file
I am looking to convert a pkl file to Parquet format. I am reading this data from Azure Data Lake and want to convert it and write the Parquet outputs to my Hive metastore.
Hi @Innov, To convert a Pickle (pkl) file to Parquet format and store it in your Azure data lake, you can follow these steps: Azure Data Factory: If you’re using Azure Data Factory, it supports the Parquet format for various connectors, including...
- 679 Views
- 1 replies
- 0 kudos
search engine functionality in Databricks
Hi Team,Good morning, I am Sai Rekhawar, currently residing in Hamburg, Germany. I am reaching out regarding an issue with the search engine functionality in Databricks.I have created several catalogs and tables within the catalog, and I am the owner...
Hi @SaiRekhawar, try using the filters under the table bar in the Ctrl+P search, as below.
- 2505 Views
- 3 replies
- 1 kudos
User Agent Verification
Hello Folks, We have recently been working on Databricks integration into our products, and one of the suggested best practices is to send user-agent information with any REST API or JDBC connection we make from the product to Databricks. We have made all the ...
Hi @bdraitcak , Here are the steps to be followed to get the logs: Log in to your Azure portal.Go to / Search for "Log analytics workspace."Create a new Log Analytics workspace by specifying your Resource group and Instance details. [subscription + r...
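For the REST API side, setting the user-agent is just an HTTP header on each request. A minimal stdlib sketch (the workspace URL, token placeholder, and product string "MyProduct/1.2" are hypothetical):

```python
import urllib.request

# Hypothetical workspace URL and product identifier
url = "https://adb-1234567890123456.7.azuredatabricks.net/api/2.0/clusters/list"
req = urllib.request.Request(url, headers={
    "Authorization": "Bearer <personal-access-token>",
    "User-Agent": "MyProduct/1.2 (databricks-integration)",
})

# The actual network call is omitted here:
# resp = urllib.request.urlopen(req)
print(req.get_header("User-agent"))
```

You can then confirm the string arrives by filtering the diagnostic logs in your Log Analytics workspace for that user-agent value.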
- 921 Views
- 1 replies
- 0 kudos
Unable to login to Azure Databricks
I am trying to log in to Azure Databricks as usual, but it waits for 5 to 10 seconds and then redirects me to https://westeurope-c2.azuredatabricks.net/aad/redirect The system is up and all of my colleagues can log in to the system. I have tried in incognito m...
Hi @egndz , If it was a one-time issue we suspect it was an issue within Azure or Databricks services. Please follow the status page to keep track of the maintenance/outages. https://status.azuredatabricks.net/
- 867 Views
- 1 replies
- 0 kudos
Failed Synchronization of Files Using Databricks Extension in VS Code
Hi, I am trying to set up the Databricks extension in VS Code. I followed the steps as per the guide below: https://docs.databricks.com/en/dev-tools/vscode-ext/tutorial.html When I move to step 6 (see the above guide) I follow the steps and I create succes...
Hi @Anto23 , Greetings from Databricks. Based on the above information, it seems Files in Workspace is currently disabled for your Databricks environment. This feature allows storing and accessing non-notebook files alongside your notebooks and cod...
- 1639 Views
- 1 replies
- 0 kudos
Decimal Precision error
When I try to create a DataFrame like this: lstOfRange = list() lstOfRange = [ ['CREDIT_LIMIT_RANGE',Decimal(10000000.010000),Decimal(100000000000000000000000.000000),'>10,000,000','G'] ] RangeSchema = StructType([StructField("rangeType",St...
Hi @sai_sathya, The issue you’re encountering with the value in the rangeTo column of your DataFrame is related to the precision of floating-point numbers. Let’s break down what’s happening: Floating-Point Precision: Computers represent floating...
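The root cause is visible in plain Python: passing a float literal to `Decimal` bakes in the binary floating-point error, while passing a string keeps the exact decimal value. A small demonstration:

```python
from decimal import Decimal

# Constructing from a float carries over the binary floating-point error:
from_float = Decimal(10000000.010000)

# Constructing from a string preserves the exact decimal value:
from_string = Decimal("10000000.010000")

print(from_float)   # a long, inexact binary expansion
print(from_string)  # 10000000.010000
```

So in the DataFrame literal above, use `Decimal('10000000.010000')` (a string) rather than `Decimal(10000000.010000)` to keep the intended precision.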
- 1444 Views
- 2 replies
- 0 kudos
Oracle table load from Databricks
I am trying to load a dataframe from Databricks to target Oracle table using the write method and using JDBC api. I have the right drivers. The job and its corresponding stages are getting completed and the data is getting loaded in Oracle target tab...
Thanks for the response. Can you please elaborate on the Apache Spark JDBC Connector. I am using ojdbc8 driver as per the Databricks documentation. I am not using Delta Lake. I have the data in a dataframe and using write method to insert the data to...
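For context, the Spark JDBC write to Oracle is configured through a handful of options on the DataFrame writer. A hedged sketch (hostname, table, and credentials are hypothetical; the driver class name is the standard one for ojdbc8):

```python
# Hypothetical connection details for an Oracle target
jdbc_options = {
    "url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
    "dbtable": "TARGET_SCHEMA.TARGET_TABLE",
    "user": "app_user",
    "password": "changeme",
    "driver": "oracle.jdbc.driver.OracleDriver",
    "batchsize": "10000",  # larger batches usually speed up bulk inserts
}

# In a notebook, with `df` already defined:
# df.write.format("jdbc").options(**jdbc_options).mode("append").save()
print(jdbc_options["driver"])
```

If the job completes but the load is slow, `batchsize` and the number of write partitions (`df.repartition(n)`) are the usual tuning knobs.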
- 2401 Views
- 2 replies
- 0 kudos
error installing igraph library
Hi, I am trying to install the "igraph" and "networkD3" CRAN packages for use within a notebook, but am receiving the below error.Could someone please assist?Thanks! * installing *source* package ‘igraph’ ... ** package ‘igraph’ successfully unpacked...
Based on this igraph github issue https://github.com/igraph/rigraph/issues/490#issuecomment-966890059, I followed the instructions to install glpk. After installing glpk, I was able to install igraph.
- 1652 Views
- 2 replies
- 1 kudos
Resolved! Error installing the igraph and networkD3 R libraries
Hi,I am trying to install the igraph and networkD3 CRAN packages for use within a notebook. However, I am receiving the attached installation error when attempting to do so.Could someone please assist?Thank you!
- 763 Views
- 1 replies
- 0 kudos
What causes the QB Won't Open issue and how do I fix it?
What Can Be Causing QB Won't Open Issue and How Can I Fix It? I need help immediately to fix this annoying issue! Has anybody else had such problems with QB refusing to open? My personal attempts at troubleshooting have yielded no results. I would be...
@markwilliam8506 If your QB won't open even after multiple tries, you might be facing some common error messages. This scenario can be a result of damaged program files or a faulty installation process, among other possible reasons. The error message...
- 3727 Views
- 2 replies
- 0 kudos
Serving GPU Endpoint, can't find CUDA
Hi everyone! I'm encountering an issue while trying to serve my model on a GPU endpoint. My model uses DeepSpeed. I got the following error: "An error occurred while loading the model. CUDA_HOME does not exist, unable to compile CUDA op(...
Hi @kfab, It seems you’re encountering an issue related to CUDA while serving your model on a GPU endpoint. Let’s troubleshoot this step by step. CUDA_HOME Not Found: The error message you received, “CUDA_HOME does not exist, unable to compile C...
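A quick diagnostic you can run in the serving environment to see what DeepSpeed would find (this only inspects the environment; it does not fix anything by itself):

```python
import os
import shutil

# DeepSpeed compiles its CUDA ops with nvcc, so on a GPU image both the
# CUDA_HOME directory and the nvcc binary should be discoverable.
cuda_home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
nvcc = shutil.which("nvcc")

print("CUDA_HOME:", cuda_home)
print("nvcc on PATH:", nvcc)
```

If both come back empty, the serving image likely has only the CUDA runtime and not the full toolkit, in which case precompiling the DeepSpeed ops before logging the model is one workaround to consider.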
- 1438 Views
- 1 replies
- 0 kudos
Deploy mlflow model to Sagemaker
Hi,I am trying to deploy mlflow model in Sagemaker. My mlflow model is registered in Databrick.Followed below url to deploy and it need ECR for deployment. For ECR, either I can create custom image and push to ECR or its mentioned in below url to get...
Hi @sanjay, Deploying an MLflow model to Amazon SageMaker is a great way to scale your machine learning inference containers. MLflow simplifies the deployment process by providing easy-to-use commands without requiring you to write complex container...
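As a rough outline, once the MLflow pyfunc container image is in ECR, the deployment itself can be driven from the MLflow deployments client. A hedged sketch (the account ID, region, instance settings, endpoint name, and model URI below are all hypothetical placeholders):

```python
# Hypothetical deployment settings; assumes the MLflow pyfunc image was
# already pushed to ECR (e.g. via `mlflow sagemaker build-and-push-container`)
deployment_config = {
    "image_url": "<account-id>.dkr.ecr.us-east-1.amazonaws.com/mlflow-pyfunc:latest",
    "region_name": "us-east-1",
    "instance_type": "ml.m5.xlarge",
    "instance_count": 1,
}

# The actual deployment call, commented out since it needs AWS credentials:
# from mlflow.deployments import get_deploy_client
# client = get_deploy_client("sagemaker:/us-east-1")
# client.create_deployment(name="my-endpoint",
#                          model_uri="models:/my_model/1",
#                          config=deployment_config)
print(deployment_config["region_name"])
```

The exact config keys accepted by the SageMaker deployment target can vary by MLflow version, so check the version you have installed on Databricks against the one in your SageMaker environment.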
- 1150 Views
- 1 replies
- 0 kudos
SQL query on information_schema.tables via service principal
Hi,I have a simple python notebook with below code ----query = "select table_catalog, table_schema, table_name from system.information_schema.tables where table_type!='VIEW' and table_catalog='TEST' and table_schema='TEST'"test = spark.sql(query)disp...
Hi @Sanky, It seems you’re encountering an issue where your Spark job, running as a service principal, doesn’t return any results when querying the same code that works in your workspace. Let’s troubleshoot this: Service Principal Permissions: Y...
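Since `information_schema` views only show objects the caller can access, an empty result usually means the service principal lacks grants on the catalog/schema. A sketch of the grants to run as a privileged user (the principal name here is hypothetical; catalog/schema names are taken from the query above):

```python
# Hypothetical service principal; run these statements as a privileged user
principal = "`my-service-principal`"
grants = [
    f"GRANT USE CATALOG ON CATALOG TEST TO {principal}",
    f"GRANT USE SCHEMA ON SCHEMA TEST.TEST TO {principal}",
    f"GRANT SELECT ON SCHEMA TEST.TEST TO {principal}",
]

# In a notebook:
# for stmt in grants:
#     spark.sql(stmt)
print(grants[0])
```

After granting, re-running the original `information_schema.tables` query as the service principal should return the rows it is now allowed to see.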