- 1247 Views
- 1 replies
- 0 kudos
GCP hosted Databricks - DBFS temp files - Not Found
I've been working on obtaining DDL at the schema level in the Hive metastore within GCP-hosted Databricks. I've implemented Python code that generates SQL files in the dbfs/temp directory. However, when running the code, I'm encountering a "file path n...
Hi, the code snippet together with the complete error message would help to determine the issue; considering the points above may also work as a fix.
- 0 kudos
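For anyone hitting the same "file path not found" message: a minimal sketch (not the original poster's code; the paths and DDL string below are hypothetical) of the usual pitfall, where `dbutils.fs` paths and local `/dbfs/` FUSE paths get mixed up:

```python
# Minimal sketch, assuming the DDL is generated as a plain string.
# "dbfs:/tmp/ddl/my_schema.sql" is a hypothetical example path.
ddl_text = "CREATE TABLE my_schema.example (id INT, name STRING);"

# Option 1: dbutils.fs expects DBFS-style paths (dbfs:/... or /tmp/...)
dbutils.fs.mkdirs("dbfs:/tmp/ddl")
dbutils.fs.put("dbfs:/tmp/ddl/my_schema.sql", ddl_text, overwrite=True)

# Option 2: local Python file APIs need the /dbfs FUSE prefix instead
with open("/dbfs/tmp/ddl/my_schema.sql", "w") as f:
    f.write(ddl_text)

# Verify the file is actually where the rest of the code expects it
display(dbutils.fs.ls("dbfs:/tmp/ddl/"))
```

Mixing the two styles (for example, calling open() on a "dbfs:/..." path) typically produces exactly the "file path not found" error described above.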
- 1949 Views
- 1 replies
- 0 kudos
Unable to find permission button in Sql Warehouse for providing Access
Hi everyone, I am unable to see the permissions button in SQL Warehouse to grant access to other users. I have admin rights and Databricks is on a Premium subscription.
Hi, could you please provide a screenshot of the SQL warehouse? Also, you can go through: https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html Also, please tag @Debayan with your next comment, which will notify me. Th...
- 0 kudos
- 856 Views
- 1 replies
- 1 kudos
SQL Serverless - cost view
Hi, does anyone know how I can monitor the cost of SQL Serverless? I'm using Databricks in Azure and I'm not sure where to find the cost generated by compute resources hosted on Databricks.
Hi, you can calculate the pricing at https://www.databricks.com/product/pricing/databricks-sql and, for Azure, at https://azure.microsoft.com/en-in/pricing/details/databricks/#:~:text=Sign%20in%20to%20the%20Azure,asked%20questions%20about%20Azure%20pricing. For A...
- 1 kudos
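Beyond the pricing calculators, one hedged option (assuming Unity Catalog system tables are enabled in the workspace; the table and column names below follow the documented system.billing.usage schema and should be verified against your environment, and the SKU-name filter is an assumption) is to query billable usage directly:

```python
# Hedged sketch: daily DBU consumption for serverless SQL SKUs from the
# system billing table.
usage = spark.sql("""
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE upper(sku_name) LIKE '%SERVERLESS_SQL%'
    GROUP BY usage_date, sku_name
    ORDER BY usage_date
""")
display(usage)
```

Multiplying the DBU totals by the SKU rate from the pricing pages linked above gives an approximate cost figure.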
- 2740 Views
- 1 replies
- 0 kudos
Monitoring job metrics
Hi, we need to monitor Databricks jobs and we have a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...
I have reposted this post in "Administration and Architecture": Monitoring job metrics - Databricks - 42956
- 0 kudos
- 765 Views
- 0 replies
- 0 kudos
ListBucket
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::<s3-bucket-name>" ] }, { "Effect": "Allow", "Action": [ "s3:Pu...
- 7279 Views
- 6 replies
- 1 kudos
Resolved! Unable to create table with primary key
Hi Team, getting the below error while creating a table with a primary key: "Table constraints are only supported in Unity Catalog." Table script: CREATE TABLE persons(first_name STRING NOT NULL, last_name STRING NOT NULL, nickname STRING, CONSTRAINT persons_...
Hi, this needs further investigation, could you please raise a support case with Databricks?
- 1 kudos
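For readers hitting the same error: the message means PRIMARY KEY / FOREIGN KEY constraints are only accepted on Unity Catalog tables, so the usual options are to create the table in a UC catalog or to drop the constraint when targeting hive_metastore. A hedged sketch (the catalog, schema, and constraint names below are hypothetical):

```python
# Option 1: create the table in a Unity Catalog schema, where PK constraints
# are accepted (they are informational and not enforced).
spark.sql("""
    CREATE TABLE main.demo.persons (
        first_name STRING NOT NULL,
        last_name  STRING NOT NULL,
        nickname   STRING,
        CONSTRAINT persons_pk PRIMARY KEY (first_name, last_name)
    )
""")

# Option 2: when writing to hive_metastore, omit the constraint clause.
spark.sql("""
    CREATE TABLE hive_metastore.default.persons (
        first_name STRING NOT NULL,
        last_name  STRING NOT NULL,
        nickname   STRING
    )
""")
```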
- 1366 Views
- 1 replies
- 0 kudos
Problem starting cluster
I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...
Hi, the error "GCE out of resources" typically means that Google Compute Engine is out of resources, i.e. out of nodes (it can be a quota issue or node issues in that particular GCP region). Could you please raise a Google support case on thi...
- 0 kudos
- 916 Views
- 1 replies
- 0 kudos
Fail start cluster
I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...
Hi, the error "GCE out of resources" typically means that Google Compute Engine is out of resources, i.e. out of nodes (it can be a quota issue or node issues in that particular GCP region). Could you please raise a Google support case on thi...
- 0 kudos
- 2180 Views
- 1 replies
- 1 kudos
Resolved! Run tasks conditionally "Always" condition missing?
Does the new 'Run if' feature, which allows you to run tasks conditionally, lack an 'Always' option, i.e. a way to execute the task both when the dependencies succeed and when they fail?
You can choose the 'All done' option to run the task in both scenarios.
- 1 kudos
- 562 Views
- 0 replies
- 0 kudos
Analyze data methodology
Hello, I have an ETL process that ingests data into bronze tables, transforms the data, and then ingests it into silver tables before finally populating the gold tables. This workflow is executed every 5 minutes. When I want to analyze the data or app...
- 5590 Views
- 7 replies
- 0 kudos
Delta Sharing - Alternative to config.share
I was recently given a credential file to access shared data via delta sharing. I am following the documentation from https://docs.databricks.com/en/data-sharing/read-data-open.html. The documentation wants the contents of the credential file in a fo...
Hi, the most feasible way would be to convert the contents of your key file to base64 and only mention the Spark config as below: credentials <base64-encoded contents>
- 0 kudos
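To make the reply above concrete, a hedged sketch (the profile path below is hypothetical, and the delta-sharing client usage follows the open-sharing docs linked in the question) of reading via the credential file and of producing the base64 string mentioned above:

```python
import base64
import delta_sharing  # requires the delta-sharing package to be installed

profile_path = "/dbfs/FileStore/config.share"  # hypothetical location of the credential file

# Documented open-sharing usage: list the tables exposed through the share
client = delta_sharing.SharingClient(profile_path)
print(client.list_all_tables())

# Base64-encode the credential file contents, as suggested in the reply above,
# for use in the relevant Spark config entry.
with open(profile_path, "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")
print(encoded[:40] + "...")
```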
- 923 Views
- 1 replies
- 0 kudos
I can't access my account
Hi, I can't access my account and need to book an exam. I completed my registration at https://www.webassessor.com/form/createAccount.do, and when I try to log in I get this error: "Login or Password is incorrect". Please help me with this issue. ...
Hi @DavidValdez, looks like you were able to schedule your exam. If you experience any other issues you can request support here. We also have a new FAQ: https://www.databricks.com/learn/certification/faq
- 0 kudos
- 11413 Views
- 1 replies
- 0 kudos
Accessing TenantId via secret to connect to Azure Data Lake Storage Gen2 doesn't work
Hello, I'm following the instructions in this article to connect to ADLS Gen2 using an Azure service principal. I can access the service principal's app ID and secret via a Databricks Key Vault-backed secret scope. However, this doesn't work for directory-id and I...
Hi @Retired_mod, thanks for the prompt reply. As per the document, the syntax is the text highlighted in red below for accessing keys from a secret scope in the Spark config. I used the same for the app ID too, and that works. But if I use the same syntax for ...
- 0 kudos
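For reference, a hedged notebook-scoped sketch of the documented OAuth configuration, pulling the directory (tenant) ID from the secret scope with dbutils.secrets.get rather than the {{secrets/scope/key}} spark-config syntax; the storage account, scope, and key names below are hypothetical:

```python
# Hedged sketch: service-principal OAuth config for ADLS Gen2, with the tenant ID
# read from a secret scope at runtime. Scope/key/account names are hypothetical.
storage_account = "mystorageaccount"
scope = "kv-backed-scope"

client_id     = dbutils.secrets.get(scope=scope, key="sp-app-id")
client_secret = dbutils.secrets.get(scope=scope, key="sp-secret")
tenant_id     = dbutils.secrets.get(scope=scope, key="sp-directory-id")

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")
```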
- 1007 Views
- 1 replies
- 0 kudos
Snowflake Data Formatting Issue
I'm loading Snowflake data into Delta tables in Databricks. A few columns in the Snowflake data have the datatype NUMBER(20,7); after loading to the Delta table it comes through as DECIMAL(20,7). For example, if the value is 0.0000000 in Snowflake then it is showing...
Explicit casting seems like the way to go. First try with one column to see if that solves your issue. If so, you can write a function that casts all decimal columns to a certain precision, something like this: def convert_decimal_precision_scale(df, p...
- 0 kudos
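The function in the reply above is cut off, so here is a hedged sketch of what it could look like (the signature and defaults are a guess based on the truncated name):

```python
from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

def convert_decimal_precision_scale(df, precision=20, scale=7):
    """Cast every DecimalType column of df to DecimalType(precision, scale)."""
    for field in df.schema.fields:
        if isinstance(field.dataType, DecimalType):
            df = df.withColumn(field.name, col(field.name).cast(DecimalType(precision, scale)))
    return df

# Example (hypothetical DataFrame name):
# fixed_df = convert_decimal_precision_scale(snowflake_df, precision=20, scale=7)
```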
- 2242 Views
- 4 replies
- 1 kudos
Why is importing python code supported in Repos but not in Workspaces ?
Hi, we currently use a one-repo approach which does not require a local development environment (we utilize Azure DevOps and Nutter for automated tests). We also have shared code across pipelines and started with %run-style modularization and have ...
The "why" is most probably because of different development tracks/teams between Workspace and Repos. Whether they will consolidate in functionality? Can't tell, only Databricks knows that; but it seems reasonable to assume the files will also be added to w...
- 1 kudos
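As an illustration of the Repos-style import the thread is about (the repo path, module, and function names below are hypothetical), shared code can be imported without %run like this:

```python
import sys

# Hypothetical repo layout: /Workspace/Repos/<user>/<repo>/shared/pipeline_utils.py
sys.path.append("/Workspace/Repos/my-user/my-repo/shared")

import pipeline_utils  # hypothetical shared module

pipeline_utils.run_checks()  # hypothetical function in that module
```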