- 9426 Views
- 4 replies
- 1 kudos
column "id" is of type uuid but expression is of type character varying.
Hello, I'm trying to write to an Azure PostgreSQL flexible server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...
- 1 kudos
Yes, this Stack Overflow post was my reference too, and adding the option below made the load run with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified") https://stackoverflow.com/questions/409739...
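For anyone hitting the same uuid-vs-varchar error, here is a minimal sketch of the write with that option in place; the server, database, table, and credentials are placeholders, and only the stringtype option comes from the answer above:

```python
# Minimal sketch (untested): write a DataFrame whose "id" column holds UUID
# strings to a PostgreSQL table with a uuid column. stringtype=unspecified
# tells the PostgreSQL JDBC driver to send strings untyped so the server can
# cast them to uuid. Host, database, table, and credentials are placeholders.
(df.write
   .format("postgresql")
   .option("host", "<server>.postgres.database.azure.com")
   .option("port", "5432")
   .option("database", "<database>")
   .option("dbtable", "<schema>.<table>")
   .option("user", "<user>")
   .option("password", "<password>")
   .option("stringtype", "unspecified")   # the option from the Stack Overflow answer
   .mode("append")
   .save())
```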
- 786 Views
- 0 replies
- 0 kudos
databricks job compute price w.r.t running time
I have two workflows (jobs) in Databricks (AWS) with the cluster specs below (job cluster, NOT all-purpose compute): Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete; Job 2 takes 50 min to complete. Questions: DBU cost is sam...
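As a way to reason about this, here is a rough sketch of how Jobs Compute cost scales with running time; every rate and node count in it is a hypothetical placeholder, not an actual Databricks price:

```python
# Rough illustration only. DBUs accrue with cluster size and running time,
# so on the same cluster shape a 50-minute job consumes roughly 5x the DBUs
# of a 10-minute job. All rates below are hypothetical placeholders.
DBU_PER_NODE_HOUR = 1.0   # hypothetical DBU rate for i3.xlarge
USD_PER_DBU = 0.15        # hypothetical Jobs Compute price per DBU
NODES = 1 + 4             # driver + an assumed 4 autoscaled workers

def job_cost_usd(minutes: float) -> float:
    dbus = NODES * DBU_PER_NODE_HOUR * (minutes / 60)
    return dbus * USD_PER_DBU

print(job_cost_usd(10))   # Job 1
print(job_cost_usd(50))   # Job 2: ~5x Job 1 on the same cluster shape
```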
- 929 Views
- 1 reply
- 0 kudos
Difference between delete token API and revoke token API Databricks
Hi Community, I am trying to understand the difference between the Delete token API (DELETE /api/2.0/token-management/tokens/{token_id}) and the Revoke token API (POST /api/2.0/token/delete). When I create more than 600 tokens, I get a QUOTA_EXCEEDED error....
- 0 kudos
Delete token API doc link: https://docs.databricks.com/api/workspace/tokenmanagement/delete
Revoke token API doc link: https://docs.databricks.com/api/workspace/tokens/revoketoken
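In short, the Token Management API is the admin surface for managing any user's tokens by id, while /api/2.0/token/delete revokes a token owned by the caller. A minimal sketch of calling both with the requests library; workspace URL, admin token, and token id are placeholders:

```python
# Sketch (untested): the two endpoints from this thread, called via requests.
import requests

HOST = "https://<workspace>.cloud.databricks.com"
HEADERS = {"Authorization": "Bearer <ADMIN_TOKEN>"}

# Token Management API: an admin deletes any user's token by id.
requests.delete(f"{HOST}/api/2.0/token-management/tokens/<token_id>", headers=HEADERS)

# Token API: the caller revokes one of their own tokens.
requests.post(f"{HOST}/api/2.0/token/delete", headers=HEADERS,
              json={"token_id": "<token_id>"})
```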
- 1301 Views
- 1 reply
- 0 kudos
Using libpostal in Databricks
Hi, I am trying to work on address parsing and would like to use libpostal in Databricks. I have used the official Python bindings (GitHub - openvenues/pypostal: Python bindings to libpostal for fast international address parsing/normalization): pip insta...
- 0 kudos
I managed to install pylibpostal via the cluster library, but I cannot seem to download the data needed to run it. Please help. Thank you.
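For reference, a sketch of what usage looks like once the native libpostal library and its trained data files are present on every node; the pip-installed bindings alone are not enough, since the data is fetched when the C library itself is built (typically from a cluster-scoped init script with a --datadir pointing at a writable path):

```python
# Assumes the libpostal C library and its data directory are already
# installed on the cluster nodes; the pip package only provides bindings.
from postal.parser import parse_address
from postal.expand import expand_address

print(parse_address("781 Franklin Ave Crown Heights Brooklyn NYC NY 11216 USA"))
print(expand_address("Quatre-vingt-douze Ave des Champs-Élysées"))
```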
- 2062 Views
- 1 reply
- 0 kudos
Unable to see AI Playground in Machine Learning in the dashboard
I am unable to see the AI Playground under Machine Learning in my dashboard.
- 803 Views
- 0 replies
- 0 kudos
Number of tokens generated for a service principal
Hi community, Is there any API or option to view all PAT tokens generated by a Databricks service principal?
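One avenue to explore (a sketch only; the filter parameter name is an assumption, so verify it against the Token Management API reference) is the admin token list endpoint, filtered by the token creator:

```python
# Sketch (untested): list tokens via the Token Management API and filter by
# the creating principal. The created_by_id parameter and all values are
# placeholders/assumptions -- check the API docs for exact names.
import requests

resp = requests.get(
    "https://<workspace>.cloud.databricks.com/api/2.0/token-management/tokens",
    headers={"Authorization": "Bearer <ADMIN_TOKEN>"},
    params={"created_by_id": "<service_principal_id>"},
)
for t in resp.json().get("token_infos", []):
    print(t["token_id"], t.get("comment"), t.get("expiry_time"))
```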
- 1463 Views
- 0 replies
- 0 kudos
Join Our Databricks Free Trial Experience feedback AMA on Friday March 29, 2024!
We're looking for feedback on the Databricks free trial experience, and we need your help! Whether you've used it for data engineering, data science, or analytics, Sujit Nair, our Product Manager on the free trial experience, and our journey archite...
- 4463 Views
- 1 reply
- 1 kudos
source set to GIT for Databricks Asset Bundle notebook_task - git authentication fails on run
My post was marked as spam when I tried to post the description of my issue, so I have posted the question on Stack Overflow instead.
- 1954 Views
- 1 reply
- 0 kudos
DLT SQL demo pipeline issue
Hi, this is my first foray into DLT, following code excerpts from the sample DLT notebook. I'm creating a notebook with the SQL below: CREATE STREAMING LIVE TABLE sales_orders_raw COMMENT "The raw sales orders, ingested from /databricks-datasets." TBLPROPERTIES ...
- 0 kudos
It comes down to changing the notebook's default language rather than using a magic command. I normally have the default set to Python and wrongly assumed DLT would transpose; you can't use a magic command, you have to change the default language for the SQL to work.
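To make the language point concrete, here is a rough Python equivalent of that table definition, for a DLT notebook whose default language is Python; the path and properties follow the retail-org sample and may differ in your workspace:

```python
# Rough Python equivalent of the SQL sample table (sketch, not verified
# against the exact sample notebook).
import dlt

@dlt.table(
    comment="The raw sales orders, ingested from /databricks-datasets.",
    table_properties={"quality": "bronze"},
)
def sales_orders_raw():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.inferColumnTypes", "true")
        .load("/databricks-datasets/retail-org/sales_orders/")
    )
```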
- 1352 Views
- 0 replies
- 0 kudos
Download event and run logs
How can I download the run and event logs? The Spark UI is loading them from somewhere, but I couldn't find them in DBFS or on S3.
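One approach (a sketch, assuming cluster log delivery has been enabled on the cluster under Advanced options > Logging with a DBFS destination such as dbfs:/cluster-logs; the cluster id below is a placeholder):

```python
# With cluster log delivery enabled, Spark event logs and driver logs land
# under the configured destination and can be listed/copied with dbutils.
base = "dbfs:/cluster-logs/<cluster-id>"
display(dbutils.fs.ls(f"{base}/eventlog"))   # event logs the Spark UI replays
display(dbutils.fs.ls(f"{base}/driver"))     # driver stdout/stderr/log4j
dbutils.fs.cp(f"{base}/eventlog", "dbfs:/FileStore/exported-eventlogs", recurse=True)
# Files under /FileStore can then be downloaded from the browser.
```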
- 2882 Views
- 1 reply
- 0 kudos
Using the API for getting cost in USD
I'm trying to use the billable usage API, and I do get a report, but I have not been able to get the USD cost report, only the dbuHours. I guess I have to change the meter_name, but I cannot find the key for that parameter anywhere.
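The downloadable billable usage report only carries DBU quantities. One way to get an approximate USD figure (a sketch, assuming Unity Catalog system tables are enabled; column names follow the system-tables docs and may need adjusting) is to join usage against list prices:

```python
# Approximate list cost in USD per day and SKU from system tables (sketch).
usd_by_day = spark.sql("""
    SELECT u.usage_date,
           u.sku_name,
           SUM(u.usage_quantity * lp.pricing["default"]) AS approx_list_cost_usd
    FROM system.billing.usage u
    JOIN system.billing.list_prices lp
      ON u.sku_name = lp.sku_name
     AND u.usage_start_time >= lp.price_start_time
     AND (lp.price_end_time IS NULL OR u.usage_start_time < lp.price_end_time)
    WHERE lp.currency_code = 'USD'
    GROUP BY u.usage_date, u.sku_name
    ORDER BY u.usage_date
""")
display(usd_by_day)
```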
- 8617 Views
- 3 replies
- 1 kudos
Fuzzy Match on PySpark using UDF/Pandas UDF
I'm trying to do fuzzy matching on two DataFrames by cross joining them and then using a UDF for my fuzzy matching. But with both a Python UDF and a pandas UDF it's either very slow or I get an error. @pandas_udf("int") def core_match_processor(s1: pd.Ser...
- 1 kudos
I'm now getting the error: (SQL_GROUPED_AGG_PANDAS_UDF) is not supported on clusters in Shared access mode. Even though this article clearly states that pandas UDFs are supported on shared clusters in Databricks: https://www.databricks.com/blog/shared-clu...
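Worth noting that the error names grouped-aggregate pandas UDFs specifically; a plain scalar pandas UDF (Series in, Series out) is a different code path and may be worth trying. A sketch using only the standard library for scoring; the DataFrames and column names are placeholders:

```python
# Scalar pandas UDF for fuzzy scoring after a cross join (sketch).
import pandas as pd
from difflib import SequenceMatcher
from pyspark.sql.functions import pandas_udf, col
from pyspark.sql.types import IntegerType

@pandas_udf(IntegerType())
def fuzzy_score(s1: pd.Series, s2: pd.Series) -> pd.Series:
    # Similarity in 0-100 for each row pair, computed per Arrow batch.
    return pd.Series(
        [int(SequenceMatcher(None, a or "", b or "").ratio() * 100)
         for a, b in zip(s1, s2)]
    )

matches = (
    df1.crossJoin(df2)
       .withColumn("score", fuzzy_score(col("name_left"), col("name_right")))
       .filter(col("score") >= 90)
)
```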
- 5672 Views
- 2 replies
- 1 kudos
Resolved! Okta and Unified login
Hey folks, has anyone put Databricks behind Okta and enabled Unified Login with some workspaces that have a Unity Catalog metastore applied and some that don't? There are some workspaces we can't move over yet, and it isn't clear in the documentation if Unity Catalo...
- 1 kudos
Yes, users should be able to use a single Okta application for all workspaces, regardless of whether the Unity Catalog metastore has been applied or not. The Unity Catalog is a feature that allows you to manage and secure access to your data across a...
- 7129 Views
- 5 replies
- 7 kudos
Incremental ingestion of Snowflake data with Delta Live Table (CDC)
Hello, I have some data sitting in Snowflake, and I want to apply CDC on it using Delta Live Tables, but I am having some issues. Here is what I am trying to do: @dlt.view() def table1(): return spark.read.format("snowflake").options(**opt...
- 7 kudos
CDC for Delta Live Tables works fine for Delta tables, as you have noticed. However, it is not a full-blown CDC implementation/software. If you want to capture changes in Snowflake, you will have to implement some CDC method on Snowflake itself, and read...
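To make that concrete, a sketch of how DLT's APPLY CHANGES Python API could consume changes once they have already been captured on the Snowflake side (for example via Snowflake streams/tasks) and exported somewhere Databricks can read as a stream; every path, key, and sequencing column here is a placeholder assumption:

```python
# Sketch only: apply already-captured change records to a DLT target table.
import dlt
from pyspark.sql.functions import col

@dlt.view
def table1_changes():
    # Change records exported from Snowflake into cloud storage (placeholder path).
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .load("/mnt/landing/table1_changes/")
    )

dlt.create_streaming_table("table1")  # older runtimes use create_streaming_live_table

dlt.apply_changes(
    target="table1",
    source="table1_changes",
    keys=["id"],                          # placeholder primary key
    sequence_by=col("change_timestamp"),  # placeholder ordering column
    stored_as_scd_type=1,
)
```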
- 4307 Views
- 0 replies
- 0 kudos
Database: Delta Lake or PostgreSQL
Hey all, I am searching for a non-political answer to my database questions. Please know that I am a data newbie and literally do not know anything about this topic, but I want to learn, so please be gentle. Some context: I am working for an OEM that...