- 4520 Views
- 5 replies
- 1 kudos
Access RDS Postgres DB via SSH Tunnel
Hello, how can I configure a foreign catalog connection to use SSH tunneling? I want to be able to use Unity Catalog.
Hi, in addition to our previous message, you can try https://docs.databricks.com/en/query-federation/foreign-catalogs.html and https://grant-6562.medium.com/connecting-to-sql-server-through-an-ssh-tunnel-with-python-17de859caca5. Also please tag @Deb...
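Foreign catalogs do not expose an SSH-tunnel setting directly, so a common workaround is to open the tunnel yourself and read over plain JDBC. A minimal sketch, assuming the third-party `sshtunnel` package is installed on the cluster; all hostnames, paths, and credentials below are placeholders:

```python
def pg_jdbc_url(host: str, port: int, database: str) -> str:
    """Build a Postgres JDBC URL (here pointed at the local end of the tunnel)."""
    return f"jdbc:postgresql://{host}:{port}/{database}"


def read_via_tunnel(spark, database: str, table: str):
    """Open an SSH tunnel to RDS, then read through it with the JDBC source."""
    from sshtunnel import SSHTunnelForwarder  # third-party package (assumption)

    with SSHTunnelForwarder(
        ("bastion.example.com", 22),  # placeholder bastion host
        ssh_username="ec2-user",  # placeholder
        ssh_pkey="/dbfs/keys/bastion.pem",  # placeholder key path
        remote_bind_address=("my-rds.example.com", 5432),  # placeholder RDS endpoint
        local_bind_address=("127.0.0.1", 5432),
    ):
        return (
            spark.read.format("jdbc")
            .option("url", pg_jdbc_url("127.0.0.1", 5432, database))
            .option("driver", "org.postgresql.Driver")
            .option("dbtable", table)
            .option("user", "postgres")  # placeholder
            .option("password", "...")  # placeholder
            .load()
            .collect()  # materialize while the tunnel is still open
        )
```

Note the `collect()` (or a write) must happen inside the `with` block, since the tunnel closes when the block exits.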
- 6981 Views
- 1 reply
- 2 kudos
Using Azure Event Grid for structured streaming
Can anyone point me to any Databricks documentation (or other resources) for configuring structured streaming to use Azure Event Grid for a source/sink? I found examples for Kafka and EventHubs but Azure Event Grid is different than Azure Event Hubs....
I must be missing something. I don't see how the examples in the referenced document can be applied to Azure Event Grid. Is there another example that shows how to subscribe to an Event Grid topic?
- 2608 Views
- 1 reply
- 0 kudos
Unity Catalog - Invalid configuration value detected for fs.azure.account.key
Hi there, I am having an issue with writing a df to a table or displaying it. I have three dataframes that I have unioned, and after the union I cannot display the dataframe. df_table1 = spark.sql(f'SELECT * FROM {sql_full_name}') df_table2 = ... df...
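This error typically means a path was accessed with account-key auth that the session is not configured for (common on Unity Catalog clusters that mix access modes). If account-key auth is acceptable for the storage in question, one quick check is to set the key explicitly for the session. A sketch; the storage account, secret scope, and key names are placeholders:

```python
def abfss_account_key_conf(storage_account: str) -> str:
    # Spark config key for ADLS Gen2 account-key auth on the dfs endpoint
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"


# In a notebook session (sketch; `spark`, the account name, and the secret
# scope/key are placeholders):
# spark.conf.set(
#     abfss_account_key_conf("mystorageacct"),
#     dbutils.secrets.get(scope="my-scope", key="storage-key"),
# )
```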
- 1448 Views
- 1 reply
- 0 kudos
SQL endpoint and JDBC driver
When we try to connect to a SQL warehouse endpoint with the Databricks JDBC driver, our query fails if we use first_value(). We've rewritten the query to use LIMIT 1, but we would like to understand if this is a gap in the Simba/Databricks driv...
A sample error message when using first_value() is: An error occurred while calling o132.csv. [Databricks][JDBC](10140) Error converting value to BigDecimal.
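The (10140) BigDecimal error suggests the driver is failing to convert the DECIMAL value that `first_value()` returns. Before rewriting to LIMIT 1, it may be worth trying an explicit cast so the JDBC layer receives a type it can map. A sketch with hypothetical table and column names:

```python
# Hypothetical table/column names; the point is the explicit CAST around
# first_value() so the JDBC driver never has to map a DECIMAL it chokes on.
query = """
SELECT DISTINCT
  CAST(first_value(amount) OVER (ORDER BY updated_at DESC) AS DOUBLE) AS latest_amount
FROM sales.orders
"""
```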
- 1599 Views
- 1 reply
- 0 kudos
Databricks Certified Data Engineer Associate exam got suspended, need immediate help (10/09/2023)
Hi team, I scheduled my exam for 10 September 2023 at 15:15 hrs Asia/Calcutta time, but my exam got suspended, with the proctor citing an improper environment. Please look into this issue, as I did not get to attempt even a single question; this is not fair, as I l...
Hi @sirishavemula20, we have addressed the case. If the issue is not resolved, please reply to the case notes.
- 1313 Views
- 0 replies
- 0 kudos
Use a single cluster policy for multiple teams and capture team name via custom tags
I am planning to add team names in custom tags and was hoping I can do it with allowList and then have the user choose from the list. I am trying to avoid having multiple policy files per team. Has anybody found a good way to do this? Maybe using globa...
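One way to keep a single policy is to constrain a custom tag with an `allowlist`, so each user picks their team at cluster creation and the choice lands on the cluster as a tag. A sketch of such a policy definition; the team names are placeholders:

```python
import json

# One shared policy: the user must pick their team from a fixed list,
# and the choice is applied to the cluster as the custom tag "Team".
policy = {
    "custom_tags.Team": {
        "type": "allowlist",
        "values": ["data-eng", "analytics", "ml-platform"],  # placeholder teams
    }
}

policy_json = json.dumps(policy, indent=2)
```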
- 2213 Views
- 2 replies
- 0 kudos
Cannot see "Databricks Lakehouse Platform" on prod AWS Cost Explorer
Hi, Currently, we have two different AWS accounts: dev and prod. We also have two different workspaces: one for dev and another for prod. The strange thing is that prod costs are being added to the dev account on AWS Cost Explorer ("Databricks Lakeho...
Databricks uses tags and AWS CloudTrail logs to connect and report costs to AWS. Tags can be used to monitor costs and attribute Databricks usage to different business units and teams. AWS CloudTrail logs can be used to calculate the exact cost of API ...
- 2360 Views
- 1 reply
- 0 kudos
Snowflake filter query giving empty results in Databricks; the same query works in Snowflake.
I am trying to fetch filtered data based on a date format on a date column. Below is the query formed in Databricks, which returns empty results: SELECT * FROM Test.TestSchema.Address WHERE TO_DATE(TO_VARCHAR(MODIFIEDDATE,'YYYY-MM-DD')) = '2023-09...
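When a pushed-down filter behaves differently in Databricks than in Snowflake, one diagnostic is to hand the entire statement to Snowflake via the connector's `query` option, so the filter is evaluated by Snowflake exactly as written instead of going through predicate pushdown. A sketch; the connection options and the date literal are placeholders:

```python
# Let Snowflake evaluate the filter verbatim (the date literal is a placeholder,
# since the original thread truncates the actual value).
sf_query = (
    "SELECT * FROM Test.TestSchema.Address "
    "WHERE TO_DATE(TO_VARCHAR(MODIFIEDDATE, 'YYYY-MM-DD')) = '2023-09-01'"
)

# df = (
#     spark.read.format("snowflake")
#     .options(**sf_options)   # sfUrl, sfUser, sfDatabase, ... (placeholders)
#     .option("query", sf_query)
#     .load()
# )
```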
- 1094 Views
- 1 reply
- 0 kudos
SQL warehouse settings permission
How do you set permissions so that a user can execute queries but cannot modify settings on the SQL warehouse?
Hi, Have you tried this: https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html Please let us know if this helps. Thanks!
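The warehouse ACLs distinguish CAN USE (run queries) from CAN MANAGE (edit settings), so granting only CAN USE does what the question asks. A sketch of the Permissions REST API payload; the group name and warehouse ID are placeholders:

```python
warehouse_id = "1234567890abcdef"  # placeholder warehouse ID
endpoint = f"/api/2.0/permissions/sql/warehouses/{warehouse_id}"

# CAN_USE lets members run queries on the warehouse but not change its settings.
payload = {
    "access_control_list": [
        {"group_name": "analysts", "permission_level": "CAN_USE"}  # placeholder group
    ]
}
# Send with PATCH (adds/updates entries) rather than PUT (replaces the full ACL).
```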
- 2153 Views
- 1 reply
- 0 kudos
Resolved! Structured streaming from Azure Event Hub, authenticating without SAS keys
Using SAS keys is a security issue that we would like to avoid. How do we use structured streaming from Event Hubs while authenticating to Azure AD (client_id and secret)? We know that we can use Python's Event Hub library, but that will make have t...
Hi, Could you please try structured streaming event hubs integration? https://docs.databricks.com/en/_extras/notebooks/source/structured-streaming-event-hubs-integration.html
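Since Event Hubs also speaks the Kafka protocol, another route is SASL/OAUTHBEARER with an Azure AD service principal via the standard Kafka client-credentials settings (KIP-768). A sketch under that assumption; the namespace, tenant, hub name, and credentials are placeholders, and on Databricks runtimes the JAAS class name may need a `kafkashaded.` prefix:

```python
namespace = "mynamespace"  # placeholder Event Hubs namespace
tenant_id = "00000000-0000-0000-0000-000000000000"  # placeholder AAD tenant

kafka_options = {
    "kafka.bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "OAUTHBEARER",
    # Client-credentials flow: the Kafka client fetches an AAD token for the
    # service principal itself (no SAS keys involved).
    "kafka.sasl.oauthbearer.token.endpoint.url":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    "kafka.sasl.jaas.config": (
        "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule "
        "required clientId='<sp-client-id>' clientSecret='<sp-secret>' "
        f"scope='https://{namespace}.servicebus.windows.net/.default';"
    ),
    "subscribe": "my-event-hub",  # placeholder hub (topic) name
}

# stream = spark.readStream.format("kafka").options(**kafka_options).load()
```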
- 1468 Views
- 1 reply
- 1 kudos
How to connect to a database in Azure Databricks from a VS2022 console application to access the data
Hi team, I am new to Databricks; please see my question below. I have created a cluster and a database, and tables and data have also been inserted. Now I want to access this table data from my .NET console application, which is on the 6.0 framework (v...
Hi, the Databricks extension will help you do so. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext Please let us know if this helps. Thanks!
- 4397 Views
- 3 replies
- 3 kudos
I created the workspace successfully using the QuickStart (recommended) method; now I want to create one more workspace and it is showing an error.
Hi, I created the workspace using the QuickStart (recommended) method, and at the time of creating the workspace it asked for the following parameters: AccountId, AWSRegion, BucketName, DataS3Bucket, IAMRole, Password, Username, WorkspaceName. The workspace was created successf...
Hi, could you please elaborate on the error code here? There is some misconfiguration which is causing the error. Thanks!
- 2766 Views
- 2 replies
- 0 kudos
What's going wrong in my attempt to start with Databricks?
I'm trying to get going with Databricks for the first time. It told me to create a workspace, which takes me to AWS (I'm also new to AWS). Following the instructions there gets it to start creating something, but then it just gets stuck on ...
Hi, this looks like a workspace creation failure. We would like to know more about the error details. Thanks!
- 984 Views
- 0 replies
- 0 kudos
Hive Metastore permission on DBX 10.4
I've been working on creating a schema in the Hive metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for m...
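With table access control enabled, a newly created schema stays private until it is granted to others. A sketch of the legacy hive_metastore grants; the database and group names are placeholders:

```python
database = "my_database"  # placeholder
group = "data-team"       # placeholder workspace group

# USAGE lets the group reference the schema; SELECT lets it read the tables.
grants = [
    f"GRANT USAGE ON DATABASE {database} TO `{group}`",
    f"GRANT SELECT ON DATABASE {database} TO `{group}`",
]

# for stmt in grants:
#     spark.sql(stmt)
```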
- 1909 Views
- 0 replies
- 0 kudos
Tags to run S3 lifecycle rules
Hello, is it possible to utilize S3 tags when writing a DataFrame with PySpark? Or is the only option to write the dataframe and then use boto3 to tag all the files? More information about S3 object tagging is here: Amazon S3 Object Tagging. Thank you.
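I'm not aware of a DataFrame-writer option that applies S3 object tags at write time, so the write-then-tag route with boto3 is sketched below; the bucket, prefix, and tag values are placeholders:

```python
def to_tag_set(tags: dict) -> list:
    """Convert a plain dict into the TagSet shape boto3 expects."""
    return [{"Key": k, "Value": v} for k, v in tags.items()]


def tag_prefix(bucket: str, prefix: str, tags: dict) -> None:
    """Tag every object under a prefix (e.g. the files a DataFrame write produced)."""
    import boto3  # imported lazily so the helper above works without it

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3.put_object_tagging(
                Bucket=bucket,
                Key=obj["Key"],
                Tagging={"TagSet": to_tag_set(tags)},
            )
```

S3 lifecycle rules can then filter on these tags.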