- 1439 Views
- 1 reply
- 1 kudos
How to connect to a database in Azure Databricks from a VS 2022 console application to access the data
Hi Team, I am new to Databricks, so please see my question below. I have created a cluster, then a database and tables, and also inserted data. Now I want to access this table data from my .NET console application, which is on the 6.0 framework (v...
Hi, the Databricks extension for Visual Studio Code will help you do this: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext. Please let us know if this helps. Thanks!
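As a quick way to check connectivity from outside the workspace, here is a minimal sketch using the databricks-sql-connector Python package. It only illustrates the connection details (server hostname, HTTP path, personal access token) that any external client needs; a .NET 6 console application would pass the same details to the Databricks ODBC or JDBC driver. The hostname, path, token, and table name below are placeholders, not values from this thread.

```python
# Hypothetical illustration using the databricks-sql-connector package
# (pip install databricks-sql-connector); all values below are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace host
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder warehouse HTTP path
    access_token="dapiXXXXXXXXXXXXXXXX",                           # placeholder personal access token
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM my_database.my_table LIMIT 10")  # placeholder table
        for row in cursor.fetchall():
            print(row)
```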
- 4338 Views
- 3 replies
- 3 kudos
I created a workspace successfully using the Quickstart (recommended) method; now I want to create one more workspace and it is showing an error.
Hi, I created the workspace using the Quickstart (recommended) method, and at the time of creating the workspace it asked for the following parameters: AccountId, AWSRegion, BucketName, DataS3Bucket, IAMRole, Password, Username, WorkspaceName. The workspace was created successf...
Hi, could you please elaborate on the error code here? There is likely some misconfiguration causing the error. Thanks!
- 2727 Views
- 2 replies
- 0 kudos
What's going wrong in my attempt to start with Databricks?
I'm trying to get going with Databricks for the first time. It told me to create a workspace, which takes me to AWS (I'm also new to AWS). Following the instructions there gets it to start creating something, but then it just gets stuck on ...
Hi, this looks like a workspace creation failure. We would like to know more about the error details. Thanks!
- 953 Views
- 0 replies
- 0 kudos
Hive Metastore permission on DBX 10.4
I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}'). The schema or database is successfully created, but I encountered an issue where it's only accessible for m...
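The post above is truncated, but if the symptom is that the new schema is visible only to its creator, a minimal sketch along these lines may help, assuming table access control (table ACLs) is enabled on the cluster; the database and group names are placeholders, not names from the thread.

```python
# A minimal sketch, assuming table access control is enabled on the cluster;
# "data_readers" is a placeholder group, not a name from the thread.
database = "my_database"  # placeholder

spark.sql(f"CREATE DATABASE IF NOT EXISTS {database}")

# Grant other principals access so the schema is not restricted to its creator.
spark.sql(f"GRANT USAGE ON DATABASE {database} TO `data_readers`")
spark.sql(f"GRANT SELECT ON DATABASE {database} TO `data_readers`")
```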
- 1878 Views
- 0 replies
- 0 kudos
Tags to run S3 lifecycle rules
Hello, is it possible to utilize S3 tags when writing a DataFrame with PySpark? Or is the only option to write the DataFrame and then use boto3 to tag all the files? More information about S3 object tagging is here: Amazon S3 Object Tagging. Thank you.
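As a reference for the boto3 fallback mentioned in the question, here is a minimal sketch that writes a DataFrame and then tags every object under the output prefix. The bucket, prefix, tag values, and the df variable are assumptions for illustration, not details from the original post.

```python
# Sketch of the "write first, then tag with boto3" approach from the question.
import boto3

bucket = "my-bucket"          # placeholder
prefix = "path/to/output/"    # placeholder

# df is assumed to already exist as a Spark DataFrame.
df.write.mode("overwrite").parquet(f"s3://{bucket}/{prefix}")

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        s3.put_object_tagging(
            Bucket=bucket,
            Key=obj["Key"],
            Tagging={"TagSet": [{"Key": "lifecycle", "Value": "expire-90d"}]},  # placeholder tag
        )
```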
- 1912 Views
- 0 replies
- 1 kudos
VS Code 2023
Do I need to save the data locally and run the plotting locally as well, or does anyone have a smart solution to this?
- 1443 Views
- 0 replies
- 0 kudos
Change the Admin Owner Email Account for Databricks cloud standard account?
As stated, my company changed its name and the email address has migrated. I need to change it to the new one, and there is no way to open a support ticket to address that from what I saw (Standard Plan). Please do not tell me to contact AWS, as they have ...
- 930 Views
- 0 replies
- 0 kudos
Plotting using Databricks in VS Code
Hi, I am quite new to working with Databricks in VS Code. I am trying to figure out the best way to plot my data when running on a cluster. I would like to have the possibility to zoom and move the plot as I have when plotting locally with Matplotlib...
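One common pattern for the two plotting questions above is to aggregate on the cluster and pull only the small result to the local machine for an interactive plot. Here is a hedged sketch assuming a Spark session obtained via Databricks Connect and Plotly installed locally; the table and column names are placeholders.

```python
# A sketch under the stated assumptions: aggregate remotely, collect the small
# result with toPandas(), and plot it with Plotly for zoom/pan interactivity.
import plotly.express as px

pdf = (
    spark.table("my_catalog.my_schema.my_table")  # placeholder table name
         .groupBy("category")                     # placeholder column
         .count()
         .toPandas()
)

fig = px.bar(pdf, x="category", y="count")
fig.show()  # opens an interactive figure in the browser
```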
- 11137 Views
- 6 replies
- 3 kudos
Resolved! Databricks voucher code error
Hi Team, I am getting an error that the voucher code is invalid when trying to register for the Databricks Certified Data Engineer Associate exam. I got this issue once the page was reloaded, due to slowness of the internet, before checkout, and the vouch...
No worries. Contact the help and support team, and also raise a ticket.
- 4974 Views
- 2 replies
- 1 kudos
Change default catalog
It seems that when I am connecting to a Databricks SQL warehouse, it is using the default catalog, which is hive_metastore. Is there a way to define a Unity Catalog catalog to be the default? I know I can run the query USE CATALOG MAIN and then the current session will ...
Thanks Brian2. Is there an equivalent config parameter for a SQL Warehouse?
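For reference, here is a minimal sketch of the per-session workaround the question already mentions, with a note about the cluster-level Spark configuration often used for notebooks. The catalog and schema names are placeholders, and the config key mentioned in the comment is an assumption that should be verified against the current Databricks documentation.

```python
# Per-session workaround: switch the session's default catalog and schema.
spark.sql("USE CATALOG main")     # placeholder catalog name
spark.sql("USE SCHEMA default")   # placeholder schema name

# Cluster-level alternative (assumption - verify the exact key in current docs):
# setting spark.databricks.sql.initial.catalog.name in the cluster's Spark config
# is commonly cited as the way to make new sessions start in a given catalog.

print(spark.sql("SELECT current_catalog(), current_database()").collect())
```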
- 1053 Views
- 0 replies
- 0 kudos
ronaldo is back
create table SalesReport(TerritoryName NVARCHAR(50), ProductName NVARCHAR(100), TotalSales DECIMAL(10,2), PreviousYearSales DECIMAL(10,2), GrowthRate DECIMAL(10,2)); create table ErrorLog( ErrorID int, ErrorMessage nvarchar(max),ErrorDate datetime);...
- 1662 Views
- 0 replies
- 0 kudos
Save dataframe to the same variable
I would like to know if there is any difference if I save a DataFrame during transformation to itself, as in the first code, or to a new DataFrame, as in the second example. Thanks. log_df = log_df.withColumn("process_timestamp", from_utc_timestamp(lit(current_timestamp()), "E...
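Since the code in the post is cut off, here is a hedged sketch of the two styles being compared. Spark DataFrames are immutable, so withColumn returns a new DataFrame either way; reassigning to the same name versus a new name only changes which Python variable refers to the result. The log_df input and the timezone string are assumptions, not values recovered from the truncated post.

```python
from pyspark.sql.functions import current_timestamp, from_utc_timestamp, lit

# log_df is assumed to already exist as a Spark DataFrame.

# Style 1: reassign to the same variable; the previous DataFrame is simply
# no longer referenced by this name.
log_df = log_df.withColumn(
    "process_timestamp",
    from_utc_timestamp(lit(current_timestamp()), "Europe/Prague"),  # placeholder timezone
)

# Style 2: assign to a new variable; both names stay usable, and both are
# lazily evaluated plans, so there is no extra computation either way.
log_df_new = log_df.withColumn(
    "process_timestamp",
    from_utc_timestamp(lit(current_timestamp()), "Europe/Prague"),  # placeholder timezone
)
```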
- 2218 Views
- 0 replies
- 0 kudos
iceberg
Hi fellas, I am working on Databricks using Iceberg. At first I configured my notebook as below: spark.conf.set("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkCatalog") spark.conf.set("spark.sql.catalog.spark_catalog.type", "hadoop") s...
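The post is truncated, so only the first two settings are visible. As context, here is a hedged sketch of the usual Hadoop-catalog configuration pattern from the Iceberg documentation, continuing the lines shown in the post; the warehouse path and table name are placeholders, the Iceberg Spark runtime JAR is assumed to be installed on the cluster, and catalog settings of this kind generally belong in the cluster's Spark config rather than in spark.conf.set at runtime.

```python
# A sketch under the stated assumptions; these are standard Iceberg Spark catalog
# properties with placeholder values, not the poster's actual configuration.
spark.conf.set("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkCatalog")
spark.conf.set("spark.sql.catalog.spark_catalog.type", "hadoop")
spark.conf.set("spark.sql.catalog.spark_catalog.warehouse", "dbfs:/tmp/iceberg_warehouse")  # placeholder path

spark.sql("CREATE TABLE IF NOT EXISTS db.events (id BIGINT, ts TIMESTAMP) USING iceberg")  # placeholder table
```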
- 2095 Views
- 1 reply
- 1 kudos
Resolved! threads leakage when getConnection fails
Hi, we are using the Databricks JDBC driver https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.33. It seems like there is a thread leak when getConnection fails. Could anyone advise? It can be reproduced with @Test void databricksThreads() {...
Hi, none of the above suggestions will work... We already contacted the Databricks JDBC team; the thread leak was confirmed and was fixed in version 2.6.34: https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.34. This leak still exists if...
- 15969 Views
- 2 replies
- 0 kudos
Resolved! Access Foreign Catalog using Python in Notebook
Hello - I have a foreign catalog which I can access fine in SQL. However, I can't access it from a Python notebook. I.e., this works just fine if I have a notebook using a Pro SQL warehouse: %sql USE CATALOG <my_foreign_catalog_name>; USE SCHEMA public; S...
Hi, are you using this on a single-user cluster? Also, please tag @Debayan in your next response so that I will get notified.
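For completeness, here is a minimal sketch of how the SQL from the question can be issued from Python with spark.sql in a notebook attached to a Unity Catalog-enabled cluster; the catalog, schema, and table names are placeholders, and whether the foreign catalog is reachable still depends on the cluster type and permissions discussed above.

```python
# A sketch with placeholder names; reachability of the foreign catalog depends
# on the cluster access mode and Unity Catalog permissions.
spark.sql("USE CATALOG my_foreign_catalog")  # placeholder catalog name
spark.sql("USE SCHEMA public")

df = spark.sql("SELECT * FROM my_table LIMIT 10")  # placeholder table name
display(df)
```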
Labels: .CSV (1), Access Data (2), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Advanced Data Engineering (1), AI (1), Analytics (1), Apache spark (1), Apache Spark 3.0 (1), API Documentation (3), Architecture (1), asset bundle (1), Asset Bundles (2), Auto-loader (1), Autoloader (4), AWS (3), AWS security token (1), AWSDatabricksCluster (1), Azure (5), Azure data disk (1), Azure databricks (14), Azure Databricks SQL (6), Azure databricks workspace (1), Azure Unity Catalog (5), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (1), Blackduck (1), Bronze Layer (1), Certification (3), Certification Exam (1), Certification Voucher (3), CICDForDatabricksWorkflows (1), Cloud_files_state (1), CloudFiles (1), Cluster (3), Community Edition (3), Community Event (1), Community Group (1), Community Members (1), Compute (3), Compute Instances (1), conditional tasks (1), Connection (1), Contest (1), Credentials (1), CustomLibrary (1), Data (1), Data + AI Summit (1), Data Engineering (3), Data Explorer (1), Data Ingestion & connectivity (1), Databrick add-on for Splunk (1), databricks (2), Databricks Academy (1), Databricks AI + Data Summit (1), Databricks Alerts (1), Databricks Assistant (1), Databricks Certification (1), Databricks Cluster (2), Databricks Clusters (1), Databricks Community (10), Databricks community edition (3), Databricks Community Edition Account (1), Databricks Community Rewards Store (3), Databricks connect (1), Databricks Dashboard (2), Databricks delta (2), Databricks Delta Table (2), Databricks Demo Center (1), Databricks Documentation (2), Databricks genAI associate (1), Databricks JDBC Driver (1), Databricks Job (1), Databricks Lakehouse Platform (6), Databricks Migration (1), Databricks notebook (2), Databricks Notebooks (3), Databricks Platform (2), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks Runtime (1), Databricks SQL (5), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks Terraform (1), Databricks UI (1), Databricks Unity Catalog (4), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (2), Databricks-connect (1), DatabricksJobCluster (1), DataDays (1), Datagrip (1), DataMasking (2), DataVersioning (1), dbdemos (2), DBFS (1), DBRuntime (1), DBSQL (1), DDL (1), Dear Community (1), deduplication (1), Delt Lake (1), Delta (22), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (7), Delta Sharing (2), deltaSharing (1), Deny assignment (1), Development (1), Devops (1), DLT (10), DLT Pipeline (7), DLT Pipelines (5), Dolly (1), Download files (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), ETL Pipelines (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (2), Feature Store (2), Filenotfoundexception (1), Free trial (1), GCP Databricks (1), GenAI (1), Getting started (2), Google Bigquery (1), HIPAA (1), import (1), Integration (1), JDBC Connections (1), JDBC Connector (1), Job Task (1), Lineage (1), LLM (1), Login (1), Login Account (1), Machine Learning (2), MachineLearning (1), Materialized Tables (2), Medallion Architecture (1), Migration (1), ML Model (1), MlFlow (2), Model Training (1), Module (1), Networking (1), Notebook (1), Onboarding Trainings (1), Pandas udf (1), Permissions (1), personalcompute (1), Pipeline (2), Plotly (1), PostgresSQL (1), Pricing (1), Pyspark (1), Python (5), Python Code (1), Python Wheel (1), Quickstart (1), Read data (1), Repos Support (1), Reset (1), Rewards Store (2), Schedule (1), Serverless (3), Session (1), Sign Up Issues (2), Spark (3), Spark Connect (1), sparkui (2), Splunk (2), SQL (8), Summit23 (7), Support Tickets (1), Sydney (2), Table Download (1), Tags (1), Training (2), Troubleshooting (1), Unity Catalog (4), Unity Catalog Metastore (2), Update (1), user groups (1), Venicold (3), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), Weekly Release Notes (2), Women (1), Workflow (2), Workspace (3)