- 2051 Views
- 1 replies
- 1 kudos
Resolved! How to pass a dynamic query to a source server from Databricks
I have a use case where I need to pass a dynamic query to fetch data from the source. I have tried the query option, but it gives the error SparkConnectGrpcException: (com.microsoft.sqlserver.jdbc.SQLServerException) Incorrect syntax near the keywor...
Hi, thanks for your reply. I used a foreign catalog to fetch the required data from the information schema, then built the dynamic query in Databricks and passed it via the query option. This is working for me! @Retired_mod
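As a minimal sketch of the pattern described in the reply (built in a Databricks notebook; the connection details, table and column names below are hypothetical placeholders), the dynamic query can be pushed down to SQL Server through the JDBC query option:

```python
# Build a dynamic query in Databricks and push it down via the JDBC "query" option.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>"  # placeholder

# Dynamic filter built at runtime (e.g. from widgets or job parameters)
load_date = "2024-01-01"
pushdown_query = f"SELECT id, amount, updated_at FROM dbo.orders WHERE updated_at >= '{load_date}'"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("query", pushdown_query)  # "query" and "dbtable" are mutually exclusive
    .option("user", dbutils.secrets.get("scope", "sql-user"))
    .option("password", dbutils.secrets.get("scope", "sql-password"))
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
display(df)
```

One common cause of the "Incorrect syntax near the keyword ..." error is passing something other than a plain SELECT (for example a CTE or a stored-procedure call), because Spark wraps the query option in a subquery before sending it to SQL Server.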
- 2200 Views
- 2 replies
- 0 kudos
Resolved! How can I increase the hard disk capacity of the master node?
I'm not sure if this is the right place to post my question; if not, please let me know where I should post it. I want to download large files from the web onto Databricks' master (driver) node. For example, I fetch a file over 150 GB via API ...
Hi @himanmon, if you're 100% sure that you can't download this file to a storage account configured with Unity Catalog and you want it directly on the driver node's local storage, then why not just increase the local disk space by choosing a larger instance ty...
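If the file can go to Unity Catalog storage instead of the driver's local disk, a minimal sketch (assuming a Unity Catalog volume exists at a hypothetical /Volumes path and the requests library is available) is to stream the download in chunks:

```python
import requests

# Hypothetical source URL and Unity Catalog volume path
url = "https://example.com/large-file.bin"
target = "/Volumes/main/raw/landing/large-file.bin"

# Stream the response in chunks so the 150 GB file never has to fit in memory
with requests.get(url, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open(target, "wb") as f:
        for chunk in resp.iter_content(chunk_size=64 * 1024 * 1024):
            f.write(chunk)
```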
- 4142 Views
- 5 replies
- 0 kudos
S360 Eliminate SPN secrets - Connect Azure Databricks to ADLS Gen2, Gen1 via custom AD token
Hi Team, in Azure Databricks we currently use a Service Principal when creating mount points to Azure storage (ADLS Gen1, ADLS Gen2 and Azure Blob Storage). As part of an S360 action to eliminate SPN secrets, we were asked to move to SPN+certificate / MS...
@ramesitexp Yes, @szymon_dybczak is correct; for now the only valid options are: OAuth 2.0 with a Microsoft Entra ID service principal, shared access signatures (SAS), and account keys. For now we are using OAuth 2.0 with a Microsoft Entra ID service principal...
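As a minimal sketch of the OAuth 2.0 option mentioned in the reply (assuming the service principal's credentials are kept in a secret scope; the storage account, secret scope and tenant values are placeholders), the ABFS driver can be configured directly, without mount points:

```python
storage_account = "mystorageaccount"  # placeholder
tenant_id = "<tenant-id>"             # placeholder
client_id = dbutils.secrets.get("kv-scope", "sp-client-id")
client_secret = dbutils.secrets.get("kv-scope", "sp-client-secret")

# Standard ABFS OAuth (client credentials) configuration
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

df = spark.read.parquet(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/path/to/data")
```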
- 1238 Views
- 0 replies
- 0 kudos
Creating a table in ADB SQL for multiple JSON files and selecting all the rows from all the files
Hi, I have multiple JSON files stored in my ADLS Gen2 and I want to create a table that directly reads all the data from ADLS without mounting the files. When I create the table, I cannot select all the data. How can I achieve this? ADLS path: /dwh/...
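A minimal sketch of one way to do this (assuming the cluster or an external location already has access to the storage account; the abfss path and table name below are placeholders standing in for the truncated /dwh/... path) is to declare an external table directly over the JSON directory:

```python
spark.sql("""
    CREATE TABLE IF NOT EXISTS dwh.raw_events
    USING JSON
    OPTIONS (multiLine 'true')
    LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/dwh/events/'
""")

# Every JSON file under the LOCATION directory is read as part of the table
display(spark.sql("SELECT * FROM dwh.raw_events LIMIT 10"))
```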
- 2538 Views
- 2 replies
- 0 kudos
OAuth user-to-machine (U2M) authentication
I am trying to use OAuth user-to-machine (U2M) authentication from the Azure Databricks CLI. When I run databricks auth login --host , a web browser opens and I get an authentication successful message, and my profile is also saved successfully with auth-type...
Hi @Aria, good day! Which CLI version are you using here? Can you try updating the CLI to a newer version by referring to this document: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/install#--homebrew-update-for-linux-...
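Once the profile is saved, one way to confirm the cached U2M token actually works is to reference the profile from the Python SDK. A minimal sketch, assuming the databricks-sdk package and a hypothetical profile name:

```python
from databricks.sdk import WorkspaceClient

# "u2m-profile" is whatever profile name the CLI wrote to ~/.databrickscfg
w = WorkspaceClient(profile="u2m-profile")
print(w.current_user.me().user_name)  # prints your user if the OAuth token is valid
```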
- 1010 Views
- 1 replies
- 0 kudos
Free trial account
Hi, can I create a Unity Catalog using a free trial account?
Hi @Spyro_3, yes, you should be able to create a Unity Catalog. The trial version allows you to create a premium workspace, which is required for Unity Catalog. Note that to set up the metastore you also need to have Global Administrator permission (if we are ...
- 6240 Views
- 2 replies
- 1 kudos
Issue with Private PyPI Mirror Package Dependencies Installation
I'm encountering an issue with the installation of Python packages from a Private PyPI mirror, specifically when the package contains dependencies and the installation is on Databricks clusters - Cluster libraries | Databricks on AWS. Initially, ever...
Hi @hugodscarvalho, I am also at this point, where the transitive dependencies (available in JFrog) are not getting installed on my job cluster. Could you please elaborate a bit on what exactly needs to be changed in the JFrog setup for this to work...
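One pattern that often helps with transitive dependencies is to make the private mirror the primary index (typically a virtual repository that also proxies public PyPI), so pip resolves every dependency from the same place. A minimal notebook-scoped sketch, with a hypothetical Artifactory URL, credentials and package name:

```
%pip install my-internal-package==1.2.0 --index-url https://<user>:<token>@mycompany.jfrog.io/artifactory/api/pypi/pypi-virtual/simple
```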
- 1176 Views
- 1 replies
- 0 kudos
Privacy issue when using model serving from Databricks
When using Databricks' model serving to query Llama3, I noticed the endpoint URL is my Databricks instance. Does that still mean data is sent to and processed by Databricks? If so, does Databricks keep/use any of the data sent to model serving ...
Related question: when Databricks processes requests for foundation models, I noticed the latency is pretty small; I'm wondering what kind of processing power is used on the Databricks side? I am interested in hosting the model ourselves, so I'm wondering what type of...
- 2884 Views
- 1 replies
- 1 kudos
Vector Search index not indexing the whole Delta table
I have a Delta table that I’m trying to index but when I try to create a vector search index with either the UI or the Python SDK, it only indexes 1 row out of my 3000 rows. I have tried using different vector search endpoints. I have verified the fo...
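For reference, a minimal sketch of creating a Delta Sync index with the Python SDK (assuming the databricks-vectorsearch package; the endpoint, table, column and embedding-model names are placeholders), including the change-data-feed prerequisite on the source table:

```python
from databricks.vector_search.client import VectorSearchClient

# Delta Sync indexes require change data feed on the source Delta table
spark.sql("ALTER TABLE main.docs.chunks SET TBLPROPERTIES (delta.enableChangeDataFeed = true)")

client = VectorSearchClient()
index = client.create_delta_sync_index(
    endpoint_name="my-vs-endpoint",                 # placeholder
    index_name="main.docs.chunks_index",            # placeholder
    source_table_name="main.docs.chunks",           # placeholder
    pipeline_type="TRIGGERED",
    primary_key="chunk_id",
    embedding_source_column="chunk_text",
    embedding_model_endpoint_name="databricks-gte-large-en",  # placeholder
)
```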
- 3541 Views
- 1 replies
- 0 kudos
Displaying Dataframes with ipywidgets.Output is Adding Unexpected Commas
I am currently working in a Databricks notebook and using an ipywidgets.Output to display a pandas DataFrame. Because a Spark DataFrame cannot be displayed in an ipywidgets.Output widget, I have been using: import pandas as pd import numpy as np import ...
I have this issue also, and the listed steps do not resolve it.
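A minimal sketch of the pattern the question describes, assuming a small pandas DataFrame; one variation worth trying is rendering the frame as HTML inside the Output widget instead of relying on the default text repr:

```python
import pandas as pd
import ipywidgets as widgets
from IPython.display import display, HTML

df = pd.DataFrame({"a": range(3), "b": list("xyz")})

out = widgets.Output()
with out:
    # Render as HTML rather than the plain-text repr
    display(HTML(df.to_html()))

display(out)
```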
- 6752 Views
- 7 replies
- 3 kudos
Writing a single huge dataframe into Azure SQL Database using JDBC
Hi all, I am currently trying to read data from a materialized view as a single dataframe containing around 10M rows and then write it into an Azure SQL database. However, I don't see the Spark job moving at all, even after an hour has passed. I have al...
Hi @yeungcase , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedba...
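A minimal sketch of the write side (assuming `df` is the DataFrame read from the materialized view; the connection details, target table and secret scope are placeholders), with the usual levers for a stuck bulk insert, parallelism and batch size:

```python
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>"  # placeholder

(
    df.repartition(16)                        # number of parallel JDBC connections writing
      .write
      .format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "dbo.target_table")  # placeholder
      .option("user", dbutils.secrets.get("scope", "sql-user"))
      .option("password", dbutils.secrets.get("scope", "sql-password"))
      .option("batchsize", 10000)             # rows per batched INSERT
      .mode("append")
      .save()
)
```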
- 1215 Views
- 1 replies
- 0 kudos
Mounts in Databricks
How is it possible to prevent a certain user from seeing the mounts created in Databricks, so that even if the user issues the %fs ls /mnt command, it doesn't return anything?
Hi @Elcio , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback n...
- 7922 Views
- 6 replies
- 3 kudos
Driver and worker node utilisation
Hi all! Can anyone tell me if having worker node(s) of the same type as my driver node makes a difference performance-wise if I am running normal Python code in a notebook as a job on this cluster? I am running mostly machine learning libraries such...
- 3544 Views
- 2 replies
- 1 kudos
Resolved! Profile setup for Databricks account
Hi Team, we usually set up a profile for Databricks at the workspace level, and I have done this using the host URL and token, which works fine, like below: [DEFAULT] host = workspaceurl, username = ****, password = token. Now my question is how we set this up for Databri...
You cannot create a PAT token for Account Console authentication; in this case you will have two options to authenticate: using basic auth, where in your profile you set your username and password to log in, the same as you do to log in in the br...
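As a rough illustration of the account-level profile the reply describes (field names follow the Databricks CLI's ~/.databrickscfg format; the host and account ID below are placeholders, and the exact credential fields depend on whether you use basic auth or an OAuth service principal):

```
[ACCOUNT]
# Account console host: accounts.azuredatabricks.net on Azure,
# accounts.cloud.databricks.com on AWS
host       = https://accounts.azuredatabricks.net
account_id = 00000000-0000-0000-0000-000000000000
# Then either basic auth (username / password) or an OAuth
# service principal (client_id / client_secret), as described above
```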
- 1280 Views
- 0 replies
- 0 kudos
Databricks Asset Bundle error "expected a KEY of the resource to run"
Hello Team, I am new to DAB and running it for the first time through the Databricks CLI. The bundle validation is successful, but while running it errors out with error="expected a KEY of the resource to run". Can anyone help me on what to check to resolve t...
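For what it's worth, this error usually appears when `databricks bundle run` is invoked without naming which resource to run; the KEY is the identifier of the job or pipeline under `resources:` in databricks.yml. A minimal sketch with hypothetical names:

```yaml
# databricks.yml (fragment): "daily_ingest" is the resource KEY
resources:
  jobs:
    daily_ingest:
      name: daily-ingest-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/ingest.py
```

With that definition, the run command would be `databricks bundle run daily_ingest` (after `databricks bundle deploy`).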