- 2960 Views
- 1 reply
- 0 kudos
Resolved! ETL Advice for Large Transactional Database
I have a SQL Server transactional database on an EC2 instance, and an AWS Glue job that pulls full tables into an S3 bucket as Parquet files. There is a very large table that has 44 million rows, and records are added, updated, and deleted from this t...
If you have a CDC stream capability, you can use the APPLY CHANGES INTO API to perform SCD1 or SCD2 in a Delta Lake table in Databricks. You can find more information here. This is the best way to go if CDC is a possibility. If you do not have a CD...
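For reference, a minimal sketch of the APPLY CHANGES INTO pattern via the Delta Live Tables Python API, assuming a hypothetical CDC feed table (raw.cdc_feed) with an id key, a sequence_num ordering column, and an operation column that flags deletes; adjust the names and SCD type to your table:

```python
import dlt
from pyspark.sql.functions import col, expr

# Hypothetical CDC feed; `spark` is the ambient SparkSession in a DLT notebook.
@dlt.view
def cdc_source():
    return spark.readStream.table("raw.cdc_feed")

# Target streaming table that APPLY CHANGES INTO keeps in sync.
dlt.create_streaming_table("orders_silver")

dlt.apply_changes(
    target="orders_silver",
    source="cdc_source",
    keys=["id"],                                   # primary key(s) of the source table
    sequence_by=col("sequence_num"),               # ordering column for out-of-order events
    apply_as_deletes=expr("operation = 'DELETE'"), # rows flagged as deletes in the feed
    except_column_list=["operation", "sequence_num"],
    stored_as_scd_type=1,                          # use 2 to keep SCD2 history
)
```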
- 9909 Views
- 2 replies
- 0 kudos
Connect to Databricks using Java SDK through proxy
I'm trying to connect to Databricks from Java using the Java SDK and get cluster/SQL warehouse state. I'm able to connect and get cluster state from my local machine. But once I deploy it to the server, my company's network is not allowing the connection. We...
Hi @Nagasundaram, you can make use of the init script below in order to use a proxy server with a Databricks cluster. The content of the init script can be added at "Workspace/shared/setproxy.sh". ================================================== v...
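Purely as an illustration of that approach, a sketch that uploads a proxy init script to the workspace with the Databricks Python SDK; the proxy address and the exact exports your network needs are assumptions, not part of the original reply:

```python
import io
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ImportFormat

PROXY = "http://proxy.mycompany.com:8080"  # hypothetical corporate proxy

# Typical shape of a proxy init script: make the proxy visible to cluster processes.
SETPROXY_SH = f"""#!/bin/bash
echo "export http_proxy={PROXY}"  >> /etc/environment
echo "export https_proxy={PROXY}" >> /etc/environment
echo "export no_proxy=localhost,127.0.0.1" >> /etc/environment
"""

w = WorkspaceClient()  # auth from environment variables or ~/.databrickscfg
w.workspace.upload(
    "/Shared/setproxy.sh",  # workspace path; reference it as a cluster init script
    io.BytesIO(SETPROXY_SH.encode("utf-8")),
    format=ImportFormat.AUTO,
    overwrite=True,
)
```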
- 1584 Views
- 1 reply
- 0 kudos
Can I use Databricks service principals on Databricks Connect 12.2?
Hi community, is it possible to use Databricks service principals for authentication with Databricks Connect 12.2 to connect my notebook or code to Databricks compute, rather than using a personal access token? I checked the docs and learned that upgr...
Hi @Retired_mod, thanks for your response. I was able to generate the token of the service principal following this doc, then saved it in the <Databricks Token> variable prompted when running the databricks-connect configure command in the terminal. And was a...
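For reference, a minimal sketch of one way to mint a token owned by the service principal with the Databricks Python SDK, assuming the principal's OAuth client ID and secret are at hand; the host, IDs, and lifetime are placeholders. The printed value is what the databricks-connect configure prompt asks for:

```python
from databricks.sdk import WorkspaceClient

# Authenticate as the service principal via OAuth M2M (placeholder values).
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    client_id="<service-principal-application-id>",
    client_secret="<oauth-secret>",
)

# Create a workspace access token owned by the service principal.
token = w.tokens.create(comment="databricks-connect", lifetime_seconds=3600)
print(token.token_value)  # paste this at the <Databricks Token> prompt
```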
- 10912 Views
- 1 reply
- 0 kudos
How to add instance profile permission to all users via databricks-sdk workspace client
How can I add instance profile permission to all users via the databricks-sdk workspace client? Just like Terraform, where we can give "users" for all users, how can we do the same using the databricks-sdk workspace client? I cannot find permission for instance pro...
- 1100 Views
- 0 replies
- 0 kudos
How managed tables are useful in Medallion Architecture
I have a basic question: managed tables don't store their data in ADLS Gen2, but in our architecture we created 3 containers in ADLS Gen2 (Bronze, Silver and Gold). If I choose managed tables, then neither metadata nor data is stored in ...
- 11035 Views
- 2 replies
- 1 kudos
UCX Installation
We aim to streamline the UCX installation process by utilizing the Databricks CLI and automating the manual input of required details at each question level. Could you please guide us on how we can automate the parameters during installation? Wha...
Hi Team, we don't see an option at the UCX command level for passing parameters as a JSON/config file. Could you please help us in this case with how we can automate the installation?
- 1276 Views
- 1 reply
- 0 kudos
Error Handling for Web Data Retrieval and Storage in Databricks UNITY Clusters
The following code works well in a normal Databricks cluster, where it passes a null JSON and retrieves content from the web link. However, in a Unity cluster, it produces the following error: 'FileNotFoundError: [Errno 2] No such file or directory: ...
Hi @nidhin, good day! The reason behind the error below when trying to access the external DBFS mount file using "with open" is that you are using a shared access mode cluster. 'FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/mnt/ra...
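As a hedged sketch of the usual workaround (the /dbfs FUSE path is not exposed to plain Python file APIs on shared access mode clusters), read the file through dbutils instead of open(), or copy it to the driver's local disk first; the mount path below is a placeholder, and dbutils is assumed to be the notebook's built-in handle:

```python
# Placeholder path; substitute your actual mount location.
src = "dbfs:/mnt/raw/config/payload.json"

# Option 1: read small files directly through dbutils rather than open("/dbfs/...").
content = dbutils.fs.head(src, 1024 * 1024)

# Option 2: copy to local scratch space on the driver, then use normal file APIs.
dbutils.fs.cp(src, "file:/tmp/payload.json")
with open("/tmp/payload.json") as f:
    payload = f.read()
```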
- 2434 Views
- 1 reply
- 0 kudos
Databricks GPU utilization not to full extent
Hi everyone, I have been running the code below. However, I'm getting a CUDA out of memory error even though I have 4 GPUs in the cluster, which should ideally give 64 GB of GPU memory, but the code is failing at 16 GB. I assume that the code is not utilizing all 4 GPU ...
Your code is loading the full model onto a single GPU, so having multiple GPUs does not prevent out-of-memory errors. By default, transformer models only use DDP (distributed data parallel), so each GPU holds a copy of your model to speed up trainin...
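If the goal is to fit one large model by sharding it across the 4 GPUs instead of keeping a full copy per GPU, a minimal sketch using Hugging Face's device_map support (the model name and dtype are assumptions, not taken from the original post):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-v0.1"  # hypothetical large model
tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" (requires the accelerate package) spreads the model's layers
# across all visible GPUs instead of loading everything onto GPU 0.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype=torch.float16,  # half precision roughly halves memory per parameter
)

inputs = tokenizer("Hello from Databricks", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```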
- 3856 Views
- 0 replies
- 0 kudos
UCX Installation without CLI
Hi Team, can we install the UCX toolkit into a Databricks workspace without installing it via the Databricks CLI? If it's possible, then how? https://github.com/databrickslabs/ucx
- 5000 Views
- 1 reply
- 0 kudos
Resolved! Unit testing databricks.sdk.runtime
How to mock code that uses dbutils from "from databricks.sdk.runtime import dbutils"? It shows that databricks-sdk has no attribute runtime.
Hi @samarth_solanki, I hope you are doing well! Based on the information you have shared, it seems like you're trying to import dbutils from databricks.sdk.runtime, but you're encountering an error that says "databricks-sdk has no attribute runtime...
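A minimal unit-test sketch of that idea, assuming a hypothetical my_module.py that does from databricks.sdk.runtime import dbutils and reads a widget; patching the name where it is used means no live Databricks runtime is needed:

```python
from unittest import mock

import my_module  # hypothetical module under test, e.g.:
                  #   from databricks.sdk.runtime import dbutils
                  #   def read_input():
                  #       return dbutils.widgets.get("input_path")

def test_read_input():
    # Replace the dbutils name inside my_module with a MagicMock for the test.
    with mock.patch.object(my_module, "dbutils") as fake_dbutils:
        fake_dbutils.widgets.get.return_value = "/tmp/test-input"
        assert my_module.read_input() == "/tmp/test-input"
        fake_dbutils.widgets.get.assert_called_once_with("input_path")
```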
- 15132 Views
- 6 replies
- 9 kudos
Notebook scrolling as I select
When I select text in a notebook cell, the whole notebook scrolls up as I select. This happens when I use the mouse wheel and with shift+arrow keys. It varies by cell: it happens in some cells, but not in other cells within the same notebook. When I refresh...
I'm relieved to know that I'm not the only one experiencing this issue. Please address this as soon as possible; it's significantly impacting my productivity.
- 2195 Views
- 1 reply
- 3 kudos
Resolved! How do I fix QB error code 6000 301?
When I try to access my company files, I keep getting the error 6000 301 in QB. Please assist me in fixing this mistake.
Are you facing QB error 6000 301 while trying to log in to your company file in QB Desktop and don’t know what should be done next? If yes, then you should not panic at all because through this post I am going to tell you everything you need to know ...
- 3303 Views
- 2 replies
- 0 kudos
Resolved! Create Storage Credential 500 Response
I'm trying to create storage credentials for an Azure Databricks Connector at the workspace level with a service principal that has the CREATE_STORAGE_CREDENTIAL privilege but is NOT an account admin. For this test, the SP has the owner role on the connector. I...
Hi @ledbutter, hope you are doing well today! I have gone through the details, and this issue might be related to https://github.com/databricks/cli/issues/1080. Please refer to this for more details: https://github.com/databricks/cli/issues/1108 Plea...
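For comparison, this is roughly the shape of the same operation through the Databricks Python SDK while authenticated as the service principal; the workspace URL, secret, and access connector resource ID are placeholders, and the request type name follows the SDK's catalog service:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import AzureManagedIdentityRequest

# Authenticate as the service principal (placeholder credentials).
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    client_id="<service-principal-application-id>",
    client_secret="<oauth-secret>",
)

# Azure Databricks access connector that the SP owns (placeholder resource ID).
connector_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Databricks/accessConnectors/<connector-name>"
)

cred = w.storage_credentials.create(
    name="adls-credential",
    azure_managed_identity=AzureManagedIdentityRequest(access_connector_id=connector_id),
    comment="Created at workspace level by a non-account-admin SP",
)
print(cred.id, cred.name)
```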
- 1724 Views
- 2 replies
- 0 kudos
Cells' outputs getting appended at each run - Databricks Notebook
Hello Community, I have the following issue. When I run cells in a notebook, the print outputs from previous cells are appended to the current print output (meaning running cell 1 gives output 1, running cell 2 gives output 1 ...
This seems to be linked to installing pycaret.
- 2344 Views
- 0 replies
- 1 kudos
Cost finding and optimization
Hi Team, could you please suggest the best way to track the cost of Databricks objects/components? Could you please share any best practices for optimizing costs and conducting detailed cost analysis? Regards, Phanindra