- 21413 Views
- 6 replies
- 0 kudos
Renaming the database name in Databricks
Team, initially our team created the databases with the environment name appended, e.g. cust_dev, cust_qa, cust_prod. I am looking to standardize on a consistent database name across environments; I want to rename to "cust". All of my tables are ...
- 0 kudos
You can also use “CASCADE” to drop schema and tables as well. It is recursive.
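As a hedged sketch of that suggestion (the helper name and backtick quoting are mine, not from the thread), the statement to run after the tables have been copied into the new `cust` schema could be built like this:

```python
def drop_schema_sql(schema: str, cascade: bool = True, if_exists: bool = True) -> str:
    """Build a DROP SCHEMA statement; CASCADE recursively drops the contained tables too."""
    parts = ["DROP SCHEMA"]
    if if_exists:
        parts.append("IF EXISTS")
    parts.append(f"`{schema}`")
    if cascade:
        parts.append("CASCADE")
    return " ".join(parts)

# In a Databricks notebook you would then run, e.g.:
# spark.sql(drop_schema_sql("cust_dev"))
```

Note that CASCADE permanently removes managed tables' data, so copy or deep-clone the tables into the new schema first.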
- 1093 Views
- 0 replies
- 0 kudos
How to confirm a workspace ID via an API token?
Hello! We are integrating with Databricks and we get the API key, workspace ID, and host from our users in order to connect to Databricks. We need to validate the workspace ID because we do need it outside of the context of the API key (with webh...
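One way to cross-check a user-supplied workspace ID is the `x-databricks-org-id` header that Databricks REST responses carry; the helper below is only a sketch of that idea (the function name and usage are mine, not from the thread):

```python
from typing import Optional

def workspace_id_from_headers(headers) -> Optional[str]:
    """Pull the workspace (org) ID out of Databricks API response headers.

    HTTP header names are case-insensitive, so normalize before the lookup.
    """
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("x-databricks-org-id")

# With any authenticated call, e.g. GET {host}/api/2.0/clusters/list:
# resp = requests.get(f"{host}/api/2.0/clusters/list",
#                     headers={"Authorization": f"Bearer {token}"})
# valid = workspace_id_from_headers(resp.headers) == user_supplied_workspace_id
```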
- 1255 Views
- 1 replies
- 0 kudos
Custom Python package in Notebook task using bundle
Hi mates! In my company, we are moving our pipelines to Databricks bundles; our pipelines use a notebook that receives some parameters. This notebook uses a custom Python package to apply the business logic based on the parameters it receives. The thi...
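For reference, a minimal `databricks.yml` sketch of one way to bundle a custom wheel with a parameterized notebook task; every name here (`my_bundle`, `my_pkg`, paths, parameters) is a placeholder, not taken from the thread:

```yaml
bundle:
  name: my_bundle            # placeholder

artifacts:
  my_pkg:
    type: whl
    path: ./my_pkg           # folder containing setup.py / pyproject.toml

resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/entry.py
            base_parameters:
              env: dev       # parameters the notebook reads via dbutils.widgets
          libraries:
            - whl: ./dist/*.whl
```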
- 1267 Views
- 0 replies
- 0 kudos
Spot label in pool even though the configuration selected is all on-demand
Why is there a spot label in the pool even though the configuration selected is all on-demand? Can someone explain?
- 627 Views
- 0 replies
- 0 kudos
Is there any specific error you are receiving when running the init script? Does the run complete startup or fail due to the init script?
Is there any specific error you are receiving when running the init script? Does the run complete startup, or does it fail due to the init script?
- 723 Views
- 0 replies
- 0 kudos
Error performing a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.sparkSession().sql(...)
I'm trying to perform a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.sparkSession().sql(self.merge_query)
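A sketch of the same upsert without the private `_jdf` API (the table, view, and column names below are placeholders; `DataFrame.sparkSession` is available from PySpark 3.3):

```python
def build_merge_query(target: str, source_view: str, key: str) -> str:
    """Build a Delta MERGE statement joining target and source on one key column."""
    return (
        f"MERGE INTO {target} t "
        f"USING {source_view} s ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

# Inside the stream (sketch only):
# def upsert(microBatchDF, batch_id):
#     microBatchDF.createOrReplaceTempView("updates")
#     # DataFrame.sparkSession avoids reaching into the private _jdf attribute
#     microBatchDF.sparkSession.sql(build_merge_query("target_table", "updates", "id"))
#
# df.writeStream.foreachBatch(upsert).start()
```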
- 930 Views
- 0 replies
- 0 kudos
ipywidgets not loading in Databricks Community Edition
Hi all, I am trying to run commands with ipywidgets, but it just says "Loading widget..." The same error occurs even when I re-run the cell. Databricks Runtime version used: 14.2
- 12305 Views
- 2 replies
- 1 kudos
numpy.ndarray size changed, may indicate binary incompatibility
Hi all, I have installed the following libraries on my cluster (11.3 LTS, which includes Apache Spark 3.3.0 and Scala 2.12): numpy==1.21.4, flair==0.12. On executing `from flair.models import TextClassifier`, I get the following error: "numpy.ndarray size chan...
- 1 kudos
You have changed the numpy version, and presumably it is not compatible with other libraries in the runtime. If flair requires a later numpy, use a later DBR for best results, since it already ships with later numpy versions.
- 1658 Views
- 1 replies
- 0 kudos
Databricks Asset Bundle behaviour for new workflows and existing workflows
Dear Community Members - I am trying to deploy a workflow using DAB. After deploying, if I update the same workflow with a different bundle name, it creates a new workflow instead of updating the existing one. Also, when I try to use the sa...
- 0 kudos
@nicole_lu_PM: Do you have any suggestions or feedback for the above question? It would be really helpful if we could get some insights.
- 1478 Views
- 0 replies
- 0 kudos
Restore committed changes to Azure Databricks Git after abandoned pull request
I want to restore the committed changes (before-and-after view) in my branch. As this pull request was abandoned in Azure DevOps, the branch was not merged. Therefore, the modified notebooks still exist, but the commits do not. How can I retrieve aga...
- 626 Views
- 0 replies
- 0 kudos
Hi, how can one go about assessing the value created due to the implementation of DBRX?
Hi, how can one go about assessing the value created due to the implementation of DBRX?
- 715 Views
- 0 replies
- 0 kudos
How does Databricks differ from Snowflake with respect to AI tooling
How does Databricks differ from Snowflake with respect to AI tooling
- 1738 Views
- 1 replies
- 0 kudos
org.apache.spark.SparkException: Job aborted due to stage failure:
org.apache.spark.SparkException: Job aborted due to stage failure:
- 0 kudos
Along with "Job aborted due to stage failure:", if you also see "slave lost", it is due to too little memory allocated to the executors (more cores per executor means more memory is required per executor), or the other possibility is that you have used the maximum CPU available in the cluster and the d...
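The knobs that reply refers to are the executor memory/core settings; a sketch of the cluster `spark_conf` values you might tune (the values here are illustrative placeholders, not recommendations):

```python
# Illustrative placeholder values -- tune to your node size and workload.
spark_conf = {
    "spark.executor.memory": "8g",          # more memory per executor
    "spark.executor.cores": "2",            # fewer cores -> less memory pressure per task
    "spark.executor.memoryOverhead": "1g",  # off-heap headroom beyond JVM heap
}
```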
- 1071 Views
- 0 replies
- 0 kudos
UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION in shared access mode?
I'm using a shared access cluster and am getting this error while trying to upload to Qdrant. #embeddings_df = embeddings_df.limit(5) options = { "qdrant_url": QDRANT_GRPC_URL, "api_key": QDRANT_API_KEY, "collection_name": QDRANT_COLLEC...
- 2651 Views
- 1 replies
- 0 kudos
Facing issue with Databricks Context Limit Exceeded
We have a use case where there may be a chance of 200 jobs executing at once. But a few notebooks are failing with the error: "run failed with error message Too many execution contexts are open right now. (Limit set currently to 150)." Can anyone help how...
- 0 kudos
Hello, you can refer to the following KB article for information and best practices regarding the issue you are facing: https://kb.databricks.com/en_US/notebooks/too-many-execution-contexts-are-open-right-now Best practices: Use a job cluster instea...