- 5152 Views
- 3 replies
- 1 kudos
Setting up Unity Catalog in Azure
Trying to create a metastore that will be connected to external storage (ADLS), but we don't have the option to create a new metastore in the 'Catalog' tab in the UI. Based on some research, we see that we'll have to go into "Manage Account" and then c...
- 1 kudos
I have been wrestling with this question for days now. I seem to be the only one with this question so I am sure I am doing something wrong. I am trying to create a UC metastore but there is not an option in "Catalog" to create a metastore. This s...
- 1225 Views
- 0 replies
- 0 kudos
Failed deploying bundle via gitlab - Request failed for POST
I'm encountering an issue in my .gitlab-ci.yml file when attempting to execute databricks bundle deploy -t prod. The error message I receive is: Error: Request failed for POST <path>/state/deploy.lock. Interestingly, when I run the same command locally...
- 2024 Views
- 2 replies
- 1 kudos
Data in dataframe is also getting deleted when we are trying to delete records from underlying table
Hi, we are trying to load data from a Delta table into a dataframe (a copy of the original table). Initially the Delta table has a count of 911. The dataframe in which the data is loaded also has the same count. Now, we are deleting some records from the delta...
- 1 kudos
Hi, there is a way to retain a copy of the dataframe even if the data in the underlying table is manipulated, but it's a memory-expensive operation, so be careful while using it: df1 = spark.createDataFrame(df.rdd.map(lambda x: x), schema=df.schema). Here we a...
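For context, a minimal sketch of why the counts change and of the snapshot workaround described above. A dataframe read from a Delta table is evaluated lazily, so each action re-reads the current state of the table; to keep the original rows, the copy has to be materialized before the delete runs. Table and column names below are hypothetical.

```python
# Hypothetical table name; replace with your own.
df = spark.read.table("demo.events")

# Materialize a snapshot BEFORE deleting from the table (memory-expensive).
df_snapshot = spark.createDataFrame(df.rdd.map(lambda x: x), schema=df.schema).cache()
original_count = df_snapshot.count()   # the action pulls the copy into the cache

# Deleting from the underlying Delta table...
spark.sql("DELETE FROM demo.events WHERE event_date < '2023-01-01'")

print(df.count())           # re-reads the table, so it reflects the delete
print(df_snapshot.count())  # still the original count, served from the cache
```

Note that cached partitions that get evicted are recomputed from the source table, so for a guaranteed snapshot either persist to disk, write the copy out, or read the pre-delete state back with Delta time travel (SELECT * FROM demo.events VERSION AS OF <n>).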
- 19169 Views
- 6 replies
- 0 kudos
Renaming the database name in Databricks
Team, initially our team created the databases with the environment name appended, e.g. cust_dev, cust_qa, cust_prod. I am looking to standardize the database name so it is consistent across environments; I want to rename it to "cust". All of my tables are ...
- 0 kudos
You can also use “CASCADE” to drop schema and tables as well. It is recursive.
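To illustrate the point about CASCADE, a minimal sketch of the usual rename-by-recreation flow (Spark SQL has no direct rename for a Hive-metastore database, so the tables are recreated under the new schema and the old one is dropped recursively). The schema and table handling below is hypothetical; CTAS copies data, so for large or external tables re-register them against their existing storage location instead.

```python
# Create the standardized schema, move the tables across, then drop the old
# schema recursively. Names are hypothetical.
spark.sql("CREATE SCHEMA IF NOT EXISTS cust")

for tbl in spark.catalog.listTables("cust_dev"):
    # Simplest (but data-copying) approach; prefer re-registering external tables.
    spark.sql(f"CREATE TABLE IF NOT EXISTS cust.{tbl.name} AS SELECT * FROM cust_dev.{tbl.name}")

# CASCADE is recursive: it drops the schema together with every table still in it.
spark.sql("DROP SCHEMA IF EXISTS cust_dev CASCADE")
```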
- 915 Views
- 0 replies
- 0 kudos
How to confirm a workspace ID via an API token?
Hello! We are integrating with Databricks and we get the API key, workspace ID, and host from our users in order to connect to Databricks. We need to validate the workspace ID because we do need it outside of the context of the API key (with webh...
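One possible way to validate the workspace ID with only the token and host, sketched below under the assumption that Databricks REST API responses carry the workspace/org ID in an x-databricks-org-id response header (worth verifying against the current API docs before relying on it): call any cheap authenticated endpoint and compare that header with the workspace ID the user supplied.

```python
# Sketch only: assumes the "x-databricks-org-id" response header identifies the
# workspace that served the request. The endpoint choice is arbitrary; any
# lightweight authenticated GET works.
import requests

def workspace_id_matches(host: str, token: str, claimed_workspace_id: str) -> bool:
    resp = requests.get(
        f"{host}/api/2.0/clusters/spark-versions",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    org_id = resp.headers.get("x-databricks-org-id")
    return org_id is not None and str(org_id) == str(claimed_workspace_id)
```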
- 1001 Views
- 1 replies
- 0 kudos
Custom Python package in Notebook task using bundle
Hi mates! In my company, we are moving our pipelines to Databricks bundles; our pipelines use a notebook that receives some parameters. This notebook uses a custom Python package to apply the business logic based on the parameters it receives. The thi...
- 1083 Views
- 0 replies
- 0 kudos
Spot label in pool even though the configuration selected is all on-demand
Why is there a spot label in the pool even though the configuration selected is all on-demand? Can someone explain?
- 509 Views
- 0 replies
- 0 kudos
Is there any specific error you are receiving when running the init script?
Is there any specific error you are receiving when running the init script? Does the run complete startup or fail due to the init script?
- 601 Views
- 0 replies
- 0 kudos
Error performing a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.s...
I'm trying to perform a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.sparkSession().sql(self.merge_query)
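For reference, a minimal sketch of the same pattern without going through the private _jdf handle; the source/target table names and merge condition are made up, and DataFrame.sparkSession is assumed to be available (it is a public attribute on recent PySpark versions).

```python
# MERGE inside a streaming foreachBatch. Table names and the join key are hypothetical.
def upsert_batch(micro_batch_df, batch_id):
    # Expose the micro-batch to SQL.
    micro_batch_df.createOrReplaceTempView("updates")

    # Use the session attached to the batch dataframe instead of
    # micro_batch_df._jdf.sparkSession().
    micro_batch_df.sparkSession.sql("""
        MERGE INTO main.default.target AS t
        USING updates AS s
        ON t.id = s.id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)

(spark.readStream.table("main.default.source")
      .writeStream
      .foreachBatch(upsert_batch)
      .option("checkpointLocation", "/tmp/checkpoints/upsert_demo")
      .start())
```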
- 746 Views
- 0 replies
- 0 kudos
ipywidgets not loading in Databricks Community Edition
Hi All, I am trying to run commands with ipywidgets but it just says: Loading widget... The same error occurs even when I re-run the cell. Databricks version used: 14.2
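As a sanity check, a minimal widget cell like the sketch below should render an interactive slider wherever ipywidgets are supported; if even this stays stuck at "Loading widget...", the problem lies with widget rendering in that environment rather than with the specific widget code.

```python
# Minimal ipywidgets smoke test: a slider that echoes its value when moved.
import ipywidgets as widgets
from IPython.display import display

slider = widgets.IntSlider(value=5, min=0, max=10, description="Test:")
output = widgets.Output()

def on_change(change):
    # Print the new slider value whenever it changes.
    with output:
        print("value:", change["new"])

slider.observe(on_change, names="value")
display(slider, output)
```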
- 10004 Views
- 2 replies
- 0 kudos
numpy.ndarray size changed, may indicate binary incompatibility
Hi All, I have installed the following libraries on my cluster (11.3 LTS, which includes Apache Spark 3.3.0 and Scala 2.12): numpy==1.21.4, flair==0.12. On executing `from flair.models import TextClassifier`, I get the following error: "numpy.ndarray size chan...
- 0 kudos
You have changed the numpy version, and presumably that is not compatible with other libraries in the runtime. If flair requires a later numpy, then use a later DBR runtime for best results, which already ships later numpy versions.
- 1290 Views
- 1 replies
- 0 kudos
Databricks Asset Bundle Behaviour for new workflows and existing workflows
Dear Community Members, I am trying to deploy a workflow using DAB. After deploying, if I update the same workflow with a different bundle name, it creates a new workflow instead of updating the existing one. Also, when I try to use sa...
- 0 kudos
@nicole_lu_PM: Do you have any suggestions or feedback for the above question? It would be really helpful if we could get some insights.
- 1213 Views
- 0 replies
- 0 kudos
Restore committed changes to Azure Databricks Git after abandoned pull request
I want to restore the committed changes (before-and-after view) in my branch. As this pull request was abandoned in Azure DevOps, the branch was not merged. Therefore, the modified notebooks still exist but not the commits. How can I retrieve aga...
- 538 Views
- 0 replies
- 0 kudos
Hi, how can one go about assessing the value created due to the implementation of DBRX?
Hi, how can one go about assessing the value created due to the implementation of DBRX?
- 614 Views
- 0 replies
- 0 kudos
How does Databricks differ from Snowflake with respect to AI tooling?
How does Databricks differ from Snowflake with respect to AI tooling?