- 13094 Views
- 3 replies
- 0 kudos
Space in Column names when writing to Hive
All, I have the following code: df_Warehouse_Utilization = ( spark.table("hive_metastore.dev_ork.bin_item_detail") .join(df_DIM_Bins, col('bin_tag') == df_DIM_Bins.BinKey, 'right') .groupby(col('BinKey')) .agg(count_distinct(when(col('serial_lo...
Hi, I have faced this issue a few times. When overwriting dataframes to the Hive catalog in Databricks, column names with spaces or special characters are not allowed by default. However, you can add an option statement to bypass that ru...
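A minimal sketch of what that option could look like, assuming a recent Delta/DBR version; only the dev_ork schema comes from the thread, and the output table and column names are placeholders:

```python
# Hedged sketch: enable Delta column mapping so column names with spaces or
# special characters can be written. On recent Delta/DBR versions the "delta."
# writer option below is applied as a table property at table creation;
# otherwise set it afterwards with
#   ALTER TABLE <table> SET TBLPROPERTIES ('delta.columnMapping.mode' = 'name')
df_with_spaces = spark.range(5).withColumnRenamed("id", "Bin Key")  # example column with a space

(
    df_with_spaces.write
    .format("delta")
    .mode("overwrite")
    .option("delta.columnMapping.mode", "name")
    .saveAsTable("hive_metastore.dev_ork.warehouse_utilization")  # placeholder table name
)
```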
- 14024 Views
- 11 replies
- 1 kudos
Cannot create an account to try Community Edition
Hi, Whenever I try to sign up for an account, I keep getting the following message - "an error has occurred. please try again later" - when I click on the button "get started with databricks community edition". Could you please let me know why this could...
I got the same problem when I tried to register or log in through the Community Edition link. But when I clicked the "Try Databricks" button in the top right corner of the https://www.databricks.com/ home page, I was able to register and log in successfully j...
- 938 Views
- 2 replies
- 0 kudos
spark_partition_id() - User does not have permission SELECT on anonymous function
I'm trying to verify the partitions assigned to rows. I'm running something like this: from pyspark.sql.functions import spark_partition_id df = spark.read.table("some.uc.table").limit(10) df = df.repartition(2) df = df.withColumn("partitionid", spar...
Hello @jes, I have validated your failure internally and found that there is already an internal request to address this behavior. Are you using a shared access mode cluster? This behavior does not appear to be observed when using single access mode...
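For reference, a minimal sketch of the snippet from the question; the table name is a placeholder, and spark_partition_id() is a standard PySpark function that may trigger the permission error above on shared access mode clusters:

```python
# Hedged sketch: inspect which partition each row lands in.
from pyspark.sql.functions import spark_partition_id

df = spark.read.table("some.uc.table").limit(10)  # placeholder table
df = df.repartition(2)
df = df.withColumn("partitionid", spark_partition_id())
df.groupBy("partitionid").count().show()
```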
- 1256 Views
- 4 replies
- 1 kudos
Connection type 'SALESFORCE' is not enabled. Please enable the connection to use it.
I'm trying to connect to Salesforce in Databricks. I'm following this: https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud#sql-1 and when I run the "Create Catalog..." step I see this error. How would I enable salesforc...
The reason you're getting this error is that the workspace is not enabled for the LakeFlow Connect preview. Could you please file a ticket with us, as we might require additional details. Please refer to: https://docs.databricks.com/en/resources/s...
- 1278 Views
- 1 replies
- 0 kudos
Databricks User Group
Are there any Databricks User Group meetups in the UK?
You can find some of the groups in EMEA here: https://community.databricks.com/t5/europe-middle-east-and-africa/ct-p/EMEA
- 2204 Views
- 1 replies
- 0 kudos
Resolved! Using Autoloader with merge
Hi Everyone, I have been trying to use Auto Loader with foreachBatch so that I can use MERGE INTO in Databricks, but I have been getting the error below. Error: Found error inside foreachBatch Python process. My code: from delta.tables import ...
It seems the columns of your join condition are not found. Are they in the dataframes/table? Also try to put the whole join condition in a single string: "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId"
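A minimal sketch of that pattern, assuming a Delta target table; the target table name, source path, source format, and checkpoint/schema locations are placeholders, and the join condition is written as a single string as suggested above:

```python
# Hedged sketch: Auto Loader + foreachBatch + MERGE into a Delta table.
from delta.tables import DeltaTable

def upsert_batch(micro_batch_df, batch_id):
    target = DeltaTable.forName(spark, "main.finance.journal_entries")  # placeholder target
    (
        target.alias("t")
        .merge(
            micro_batch_df.alias("s"),
            "s.JeHeaderId = t.JeHeaderId and s.JeLineId = t.JeLineId",
        )
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")                                   # placeholder format
    .option("cloudFiles.schemaLocation", "/Volumes/chk/journal_entries/_schema")  # placeholder
    .load("/Volumes/raw/journal_entries/")                                # placeholder path
    .writeStream
    .foreachBatch(upsert_batch)
    .option("checkpointLocation", "/Volumes/chk/journal_entries/")        # placeholder
    .trigger(availableNow=True)
    .start()
)
```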
- 2090 Views
- 1 replies
- 0 kudos
Connect to SQL Developer using Custom JDBC
Hello, I'm trying to connect Databricks SQL to SQL Developer using a custom JDBC URL. I'm getting an error with: jdbc:databricks:<server>:443;HttpPath=<HttpPath>;UID=token;PWD=<password> Regards, Naga
Hi, We are trying to test whether we can connect SQL Developer to Databricks. Did it work for you? Regards, Shalini
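Not from the thread, but for reference, a hedged sketch of the URL shape the Databricks JDBC driver generally expects (note the "//" after jdbc:databricks: and the transportMode/httpPath parameters); the hostname, HTTP path, and token are placeholders:

```python
# Hedged sketch: assemble a Databricks JDBC URL to paste into a custom JDBC field.
jdbc_url = (
    "jdbc:databricks://<server-hostname>:443/default;"
    "transportMode=http;ssl=1;AuthMech=3;"
    "httpPath=<http-path>;UID=token;PWD=<personal-access-token>"
)
print(jdbc_url)
```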
- 1772 Views
- 6 replies
- 0 kudos
Unity Catalog About Metastore
Registered in 2024/10 from the AWS Marketplace. We have created a customer-managed VPC and manually created the workspace. No specific metastore settings were made when the workspace was created. In the catalog screen of the account console, Unity Catalog...
Hello @tyorisoo, I hope you are doing well! The metastore manages metadata, namely catalog information, schema information, table information, function information, access control information, etc. In the current state, the metastore configuration is not done...
- 1766 Views
- 3 replies
- 0 kudos
Databricks CI/CD
Hi Team, How can we implement CI/CD in Databricks, and what are the best practices and tools to consider? Regards, Janga
Thank you so much for sharing the link.
- 2396 Views
- 2 replies
- 1 kudos
What is the quota limit for using create user token api?
Hi Community, I was going through this doc: https://docs.databricks.com/api/workspace/tokens/create and learned that there is a quota limit on how many tokens one can generate using the API POST /api/2.0/token/create; having breached the thre...
Hello @Surajv, Q1: What is the quota limit and how do you find it? The quota limit for creating user tokens via the API (POST /api/2.0/token/create) is essential for managing token usage. Each user can have multiple personal access tokens in a Databricks wo...
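For reference, a minimal sketch of calling the token-create endpoint referenced above; the workspace host and the token used for authentication are placeholders:

```python
# Hedged sketch: create a personal access token via POST /api/2.0/token/create.
import requests

host = "https://<workspace-host>"                 # placeholder
auth_token = "<existing-personal-access-token>"   # placeholder

resp = requests.post(
    f"{host}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {auth_token}"},
    json={"lifetime_seconds": 3600, "comment": "short-lived example token"},
)
resp.raise_for_status()
print(resp.json()["token_value"])  # the token value is only returned once
```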
- 1926 Views
- 2 replies
- 0 kudos
Resolved! Can I give different git branches in the same repo for different tasks in a data bricks workflow
I have 2 tasks (T1 & T2) that run in branch B1 of Repo1. I have created a new task (which depends on T2) that points to a different branch B2 of the same Repo1. Is it possible to run them in the same workflow pipeline? When I tried to set this up, Databricks c...
I was able to find a workaround: I created separate jobs for the tasks that need to be on a different branch (the testing tasks) and then ran all of them from a new job.
- 1725 Views
- 1 replies
- 0 kudos
Resolved! Setting a preset list of values in a task parameter in databricks job
I want to be able to have a user select from a preset list of values for a task parameter when they kick off a job with the "Run now with different parameters" option. In a notebook I am able to use dbutils.widgets.dropdown() to set the list of value...
Unfortunately, providing a dropdown list for job parameters is not currently available. You can always do a "Run now with different parameters", but the user will have to change them manually rather than pick from a predefined list.
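At the notebook level, the dropdown mentioned in the question is available through widgets; a minimal sketch, with the widget name and choices as placeholders:

```python
# Hedged sketch: a preset list of values inside a notebook via dbutils.widgets.
dbutils.widgets.dropdown("environment", "dev", ["dev", "qa", "prod"], "Environment")
environment = dbutils.widgets.get("environment")
print(f"Selected environment: {environment}")
```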
- 721 Views
- 1 replies
- 0 kudos
New icon for SQL Editor looks like a broken image
Hey - I may be showing my age here, but I felt compelled to point out that at a glance, the new icon for a SQL Editor tab in the Databricks UI looks an awful lot like a broken image link icon, from the days of Internet Explorer. This, subconsciously,...
Is this still showing a broken image? Is this only happening in Explorer? If you try Chrome, for example, does it work? Can you share a screenshot of your workspace so we can better understand how it shows?
- 829 Views
- 1 replies
- 0 kudos
Databricks Initial Costs AWS
I have a new premium account. I set up a cost dashboard (see attached) after I created a new workspace using the AWS Quickstart, and I see some costs. Why do I have these costs if I am not using Databricks at all? How can I reduce them?
Are you seeing this data from the Usage tab in the Account console? Does it allow you to filter it by SKU?
- 1547 Views
- 2 replies
- 1 kudos
Resolved! Internal Error with MERGE Command in Spark SQL
I'm trying to perform a MERGE between two tables (customers and customers_update) using Spark SQL, but I'm encountering an internal error during the planning phase. The error message suggests it might be a bug in Spark or one of the plugins in use. He...
The issue you encountered with the MERGE statement in Spark SQL, which was resolved by specifying the database and metastore, is likely related to how Spark handles table references during the planning phase. The internal error you faced suggests a b...
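A minimal sketch of the fix described above (fully qualifying both tables in the MERGE); the catalog and schema names below are placeholders, and the join key is assumed:

```python
# Hedged sketch: MERGE with fully qualified table references so planning
# resolves both tables unambiguously.
spark.sql("""
    MERGE INTO hive_metastore.sales.customers AS t
    USING hive_metastore.sales.customers_update AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```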