- 8924 Views
- 8 replies
- 0 kudos
Unable to reactivate an inactive user
Hi all, I am facing an issue with reactivating an inactive user. I tried the following JSON with the Databricks CLI: run_update = { "schemas": [ "urn:ietf:params:scim:api:messages:2.0:PatchOp" ], "Operations": [ { "op": "replace", "path": "ac...
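For context, a PatchOp request like the one described above can be sent directly to the workspace SCIM Users endpoint. The sketch below uses the Python requests library; the workspace URL, token, and user id are hypothetical placeholders, and the exact payload shape should be verified against the SCIM API documentation.

```python
# Sketch only: host, token, and user_id are hypothetical placeholders.
import requests

host = "https://<workspace-url>"
token = "<personal-access-token>"
user_id = "<scim-user-id>"

payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [
        {"op": "replace", "path": "active", "value": True}  # set the user back to active
    ],
}

resp = requests.patch(
    f"{host}/api/2.0/preview/scim/v2/Users/{user_id}",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.status_code)
```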
@FunkybunchOO Thank you for your response! I will look into other connections, but we are not currently using SCIM. There must be something similar blocking the activation.
- 0 kudos
- 633 Views
- 0 replies
- 0 kudos
UCX
Hey folks! I want to know which features UCX does not provide for UC, especially for Hive-to-UC migration, that can be done manually but not with UCX. As UCX is still under active development there are quite a few drawbacks, so can someone share t...
- 472 Views
- 0 replies
- 0 kudos
databricks billing
Is there a way to use a business checking account to pay for Databricks services?
- 1336 Views
- 2 replies
- 0 kudos
Resolved! Translating XMLNAMESPACE in SQL Databricks
We are loading a data source that contains XML. I am translating their queries to create views in Databricks. They use 'XMLNAMESPACES' to construct/parse XML. Below is an example. What is best practice for translating 'XMLNAMESPACES' in Databricks?...
Hi @TinaN, To handle XMLNAMESPACES in Databricks, use the from_xml function for parsing XML data, where you can define namespaces within your parsing logic. Start by reading the XML data using spark.read.format("xml"), then apply the from_xml functio...
- 0 kudos
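A minimal sketch of the approach the reply describes, assuming the XML reader available on Databricks; the rowTag value and source path are hypothetical. Note there is no direct equivalent of XMLNAMESPACES: namespace-prefixed element names typically just appear as-is in the inferred schema.

```python
# Sketch only: the rowTag and source path are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the namespaced XML; prefixed element names usually show up verbatim in the schema.
df = (
    spark.read.format("xml")
    .option("rowTag", "Order")            # element that maps to one row
    .load("/Volumes/main/raw/orders/")    # hypothetical source path
)

# Expose the parsed data as a view so the original SQL can be translated against it.
df.createOrReplaceTempView("orders_xml")
spark.sql("SELECT * FROM orders_xml LIMIT 10").show(truncate=False)
```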
- 705 Views
- 1 replies
- 0 kudos
Can I load the files based on the data in my table as a variable without iterating through each row?
Hi, I have created this table which contains the data that I need for my source path and target table. source_path: /data/customer/sid={sid}/abc=1/attr_provider={attr_prov}/source_data_provider_code={src_prov}/ So basically, the values of each row are c...
Hi @zll_0091, To efficiently load only the necessary files without manually iterating through each row of your table, you can use Spark's DataFrame operations. First, read your table into a DataFrame and determine the maximum key value. Then, filter ...
- 0 kudos
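A rough sketch of that pattern, with hypothetical table, column, and format names (source_config, key, source_path, parquet) standing in for the poster's setup:

```python
# Sketch only: table name, column names, and file format are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

config = spark.table("main.default.source_config")            # table holding one source_path per row
max_key = config.agg(F.max("key").alias("k")).collect()[0]["k"]

# Collect the distinct paths for the rows of interest instead of looping row by row.
paths = [
    r["source_path"]
    for r in config.filter(F.col("key") == max_key)
                   .select("source_path")
                   .distinct()
                   .collect()
]

# spark.read.load accepts a list of paths, so all files are loaded in a single call.
df = spark.read.format("parquet").load(paths)
```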
- 1370 Views
- 2 replies
- 0 kudos
Databricks Certification exam got Suspended - Need Support
Hello Team, @Cert-Team, @Cert-TeamOPS, I had a very bad experience while attempting my first Databricks certification. I was asked to exit the exam multiple times by the support team, citing technical issues. My test got rescheduled multiple times with...
Hi @ozbieG, I'm sorry to hear your exam was suspended. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours to resolve it. In the meantime, you can review the following documentation: Room requirements, Behaviour...
- 0 kudos
- 1151 Views
- 1 replies
- 0 kudos
What API Testing Tool Do You Use?
Hi Databricks! I am a relatively new developer looking for a solid API testing tool. I am interested in hearing from other developers, new or experienced, about their experiences with API testing tools, regardless of whether they are good or bad. I've...
Hi @bytetogo, in my daily work I use Postman. It has a user-friendly interface, supports automated testing, and has support for popular patterns and libraries. It is also compatible with Linux, macOS, and Windows.
- 0 kudos
- 4789 Views
- 1 replies
- 0 kudos
Databricks book recommendations
Hi all, I am very new to Databricks. I am looking for any good book recommendations that can help me get started. I know there is a vast amount of resources available online, but I feel a book will give me a structured approach to getting started. Any book recommendati...
Hi @uniqueusername, I would start with books that teach you Spark: Learning Spark, 2nd Edition by Jules S. Damji, Brooke Wenig, Tathagata Das, and Denny Lee; and Data Analysis with Python and PySpark by Jonathan Rioux. After you learn the Spark foundation, o...
- 0 kudos
- 1326 Views
- 3 replies
- 0 kudos
Unable to create a workspace
To complete a tutorial requires a workspace. The directions for the quickstart are outdated and do not match AWS. AWS has its own guide, but CloudFormation requires email ...
Now I get: Redirecting to: https://accounts.cloud.databricks.com/login/password?next_url=%2Fapi%2F2.
- 0 kudos
- 1147 Views
- 1 replies
- 0 kudos
Job Cluster best practices for production workloads
Hi All, can you please share the best practices for job cluster configurations for production workloads, and which is better between serverless and job clusters in production in terms of cost and performance? Regards, Phani
Hi @Phani1, For configuring job clusters for production workloads in Databricks, follow these best practices: match cluster size to workload needs, enable autoscaling for dynamic adjustment of worker nodes, use spot instances with a fallback to on-de...
- 0 kudos
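As a concrete illustration of those guidelines, here is a sketch using the databricks-sdk Python package to create a job with an autoscaling job cluster; the runtime version, node type, worker counts, notebook path, and job name are all assumptions to adapt.

```python
# Sketch only: runtime version, node type, sizes, notebook path, and job name are assumptions.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()  # reads host/token from the environment or ~/.databrickscfg

new_cluster = compute.ClusterSpec(
    spark_version="15.4.x-scala2.12",                       # pin a tested LTS runtime
    node_type_id="Standard_D4ds_v5",                        # size the node to the workload
    autoscale=compute.AutoScale(min_workers=2, max_workers=8),
)

job = w.jobs.create(
    name="nightly-etl",                                     # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="main",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/etl/main"),
            new_cluster=new_cluster,                        # job cluster, created per run
        )
    ],
)
print(job.job_id)
```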
- 1119 Views
- 0 replies
- 0 kudos
Databricks visualization data labels
I have been using the built-in visualizations for a lot of different use cases, and they have been working well instead of using third-party libraries. Recently I needed to customize the data labels, but I haven't seen anything in the documentation about how to do that. If...
- 1470 Views
- 3 replies
- 1 kudos
How to return the function result instead of the function syntax of a variable?
Hi, I'm trying to get a certain value of my variable in the for loop, but it's returning the syntax instead of the value. Also, is it possible to convert this value to an integer? Thanks
Hi @zll_0091, could you provide more code? What's inside the dsfd variable? What's your expected outcome?
- 1 kudos
- 1082 Views
- 1 replies
- 1 kudos
Azure Databricks add repo
While adding a repo in the Databricks workspace I am getting the error 'Error creating repo: The Azure container does not exist'. Please see the attached screenshot. Can anyone suggest a fix?
There are three possible causes: the Azure container might not have been properly created when the workspace was provisioned; the Azure container might have been deleted or moved after it was created; or there might be a problem with the permissions or r...
- 1 kudos
- 3169 Views
- 6 replies
- 4 kudos
Cannot Create databricks account
I am trying to create a Databricks Community account, but after providing all the information and completing the puzzle it shows me "an error occurred". I also recorded the network request for the error: Header: Request URL: https://www.databricks.c...
@All I am trying it from Bangladesh. Are there any country-wise restrictions?
- 4 kudos
- 2098 Views
- 4 replies
- 0 kudos
Cannot sign-in at accounts.cloud.databricks.com
Hi, I have registered for Community Edition and can access it with no problems through https://community.cloud.databricks.com/login.html. Now I'm interested in completing the free "lakehouse fundamentals" training here and taking the quiz to get the ba...
Hi @slechtd, @qiuqiu, you can't log in there because that is the login page for Databricks customers. You should use the Community Edition login, shown at the bottom left of the screen below. Furthermore, to get the Databricks fundamentals accreditation you need...
- 0 kudos