- 1188 Views
- 1 replies
- 1 kudos
Instance profiles are not working in Shared access mode
I’m trying to fetch billing data from an AWS account using boto3 to assume a role that has access to this information. This operation works fine in No Isolation and Single User access modes, but it fails in Shared access mode. Since I need to store t...
Instance profiles are open to every user on the cluster, so they won't work in Shared access mode. Use storage credentials to create external locations instead; that is a cleaner way to govern access.
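A minimal sketch of that approach, assuming a Unity Catalog metastore and an existing storage credential; the names `billing_cred`, `billing_loc`, the bucket path, and the group are placeholders, not from the thread:

```python
# Sketch only: create an external location over the billing bucket and grant read access,
# so a Shared access mode cluster can read the path through Unity Catalog governance.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS billing_loc
  URL 's3://my-billing-bucket/exports/'
  WITH (STORAGE CREDENTIAL billing_cred)
""")

# Grant read access on the governed path to the group that needs it.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION billing_loc TO `data-analysts`")

# Read the data from a Shared access mode cluster.
df = spark.read.format("csv").option("header", "true").load("s3://my-billing-bucket/exports/")
display(df)
```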
- 4521 Views
- 4 replies
- 1 kudos
Resolved! Can we get the notebook owner using the notebook path as a parameter in the API?
I need to get the notebook owner via the API, or some other way, by passing the notebook path as a parameter.
Hi @RahulChaubey, did you find any solution to your problem?
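One possible approach, not confirmed in the thread: resolve the path to an object ID with the Workspace API, then list its ACLs with the Permissions API and look for the principal with CAN_MANAGE (typically the creator, though admins and groups can also hold it). The host, token, and notebook path below are placeholders:

```python
import requests

HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"
NOTEBOOK_PATH = "/Users/someone@example.com/my_notebook"
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1) Resolve the notebook path to its object ID.
status = requests.get(
    f"{HOST}/api/2.0/workspace/get-status",
    headers=headers,
    params={"path": NOTEBOOK_PATH},
).json()
object_id = status["object_id"]

# 2) List permissions on the notebook and print principals holding CAN_MANAGE.
perms = requests.get(
    f"{HOST}/api/2.0/permissions/notebooks/{object_id}",
    headers=headers,
).json()
for acl in perms.get("access_control_list", []):
    levels = {p["permission_level"] for p in acl.get("all_permissions", [])}
    if "CAN_MANAGE" in levels:
        print(acl.get("user_name") or acl.get("group_name") or acl.get("service_principal_name"))
```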
- 4002 Views
- 2 replies
- 2 kudos
Associating a Git Credential with a Service Principal using Terraform Provider (AWS)
I am attempting to create a Databricks Repo in a workspace via Terraform. I would like the Repo and the associated Git Credential to be associated with a Service Principal. In my initial run, the Terraform provider is associated with the user defined ...
Hi Kinger and Debi-Moha, do the steps in the "Use a service principal with Databricks Git folders" documentation work for you? Specifically for Terraform: https://docs.databricks.com/en/repos/ci-cd-techniques-with-repos.html#terraform-integration Th...
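For reference, a minimal non-Terraform sketch of the same idea: authenticate the REST calls as the service principal (for example with a token issued for it) and create the Git credential and repo under that identity via the standard Git Credentials and Repos APIs. The token, Git PAT, repo URL, and paths are placeholders:

```python
import requests

HOST = "https://<workspace-host>"
SP_TOKEN = "<token issued for the service principal>"  # placeholder
headers = {"Authorization": f"Bearer {SP_TOKEN}"}

# Because the calls authenticate as the service principal, the resulting
# Git credential and repo belong to it rather than to a human user.
requests.post(
    f"{HOST}/api/2.0/git-credentials",
    headers=headers,
    json={
        "git_provider": "gitHub",
        "git_username": "my-sp-bot",               # placeholder
        "personal_access_token": "<github-pat>",   # placeholder
    },
).raise_for_status()

requests.post(
    f"{HOST}/api/2.0/repos",
    headers=headers,
    json={
        "url": "https://github.com/my-org/my-repo.git",  # placeholder
        "provider": "gitHub",
        "path": "/Repos/my-sp/my-repo",                  # placeholder
    },
).raise_for_status()
```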
- 4160 Views
- 6 replies
- 3 kudos
Resolved! Data loss after writing a transformed PySpark dataframe to a Delta table in Unity Catalog
Hey guys, after some successful data preprocessing without any errors, I have a final dataframe with a shape of ~(200M, 150). The cluster I am using has sufficient RAM + CPUs + autoscaling, and all metrics looked fine after the job was done. The pr...
@szymon_dybczak I could resolve it now! Basically, I broke the process down into further subprocesses; for each subprocess, I cached the data and wrote it all into the Delta table (without overwriting), and the next subprocess needs to read the data from the Delta tabl...
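A rough sketch of that pattern, just to illustrate caching each stage and appending rather than overwriting; `source_df`, `my_preprocessing`, the chunking column, and the table name are made-up placeholders:

```python
# Sketch only: split the work into chunks, cache each chunk, append it to the
# Delta table, and let the next stage read the accumulated result back.
target = "main.analytics.preprocessed_events"
num_chunks = 10

for i in range(num_chunks):
    chunk = (
        source_df
        .filter((source_df.id % num_chunks) == i)
        .transform(my_preprocessing)   # placeholder for the actual transformations
        .cache()
    )
    chunk.count()  # materialize the cache before writing

    # Append instead of overwrite, so earlier chunks are preserved.
    chunk.write.format("delta").mode("append").saveAsTable(target)
    chunk.unpersist()

# Downstream subprocesses read the accumulated result from the Delta table.
result = spark.table(target)
```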
- 4399 Views
- 3 replies
- 3 kudos
Display data as multi-line in dashboard table
I am displaying a table in a notebook dashboard. One column of the data is conceptually a list of strings. I can originate or convert the list into whatever format would be useful (as a string representing a JSON array, as an ARRAY struct, etc.). I w...
Hi @DavidKxx, what you can do is convert your array into an HTML-formatted string with bullet points. Here is the code: # Sample data with an array column data = [ (1, ['Apple', 'Banana', 'Cherry']), (2, ['Dug', 'Elephant']), (3, ['Fish...
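The snippet is cut off in the preview above; a minimal sketch of the idea, with illustrative sample data and column names:

```python
# Sample data with an array column (values are illustrative).
data = [
    (1, ["Apple", "Banana", "Cherry"]),
    (2, ["Dug", "Elephant"]),
    (3, ["Fish", "Giraffe"]),
]
df = spark.createDataFrame(data, ["id", "items"])

# Turn each array into an HTML <ul> bullet list and render the result with
# displayHTML so the bullets show up as real multi-line cells.
def to_bullets(items):
    return "<ul>" + "".join(f"<li>{i}</li>" for i in items) + "</ul>"

rows = "".join(
    f"<tr><td>{r['id']}</td><td>{to_bullets(r['items'])}</td></tr>"
    for r in df.collect()
)
displayHTML(f"<table border='1'><tr><th>id</th><th>items</th></tr>{rows}</table>")
```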
- 2379 Views
- 1 replies
- 0 kudos
Word wrap in dashboards
When I'm displaying a Table-style visualization in a notebook dashboard, is there a setting I can apply to a text column so that it automatically word-wraps text longer than the display width of the column? For example, in the following dashboard disp...
Hi @DavidKxx, that is quite a similar question to the one about displaying an array as a bullet list. Since you were successful in implementing displayHTML, what do you think about doing something similar in this case? # Sample DataFrame with long text data = [ (1, '...
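Again, a minimal sketch of that displayHTML approach with illustrative text and styling: render the table yourself and set a CSS width and word-wrap on the text column.

```python
# Sample DataFrame with long text (content is illustrative).
data = [
    (1, "This is a fairly long description that would normally get cut off in a table cell."),
    (2, "Another long piece of text that we want to wrap onto multiple lines instead of truncating."),
]
df = spark.createDataFrame(data, ["id", "description"])

# Build an HTML table with a fixed column width so the text wraps.
cell_style = "max-width:300px; white-space:normal; word-wrap:break-word;"
rows = "".join(
    f"<tr><td>{r['id']}</td><td style='{cell_style}'>{r['description']}</td></tr>"
    for r in df.collect()
)
displayHTML(f"<table border='1'><tr><th>id</th><th>description</th></tr>{rows}</table>")
```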
- 630 Views
- 0 replies
- 0 kudos
Databricks integration with Canada Post
Hello everyone, I want to validate whether the postal codes in my data are valid according to the Canada Post postal code directory. Assuming we have a subscription to the Canada Post APIs, how can we bring the postal code data into Databricks? Thanks
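There were no replies, but one generic pattern is to call the validation API from a notebook or job and land the results in a table for joining against the source data. The endpoint, auth header, response shape, and table names below are heavily hypothetical placeholders, not the real Canada Post API:

```python
import requests

# Placeholder endpoint and key: substitute the actual Canada Post API you subscribe to.
API_URL = "https://example.com/canadapost/validate"   # hypothetical
API_KEY = "<api-key>"

postal_codes = [
    r["postal_code"]
    for r in spark.table("main.raw.customers").select("postal_code").distinct().collect()
]

results = []
for code in postal_codes:
    resp = requests.get(
        API_URL,
        params={"postal_code": code},
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    # Hypothetical response shape: {"valid": true/false}
    results.append((code, resp.json().get("valid", False)))

# Land the lookup results in a Delta table for joining against the source data.
spark.createDataFrame(results, ["postal_code", "is_valid"]) \
     .write.mode("overwrite").saveAsTable("main.ref.postal_code_validation")
```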
- 553 Views
- 0 replies
- 0 kudos
setting defaultValue for runtime_engine policy
Hi, I want to set the runtime_engine default value to STANDARD, but a user can also select PHOTON if they want. Something like below. Can anyone please verify if that works? "runtime_engine": {"type": "allowlist", "values": ["STANDARD", "PHOTON"], "defaultValu...
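The fragment is cut off above; completing it the way the post describes (assuming standard cluster policy syntax, and pushing it through the Cluster Policies REST API purely as an example; host and token are placeholders) would look roughly like this:

```python
import json
import requests

# The policy fragment from the post, completed with the default the author wants.
definition = {
    "runtime_engine": {
        "type": "allowlist",
        "values": ["STANDARD", "PHOTON"],
        "defaultValue": "STANDARD",
    }
}

# Example of creating the policy via the REST API; host/token are placeholders.
HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"
requests.post(
    f"{HOST}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"name": "standard-default-policy", "definition": json.dumps(definition)},
).raise_for_status()
```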
- 1844 Views
- 3 replies
- 0 kudos
Lost untracked and gitignored files when pushing a new commit
Hi guys, our team uses Databricks clusters to develop with Python using Jupyter Notebooks. Recently we had a serious problem in our repos folder (mainly Jupyter Notebooks and Python scripts). After my teammate committed in the branch, all the tracked ...
Yes, we have a subscription. Please tell me which information you need to check. I want to dive into this problem. If possible, please contact me through my email.
- 21908 Views
- 4 replies
- 0 kudos
Connecting live Google Sheets data to Databricks
Hi! So we have live Google Sheets data that gets updated on an hourly/daily basis, and we want to bring it into Databricks as a live/scheduled connection for further analysis, together with other tables and views present there. Do you have any sugges...
Thanks @Ajay-Pandey! Appreciate your reply. I am new to Databricks, apologies, but I wonder if it's possible to put this live data into a table under a specific catalog and schema, such that the table will reflect the live data in the Google Sheet?
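One common pattern, not from the thread: if the sheet is shared so its CSV export URL is reachable, a small scheduled job can re-read the sheet and overwrite a managed table. The sheet ID, gid, and table name below are placeholders, and the result is a periodic snapshot rather than a truly live link:

```python
import pandas as pd

# CSV export URL of the (shared) Google Sheet; sheet ID and gid are placeholders.
SHEET_URL = "https://docs.google.com/spreadsheets/d/<sheet-id>/export?format=csv&gid=0"

# Read the current sheet contents and overwrite a Unity Catalog table.
pdf = pd.read_csv(SHEET_URL)
sdf = spark.createDataFrame(pdf)
sdf.write.mode("overwrite").saveAsTable("main.analytics.google_sheet_snapshot")

# Scheduling this notebook as an hourly/daily job keeps the table in sync
# with the sheet at that cadence.
```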
- 2295 Views
- 5 replies
- 1 kudos
My Databricks Exam got suspended
Hi Databricks Team, my certification exam got suspended today. I started my exam as normal, and then my exam was put on hold quoting "team from support needs to talk with you". I connected with the support team and showed my exam area, and everything ve...
Hi @Cert-TeamOPS, my exam was rescheduled today and I took my exam, and still it got suspended; it's very sad that this is happening to me again. I have complied with all the rules set by the exam. The question and options are on the complete left of the sc...
- 5689 Views
- 3 replies
- 1 kudos
#Table object name limits
Hi, is there a limit on the number of characters a table object name can have? If so, please provide the source where this information can be found.
Hi @Chughes1408, the table name can have up to 255 characters. The source: Names - Azure Databricks - Databricks SQL | Microsoft Learn
- 1684 Views
- 1 replies
- 0 kudos
Databricks Data Engineer Associate exam got suspended, need urgent help
I am writing to request a review of my recently suspended exam. I believe that my situation warrants reconsideration, and I would like to provide some context for your understanding. I applied for the Databricks Certified: Data Engineer Associate certific...
Hello @trivedi5678, thank you for reaching out to Databricks Community support! We know how frustrating it must be for you. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours for a resolution. In the meant...
- 857 Views
- 0 replies
- 1 kudos
Testing and Issues Related to Admin Role Changes
Hello, I would like to ask a question regarding user permissions. Currently, all team members are admins. We plan to change the admin roles so that only I and another user, A, will be admins. The other members will retain general usage permis...
- 921 Views
- 1 replies
- 0 kudos
Unable to sign up for Databricks Community Edition.
Hi, I am getting an "An error has occurred. Please try again later" error while creating a Databricks Community Edition account. The steps I followed are: 1. provided the details like name, email, etc.; 2. on the next page, upon clicking Get Started with Community Ed...
I tried creating an account with a personal email, and it worked smoothly. Please try it in incognito mode, in case any extensions are stopping it from working.