- 1306 Views
- 6 replies
- 3 kudos
Resolved! Data loss after writing a transformed PySpark DataFrame to a Delta table in Unity Catalog
Hey guys, after some successful data preprocessing without any errors, I have a final DataFrame with a shape of ~(200M, 150). The cluster I am using has sufficient RAM + CPUs + autoscaling, and all metrics looked fine after the job was done. The pr...
@szymon_dybczak I could resolve it now! Basically, I broke the process down into further subprocesses; for each subprocess, I cached the results and wrote them all into a Delta table (without overwriting), and the next subprocess needs to read the data from the Delta tabl...
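An illustrative PySpark sketch of that staging pattern, assuming a Unity Catalog staging table (the name `main.staging.intermediate_results` is a placeholder) and append-only writes between subprocesses:

```python
from pyspark.sql import functions as F

# Placeholder staging table; replace with your own catalog.schema.table.
STAGING_TABLE = "main.staging.intermediate_results"

def run_subprocess(batch_df, batch_id: int):
    """Transform one chunk, cache it, and append it to the staging Delta table."""
    transformed = (
        batch_df
        .withColumn("batch_id", F.lit(batch_id))  # tag rows so later steps can filter by batch
        .cache()                                   # materialize before writing
    )
    transformed.count()  # force the cache to populate
    (
        transformed.write
        .format("delta")
        .mode("append")                            # append; do not overwrite earlier batches
        .saveAsTable(STAGING_TABLE)
    )
    transformed.unpersist()

# The next subprocess reads everything written so far back from Delta,
# instead of carrying the whole ~200M-row DataFrame through one long lineage.
staged = spark.read.table(STAGING_TABLE)
```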
- 969 Views
- 3 replies
- 3 kudos
Display data as multi-line in dashboard table
I am displaying a table in a notebook dashboard. One column of the data is conceptually a list of strings. I can produce or convert the list into whatever format would be useful (as a string representing a JSON array, as an ARRAY struct, etc.). I w...
Hi @DavidKxx, what you can do is convert your array into an HTML-formatted string with bullet points. Here is the code: # Sample data with an array column data = [ (1, ['Apple', 'Banana', 'Cherry']), (2, ['Dug', 'Elephant']), (3, ['Fish...
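A self-contained sketch of that approach, filling in the truncated snippet above on the assumption that the goal is to render each array as an HTML bullet list via displayHTML (column names and the final table markup are illustrative):

```python
from pyspark.sql import functions as F

# Sample data with an array column (mirrors the truncated snippet above)
data = [
    (1, ["Apple", "Banana", "Cherry"]),
    (2, ["Dog", "Elephant"]),
    (3, ["Fish"]),
]
df = spark.createDataFrame(data, ["id", "items"])

# Turn each array into an HTML <ul> bullet list
html_df = df.withColumn(
    "items_html",
    F.concat(
        F.lit("<ul><li>"),
        F.concat_ws("</li><li>", F.col("items")),
        F.lit("</li></ul>"),
    ),
)

# Build a small HTML table and render it; displayHTML interprets the markup,
# so each cell shows a multi-line bullet list instead of a flat string.
rows = "".join(
    f"<tr><td>{r['id']}</td><td>{r['items_html']}</td></tr>"
    for r in html_df.select("id", "items_html").collect()
)
displayHTML(f"<table border='1'><tr><th>id</th><th>items</th></tr>{rows}</table>")
```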
- 530 Views
- 1 replies
- 0 kudos
Word wrap in dashboards
When I'm displaying a Table-style visualization in a notebook dashboard, is there a setting I can apply to a text column so that it automatically word-wraps text longer than the display width of the column? For example, in the following dashboard disp...
Hi @DavidKxx, that is quite a similar question to the one about displaying an array as a bullet list. Since you were successful in implementing displayHTML, what do you think about doing something similar in this case? # Sample DataFrame with long text data = [ (1, '...
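A minimal sketch of what that could look like, assuming CSS-based wrapping inside a displayHTML table is acceptable (the sample text and the column width are made up):

```python
# Sample DataFrame with long text (illustrative values)
data = [
    (1, "This is a fairly long description that would normally be cut off in a narrow dashboard column."),
    (2, "Another long piece of text that we would like to wrap onto multiple lines instead of truncating."),
]
df = spark.createDataFrame(data, ["id", "description"])

# Render as an HTML table whose cells wrap; the max-width and word-wrap CSS
# force long text onto multiple lines within the column.
cell_style = "max-width:300px; word-wrap:break-word; white-space:normal; vertical-align:top;"
rows = "".join(
    f"<tr><td>{r['id']}</td><td style='{cell_style}'>{r['description']}</td></tr>"
    for r in df.collect()
)
displayHTML(f"<table border='1'><tr><th>id</th><th>description</th></tr>{rows}</table>")
```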
- 295 Views
- 0 replies
- 0 kudos
Databricks integration with Canada Post
Hello everyone, I want to validate whether the postal codes in my data are valid postal codes per the Canada Post postal code directory. Assuming we have a subscription to the Canada Post APIs, how can we bring the postal code data into Databricks? Thanks
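One possible sketch for this: do a cheap format check in Spark first and only call the address-validation service for the rows that pass it. The endpoint URL, API-key parameter, and table name below are hypothetical placeholders, not the actual Canada Post API:

```python
import re
import requests
from pyspark.sql import functions as F
from pyspark.sql.types import BooleanType

@F.udf(returnType=BooleanType())
def looks_like_postal_code(code):
    """Cheap local format check (A1A 1A1) before hitting any external service."""
    if not code:
        return False
    return bool(re.match(r"^[A-Za-z]\d[A-Za-z]\s?\d[A-Za-z]\d$", code.strip()))

def validate_with_canada_post(code: str, api_key: str) -> bool:
    """Hypothetical call to a Canada Post validation endpoint.

    The URL and parameters are placeholders; consult the actual Canada Post
    API documentation for the real endpoint and authentication scheme.
    """
    resp = requests.get(
        "https://example-canadapost-api/validate",  # placeholder URL
        params={"postalCode": code, "key": api_key},
        timeout=10,
    )
    return resp.ok and resp.json().get("valid", False)

# Hypothetical source table; only rows passing the format check would need an API call.
df = spark.table("my_catalog.my_schema.customers")
df = df.withColumn("format_ok", looks_like_postal_code(F.col("postal_code")))
```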
- 277 Views
- 0 replies
- 0 kudos
setting defaultValue for runtime_engine policy
Hi, I want to set the runtime_engine default value to STANDARD, but a user can also select PHOTON if they want. Something like below. Can anyone please verify if that works? "runtime_engine": {"type": "allowlist", "values": ["STANDARD", "PHOTON"], "defaultValu...
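For reference, a sketch of how the complete fragment might look, assuming the standard cluster-policy allowlist syntax; whether isOptional is needed on top of defaultValue is an assumption worth verifying against the cluster policy documentation:

```python
import json

# Assumed cluster-policy fragment: "allowlist" restricts the choice to the listed
# values, and "defaultValue" pre-selects STANDARD while still letting a user pick
# PHOTON. "isOptional": True is an assumption -- it should let the attribute be
# omitted entirely and fall back to the default.
runtime_engine_policy = {
    "runtime_engine": {
        "type": "allowlist",
        "values": ["STANDARD", "PHOTON"],
        "defaultValue": "STANDARD",
        "isOptional": True,
    }
}

# JSON that could be pasted into the policy definition for review.
print(json.dumps(runtime_engine_policy, indent=2))
```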
- 714 Views
- 3 replies
- 0 kudos
Lost untracked and gitignored files when pushing a new commit
Hi guys, our team uses Databricks clusters to develop in Python using Jupyter Notebooks. Recently we had a serious problem in our Repos folder (mainly Jupyter Notebooks and Python scripts). After my teammate committed to the branch, all the tracked ...
Yes, we have a subscription. Please tell me which information you need to check. I want to dive into this problem. If possible, please contact me through my email.
- 15764 Views
- 4 replies
- 0 kudos
Connecting live Google Sheets data to Databricks
Hi! So we have live Google Sheets data that gets updated on an hourly/daily basis, and we want to bring it into Databricks as a live/scheduled connection for further analysis, together with other tables and views present there. Do you have any sugges...
Thanks @Ajay-Pandey! Appreciate your reply. I am new to Databricks, apologies, but I wonder if it's possible to put this live data into a table under a specific catalog and schema, such that the table will reflect the live data in the Google Sheet?
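One common pattern for this (a sketch, assuming the sheet is shared or published so its CSV export URL is reachable; the sheet ID and target table name are placeholders) is to pull the CSV export on a schedule and overwrite a Unity Catalog table:

```python
import pandas as pd

# Placeholder sheet ID and tab (gid); the sheet must be shared/published for this export URL to work.
SHEET_ID = "your-google-sheet-id"
CSV_URL = f"https://docs.google.com/spreadsheets/d/{SHEET_ID}/export?format=csv&gid=0"

# Read the current sheet contents and convert to a Spark DataFrame.
pdf = pd.read_csv(CSV_URL)
sdf = spark.createDataFrame(pdf)

# Overwrite a Unity Catalog table so downstream queries always see the latest snapshot.
# Scheduling this notebook as a Databricks job (hourly/daily) keeps the table in sync.
sdf.write.mode("overwrite").saveAsTable("main.analytics.google_sheet_snapshot")
```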
- 1264 Views
- 5 replies
- 1 kudos
My Databricks Exam got suspended
Hi Databricks Team, my certification exam got suspended today. I started my exam as normal, and then my exam was put on hold with the message that the "team from support needs to talk with you". I connected with the support team and showed my exam area, and everything ve...
Hi @Cert-TeamOPS, my exam was rescheduled today and I took it, and still it got suspended; it's very sad that this is happening to me again. I have complied with all the rules set by the exam. The question and options are on the complete left of the sc...
- 621 Views
- 3 replies
- 1 kudos
#Table object name limits
Hi, is there a limit on the number of characters a table object name can have? If so, please provide the source where this information can be found.
Hi @Chughes1408, the table name can have up to 255 characters. The source: Names - Azure Databricks - Databricks SQL | Microsoft Learn
- 380 Views
- 1 replies
- 0 kudos
Databricks Data Engineer Associate exam got suspended, need urgent help
I am writing to request a review of my recently suspended exam. I believe that my situation warrants reconsideration, and I would like to provide some context for your understanding. I applied for the Databricks Certified: Data Engineer Associate certific...
Hello @trivedi5678, thank you for reaching out to Databricks Community support! We know how frustrating this must be for you. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours for a resolution. In the meant...
- 405 Views
- 1 replies
- 0 kudos
Unable to sign up for Databricks Community Edition
Hi, I am getting the error "An error has occurred. Please try again later" while creating a Databricks Community Edition account. The steps I followed are: 1. provided the details like name, email, etc.; 2. on the next page, upon clicking Get started with Community Ed...
I tried creating an account with a personal email, and it worked smoothly. Please try it in incognito mode in case any extensions are stopping it from working.
- 8701 Views
- 2 replies
- 2 kudos
Databricks Asia-Pacific LLM Cup 2023 - So you think you can hack?
You are invited to participate in Databricks Asia-Pacific LLM Cup 2023, an exciting virtual hackathon which kicks off in the month of October. Businesses across Asia-Pacific and Japan will have the opportunity to build, iterate and polish your LLM id...
Hi! This sounds like a great opportunity for anyone interested in machine learning and wanting to develop their skills through real projects. If you are looking for additional resources to promote your ideas or want to increase the effectiveness of...
- 2721 Views
- 4 replies
- 2 kudos
Editor lags
Is anyone else experiencing lag in the Databricks notebook editor? The typing is sometimes very slow and causes the cell to hang for a moment. (Using Python)
The notebook editor has been sluggish for me regardless of how I split code between cells and which language I use. Very frustrating.
- 656 Views
- 2 replies
- 0 kudos
CICD Unity Catalog
Hello, how do you handle deploying Databricks Unity Catalog resources (schemas, tables, views, permissions) across environments? What are the strategies for building (compiling), validating, and deploying these resources and ensuring they're error-free a...
Thanks, yes, DABs can be used, but I'm still wondering how to validate the syntax, dependencies, permissions, etc. before deploying the DAB. Example: when you deploy a SQL Database DACPAC file, first you need to build the project and generate a DACPAC...
- 834 Views
- 3 replies
- 2 kudos
Resolved! Specify cluster when using dbutils.notebook API
Hello! Does anyone know if there is a way of specifying a cluster when using dbutils.notebook.run()? As I understand it, this command will create job compute for the run, but what if I want to use my general-purpose cluster? I have been thinking of %run b...
Hi @ErikJ, check the docs. You can also run the help method. The parameters of the method are: 1. path, 2. timeoutSeconds, 3. arguments.
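For reference, a minimal sketch of the call with those three parameters (the child notebook path, the timeout value, and the argument names are illustrative):

```python
# Run a child notebook and wait up to 600 seconds for it to finish.
# dbutils.notebook.run returns whatever the child notebook passes to dbutils.notebook.exit().
result = dbutils.notebook.run(
    "/Workspace/Users/someone@example.com/child_notebook",  # path
    600,                                                     # timeoutSeconds
    {"run_date": "2024-01-01", "mode": "full"},              # arguments
)
print(result)
```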