Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Policepatil
by New Contributor III
  • 6342 Views
  • 0 replies
  • 0 kudos

Is it good to process files in multithreading?

Hi, I need to process nearly 30 files from different locations and insert records into RDS. I am using multithreading to process these files in parallel, like below: def process_files(file_path): <process files here> 1. Find bad records based on fie...

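Multithreading is a reasonable fit for this workload, since the per-file work (reading files, inserting into RDS) is I/O-bound. A minimal sketch using Python's `concurrent.futures`; the `process_files` name comes from the post, while the file list, worker count, and placeholder body are assumptions for illustration:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def process_files(file_path):
    # Placeholder for the real work from the post: find bad records,
    # then insert the good ones into RDS. Here we just echo the path.
    return f"processed {file_path}"

# Hypothetical stand-in for the ~30 files from different locations.
file_paths = [f"/data/location_{i}/input.csv" for i in range(30)]

results, failures = [], []
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(process_files, p): p for p in file_paths}
    for fut in as_completed(futures):
        path = futures[fut]
        try:
            results.append(fut.result())
        except Exception as exc:  # one bad file should not kill the whole batch
            failures.append((path, exc))
```

Collecting failures per file, as above, is usually the important part: with 30 independent sources, you want the run to finish and report which files failed rather than abort on the first error.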
bachan
by New Contributor II
  • 1940 Views
  • 1 reply
  • 0 kudos

Data Insertion

Scenario: data moves from blob storage to a SQL database once a week. I have 15 days of data (from the current date to the next 15 days) in blob storage, stored date-wise in Parquet format, and after seven days the next 15 days of data will be inserted. That means until the 7th day t...

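With a 7-day cadence over a 15-day horizon, consecutive loads overlap, so the weekly job needs to work out which dates are new before inserting. A small sketch of that window arithmetic; the cadence and horizon come from the post, and the function name, dates, and watermark approach are assumptions:

```python
from datetime import date, timedelta

def dates_to_load(last_loaded: date, available_through: date):
    """Return the dates after last_loaded, up to what blob storage holds."""
    days = (available_through - last_loaded).days
    return [last_loaded + timedelta(days=i) for i in range(1, days + 1)]

# Example: the previous run loaded through Jan 7; a week later the
# blob store holds date-wise Parquet folders through Jan 22.
new_dates = dates_to_load(date(2024, 1, 7), date(2024, 1, 22))
```

Tracking a "last loaded date" watermark like this (in a control table, for instance) lets the weekly job insert only the unseen date partitions instead of re-reading the whole 15-day window.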
Sivaji
by New Contributor
  • 1224 Views
  • 1 reply
  • 0 kudos

Databricks Data Engineer Associate exam got suspended.

Hello Team, I had a very poor experience while attempting my first Databricks certification. The proctor abruptly asked me to show my desk; after I showed it, they asked multiple times, wasted my time, and then suspended my exam. I want to file a compla...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

Hi @Sivaji, sorry to hear you had a bad experience, and that you got a slow response here in the community. I see that you have taken and passed the exam. Congratulations! For the future, our support team handles cases from here first so it tends to be...

norbitek
by New Contributor II
  • 1955 Views
  • 1 reply
  • 0 kudos

Is it a bug in DEEP CLONE?

Hi, I'm trying to modify a Delta table using the following approach: 1. shallow clone of the table (source_table); 2. modification of the clone (clonned_table); 3. deep clone of the modified table back to the source table. The source Delta table has 26,752 rows. Current Delt...

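For reference, the sequence described in the post looks roughly like this in Databricks SQL. The table names follow the post (spelling normalized to cloned_table); the UPDATE statement is a purely hypothetical stand-in for whatever modification was actually made:

```sql
-- 1. Shallow clone the source (copies metadata only; data files stay shared)
CREATE OR REPLACE TABLE cloned_table SHALLOW CLONE source_table;

-- 2. Modify the clone (hypothetical example of a modification)
UPDATE cloned_table SET amount = 0 WHERE amount IS NULL;

-- 3. Deep clone the modified table back over the source
CREATE OR REPLACE TABLE source_table DEEP CLONE cloned_table;
```

Worth noting when debugging this: a shallow clone still references the source table's data files, so deep-cloning it back over its own source is a delicate sequence. Comparing row counts and `DESCRIBE HISTORY` output after each of the three steps should narrow down where the discrepancy appears.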
JRL
by New Contributor III
  • 2238 Views
  • 1 reply
  • 0 kudos

GitHub "Danger Zone"

There is a "Danger Zone" appearing in GitHub indicating that the repositories I share on Databricks should be suspended, and possibly that Databricks should be uninstalled. This may be something standard in GitHub. Has anyone run across it?

[Attachment: DangerZone.PNG]
Latest Reply
sean_owen
Databricks Employee
  • 0 kudos

It's not telling you that you should do these things; it's telling you that you may break stuff by doing them. Yes, the "Danger Zone" is a thing on GitHub; it tries to warn you before you do things like clicking to delete a repo.

Srikanthn
by New Contributor II
  • 1701 Views
  • 0 replies
  • 1 kudos

Unable to change/cast column datatype using Delta IO

I have created a Delta table using the Delta IO library, with the following details. Table name: Employee. Columns: {Id Integer, name String, gender String, Salary decimal(5,2)}. Now I want to upcast Salary from decimal(5,2) to decimal(10,4). If I use Delta IO ...

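Delta has generally treated a change of decimal precision/scale as an incompatible schema change, so an in-place `ALTER TABLE ... CHANGE COLUMN` typically won't do it. One common workaround is to rewrite the table with the column cast to the wider type; a sketch assuming the table and columns from the post:

```sql
-- Rewrite the table with Salary widened from DECIMAL(5,2) to DECIMAL(10,4)
CREATE OR REPLACE TABLE Employee AS
SELECT Id, name, gender, CAST(Salary AS DECIMAL(10,4)) AS Salary
FROM Employee;
```

Since this replaces the Delta table in place, earlier versions remain reachable via time travel within the retention period; `DESCRIBE TABLE Employee` afterwards should confirm the new column type.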
jermaineharsh
by New Contributor III
  • 1432 Views
  • 0 replies
  • 1 kudos

How to switch from free trial to Community Edition of Databricks in my Azure workspace?

Hello, I am trying to switch to Databricks Community Edition after a 14-day trial. I was able to register, but when I try to start my new cluster, I get an error message: "Cluster start feature is currently disabled, and the cluster does not run." In...

Picci
by New Contributor III
  • 5773 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks community edition still available?

Is the Databricks platform still available in its Community Edition (outside Azure, AWS, or GCP)? Can someone share the updated link? Thanks, Elisa

Latest Reply
jamescw
New Contributor II
  • 3 kudos

It is still available but, as far as I know, always linked to Azure/GCP/AWS.

2 More Replies
bento
by New Contributor
  • 16873 Views
  • 1 reply
  • 1 kudos

Resolved! Notebook Langchain ModuleNotFoundError: No module named 'langchain.retrievers.merger_retriever'

Hi, as mentioned in the title, I'm receiving this error despite running %pip install --upgrade langchain. The specific line of code: from langchain.retrievers.merger_retriever import MergerRetriever. All other langchain imports work when this is commented out. Same line w...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

More specifically: langchain releases a new update every few days, and it is likely that you are using code or a library that needs a later version of langchain than you have (or, perhaps, a later version that removed whatever part of langchain you r...

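A quick way to debug this class of error is to confirm which version of the package the notebook actually sees, since a `%pip install --upgrade` in one cell may not match what an already-running interpreter has imported. A small standard-library sketch; the helper name is made up, and the package probed at the end is deliberately nonexistent to show the "not installed" path:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# On the cluster you would check installed_version("langchain") against the
# version your code expects; here we probe a name that cannot be installed.
missing = installed_version("surely-not-a-real-package-name")
```

On Databricks, restarting the Python process after the install (for example with `dbutils.library.restartPython()`) helps ensure the newly installed version is the one that gets imported.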
ankit_batra1
by New Contributor
  • 2232 Views
  • 2 replies
  • 1 kudos

Databricks notebook execution using only one task

I am running a Databricks notebook. While it runs, I only see one task on one worker getting started. My cluster has a minimum of 6 workers, but it seems they are not being used. I am performing a read operation from Cosmos DB. Can someone please help me her...

Latest Reply
sean_owen
Databricks Employee
  • 1 kudos

If your code does not use Spark, it will not use any machines except the driver. If you're using Spark but your source data that you operate on has 1 partition, there will be only 1 task. Hard to say more without knowing what you are doing in more de...

1 More Replies
