Databricks on Virtualization
Hi Team,
Can you please direct me to any content on Databricks on Virtualization?
Regards,
Phanindra
- 4051 Views
- 0 replies
- 0 kudos
Platform: You can now use Structured Streaming to stream data from Apache Pulsar on Databricks. For more information: https://docs.databricks.com/en/structured-streaming/pulsar.html (DBR 14.1 required). Databricks Runtime 14.1 and 14.1 ML are now avail...
Hi! We had a bunch of strange failures for our jobs during 28-29 September. Some jobs' runs could not start for some time (30-50 mins) and then failed with an error: Unexpected failure while waiting for the cluster (0929-002141-2zkekhdj) to be rea...
I ran into some issues with the Databricks online certification. I filed a request twice at https://help.databricks.com/s/contact-us?ReqType=training, but did not get any confirmation emails. @Cert-Team
@Kaniz @Cert-Team Finally figured out why my request got dropped silently: I had included a link in the form. Please indicate in the form submission section that links are not allowed. Thanks a lot.
Hi guys, I'm relatively new to Databricks and struggling to implement Auto Loader (with trigger once = true) in file notification mode. I have CSV files in one container (the landing zone). I would like Auto Loader to pick up new and existing file...
Hi Kaniz, thank you for your reply. I initially made the mistake of using a capital letter in the queue name as part of the config files. I can now write with no error as a batch process. However, when I try to run the write stream, it says "Running Comma...
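The capital-letter mistake above comes from Azure Storage's queue naming rules (3-63 characters; lowercase letters, digits, and single hyphens only; alphanumeric first and last characters). A small up-front check like the following sketch can catch the problem before the stream is started; the queue names shown are hypothetical examples, not from the thread:

```python
import re

# Azure Storage queue names: 3-63 chars, lowercase letters/digits/hyphens,
# must start and end with a letter or digit, no consecutive hyphens.
QUEUE_NAME_RE = re.compile(r"^[a-z0-9](?:[a-z0-9-]{1,61}[a-z0-9])?$")

def validate_queue_name(name: str) -> bool:
    """Return True if `name` is a valid Azure Storage queue name."""
    return bool(QUEUE_NAME_RE.fullmatch(name)) and "--" not in name

print(validate_queue_name("landing-zone-queue"))  # True
print(validate_queue_name("LandingZoneQueue"))    # False: capital letters
```

Validating the name before passing it to the Auto Loader file-notification options avoids the silent failure mode the poster hit.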
Hello Team, I had a very poor experience while attempting my Databricks Data Engineer certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple more times, wasted my time, and then suspended my exam. I want to file ...
I got the error "'xgboost.spark.core' has no attribute 'SparkXGBClassifierModel'" when attempting to load a model. I have upgraded to xgboost 2.0.0.
Hello, I am facing an error while trying to read a large binary file (rosbag format) using the binaryFile reader. The file I am trying to read is approximately 7 GB. Here's the error message I am getting: FileReadException: Error while reading file dbfs:/mn...
Hello, I am new to Databricks. Is Databricks a good tool for building a SQL Server data warehouse on Azure? How does it compare to Azure Data Factory?
I am new to Databricks and would like to learn and become certified. I have SQL knowledge. To get started, which exam should I take first so that I have a very good understanding of Databricks fundamentals and concepts? I was thinking of “Databricks Cert...
Hi @mipayof346, the Data Engineer Associate certification exam assesses an individual’s ability to use the Databricks Lakehouse Platform to complete introductory data engineering tasks. This includes an understanding of the Lakehouse Platform and its ...
I am trying to import an init script from local to a workspace location using the Databricks CLI via a YAML pipeline, but it is getting uploaded as a notebook. I need to upload it in file format using a CLI command, since a workspace init script must be a file. Does anyo...
Reference: Salesforce and Databricks Announce Strategic Partnership to Bring Lakehouse Data Sharing and Shared AI Models to Businesses - Salesforce News. I was going through this article and wanted to know if anyone in the community is planning to use this...
Hi, I need to do the following calculations on a dataframe. They should be done for each period, and each calculated value will be used for the next period's calculation. I am adding sample data and the formula from Excel here. Thanks in advance for your help. Need to calcula...
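The Excel formula in the question is truncated, so as an illustration only, here is a plain-Python sketch of the pattern being asked about: a per-period value where each period's result feeds into the next period's calculation. The recurrence used here (previous value times a rate, plus the period's input) is a hypothetical stand-in for the real formula:

```python
def rolling_values(inputs, rate, initial=0.0):
    """Compute one value per period, carrying each period's result
    forward into the next (hypothetical recurrence:
    value[t] = value[t-1] * rate + inputs[t])."""
    values = []
    prev = initial
    for x in inputs:
        prev = prev * rate + x  # previous period's result feeds this period
        values.append(prev)
    return values

print(rolling_values([100, 200, 300], rate=2))  # [100.0, 400.0, 1100.0]
```

Because each row depends on the previous row's result, this kind of calculation generally cannot be expressed as a simple Spark window function; for modest data sizes, collecting the column to the driver (or using pandas) and looping like this is a common workaround.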
I’m new to Databricks and don’t have any practical experience, but I can write SQL code fluently. I’d like to get Databricks certified; which exam should I start with to get a good understanding of the fundamentals? Also, what are the most important basic conc...
@mipayof346 It's best to start with the Lakehouse Fundamentals accreditation (free course, free assessment). Then I recommend moving on to the Data Analyst certification path. Details on both can be found here: https://www.databricks.com/learn/ce...
I am loading a table into a dataframe using df = spark.table(table_name). Is there a way to load only the required columns? The table has more than 50 columns and I only need a handful of them.
@vk217 Simply use the select function, e.g. df = spark.read.table(table_name).select("col1", "col2", "col3")
Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!
Sign Up Now

| User | Count |
|---|---|
| | 133 |
| | 129 |
| | 72 |
| | 57 |
| | 42 |