Can I use Spark to stream data from Kafka?
First time at this event, and honestly a very good event. Although this forum has no content in Spanish, the Spanish-speaking community is growing and we hope to contribute to the development of our countries through technology!
The Data + AI Summit is a blast so far. There are so many new technologies being released, such as Delta Lake 2.0 being open-sourced.
Agreed! You should check out the Azure booth if you haven't already; they have a really cool demo.
data is pretty awesome!
It has been an awesome experience learning about all the new releases! I'm excited to get into the new Marketplace!
Databricks SQL Serverless Now Available on AWS
First announcement at Data + AI Summit 2022 is the availability of serverless compute for Databricks SQL (DBSQL) in Public Preview on AWS! DBSQL Serverless makes it easy to get started with data wareh...
Awesome! We definitely want to join this preview. This is the missing piece for us in making data accessible from the data lake to end users, because it will help with the compute costs. I can't wait to learn more. #Summit22
As an SI partner of Databricks, we are excited to attend the summit and co-present with our client Amgen. Come see us in action at 3 PM; check the app for the location! #ss22
Speeds up data retrieval when filtering on the provided fields by co-locating related data.
Awesome session on next big thing in Google Cloud by Bruno Aziza!
Spark Connect will help in democratizing…
We are looking to move some of the artifacts to S3 from dbfs.
Yes, you can do it via the MLflow Python API. However, know that when you do that and move the experiment artifact location to S3, the Databricks UI doesn't allow interaction with the artifacts. The MLflow Python API is rich with the abilities to in...
Excited to hear about data mesh implementation patterns! Are you using data mesh already? #summit22
Really excited about Delta Sharing Cleanrooms!
As am I! Between the clean rooms and the platform-agnostic data marketplace, collaborating on a variety of datasets will become much easier than before!
What will happen if the driver node fails? What will happen if one of the worker nodes fails? Is it the same in Spark and Databricks, or does Databricks provide additional features to overcome these situations?
If the driver node fails, your cluster will fail. If a worker node fails, Databricks will spawn a new worker node to replace it and resume the workload. Generally it is recommended to assign an on-demand instance for your driver and spo...
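The on-demand-driver-plus-spot-workers setup the reply recommends can be expressed in the cluster definition. A sketch for AWS is below; the cluster name, node type, and runtime version are placeholder assumptions.

```json
{
  "cluster_name": "spot-workers-example",
  "spark_version": "11.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 4,
  "aws_attributes": {
    "first_on_demand": 1,
    "availability": "SPOT_WITH_FALLBACK",
    "spot_bid_price_percent": 100
  }
}
```

Here `first_on_demand: 1` keeps the driver on an on-demand instance, while `SPOT_WITH_FALLBACK` lets workers run on spot capacity and fall back to on-demand if spot instances are unavailable.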