Data & AI Summit
The sessions were very useful and the welcome reception very enjoyable. Thanks for organizing.
I learned how to implement and use Databricks more efficiently. We also explored Delta Live Tables and how to build pipelines that we can use in our data orchestration process. We also explored Auto Loader ...
I’ve learned so much I can’t put it into words! Looking forward to more! Hopefully more information on data ingestion!
I need to set up Unity Catalog to share data among 4 different accounts. How can I do that using a corporate VPC?
Love being at the Databricks summit #AI
How can we pick the correct size for a job cluster in a job workflow pipeline based on the complexity of the process? #databricks #deltalake
How can I parallelize I/O and CPU processing while processing micro-batches in Spark Structured Streaming?
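One common pattern (a sketch, not Spark-specific — Spark already parallelizes across partitions, so this is mainly useful for per-record external I/O inside a `foreachBatch` handler): issue the I/O-bound calls through a thread pool and do the CPU-bound transform as each result completes, so CPU work overlaps with the remaining in-flight I/O. `fetch_record` and `transform` below are hypothetical stand-ins for your own functions.

```python
# Sketch: overlap I/O-bound fetches with CPU-bound transforms in a micro-batch.
# fetch_record and transform are hypothetical stand-ins, not a real API.
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_record(key):
    # I/O-bound step (e.g. an API call or blob read); trivially simulated here.
    return {"key": key, "payload": key * 2}

def transform(record):
    # CPU-bound step applied once the record arrives.
    return record["payload"] + 1

def process_microbatch(keys, max_workers=8):
    # Submit all I/O requests concurrently; run the CPU step as each one
    # completes, so processing overlaps with the I/O still in flight.
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(fetch_record, k) for k in keys]
        for fut in as_completed(futures):
            results.append(transform(fut.result()))
    return sorted(results)

print(process_microbatch([1, 2, 3]))  # → [3, 5, 7]
```

Threads work here despite the GIL because it is released during I/O waits; if the transform were heavily CPU-bound, a `ProcessPoolExecutor` for that stage would be the usual variation.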
We use Databricks notebooks to create some built-in visualizations in the notebook, but we are unable to check those in to GitHub. We lose the visualizations and dashboards we create if we check out a different branch.
How does Delta Lake CDF work? I’ve seen that it adds additional columns to the data showing where data was updated or deleted. So what is the purpose of the change log?
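The purpose of those extra columns is incremental consumption: a downstream reader can replay only the deltas instead of re-reading the whole table. A toy, pure-Python illustration of that idea (not Delta’s actual implementation, though `_change_type` and `_commit_version` are the real CDF column names):

```python
# Toy illustration (NOT the Delta implementation) of how a change feed with a
# _change_type column lets a consumer apply only the deltas to its own copy.
changes = [
    {"id": 1, "value": "a", "_change_type": "insert",           "_commit_version": 1},
    {"id": 2, "value": "b", "_change_type": "insert",           "_commit_version": 1},
    {"id": 1, "value": "a", "_change_type": "update_preimage",  "_commit_version": 2},
    {"id": 1, "value": "A", "_change_type": "update_postimage", "_commit_version": 2},
    {"id": 2, "value": "b", "_change_type": "delete",           "_commit_version": 3},
]

def apply_changes(replica, feed):
    # Replay inserts and update post-images, drop deletes; pre-images are
    # informational (the old value) and are skipped when rebuilding state.
    for row in feed:
        if row["_change_type"] in ("insert", "update_postimage"):
            replica[row["id"]] = row["value"]
        elif row["_change_type"] == "delete":
            replica.pop(row["id"], None)
    return replica

print(apply_changes({}, changes))  # → {1: 'A'}
```

In Delta itself you would read the feed with `spark.read.format("delta").option("readChangeFeed", "true")` (or `table_changes(...)` in SQL) and feed it into a MERGE.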
Kudos to the amazing instructors and TAs for my first in-person Data Engineer Associate training, and I’ve passed my exam! Having a fantastic time so far; can’t wait for the content unfolding over the next two days!
Just finished the final day of training. Great content and delivery!
Just finished the advanced data engineering training. The content was great and useful.
When will DLT be ready for Scala?
Hello, I'm building a Python package that returns one row from a DataFrame at a time inside the Databricks environment. To improve the performance of this package I used the multiprocessing library in Python; I have a background process whose whole purpose is to p...
Using threads instead of processes solved the issue for me.
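The original code isn’t shown, but a minimal sketch of that thread-based pattern (all names hypothetical): a background thread prefetches rows into a bounded queue while the consumer pulls them one at a time. Threads share memory, so rows need no pickling (a common source of trouble with `multiprocessing` on Databricks), and the GIL is released during I/O waits, so the work still overlaps.

```python
# Minimal sketch: a background thread prefetches rows into a bounded queue;
# the consumer receives them one at a time. Names are hypothetical.
import queue
import threading

_SENTINEL = object()  # marks the end of the row stream

def _prefetch(rows, buffer):
    # Background producer: push each row, then signal completion.
    for row in rows:
        buffer.put(row)
    buffer.put(_SENTINEL)

def iter_rows(rows, maxsize=16):
    # Consumer-facing generator yielding one row at a time while the
    # producer thread keeps the queue topped up behind it.
    buffer = queue.Queue(maxsize=maxsize)
    threading.Thread(target=_prefetch, args=(rows, buffer), daemon=True).start()
    while (row := buffer.get()) is not _SENTINEL:
        yield row

print(list(iter_rows([{"id": 1}, {"id": 2}])))  # → [{'id': 1}, {'id': 2}]
```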
Hello, I’m trying to copy a table with all its versions to Unity Catalog. I know I can use deep cloning, but I want the table with the full history. Is that possible?
To copy the history, you would have to copy the data files along with the _delta_log folder, and then create a Delta table at that location.
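A hedged sketch of that file-level copy (paths are placeholders; on Databricks you would typically use `dbutils.fs.cp` or cloud-storage tooling rather than local `shutil`, and the result would be registered as an external table, since Unity Catalog manages the locations of managed tables itself):

```python
# Sketch: replicate a Delta table's data files together with its _delta_log
# folder, preserving the full version history. Paths are placeholders.
import shutil
from pathlib import Path

def copy_delta_table(src: str, dst: str) -> None:
    # Copy every data file plus the transaction log; the version history
    # lives entirely in _delta_log, so it must come along verbatim.
    shutil.copytree(src, dst)
    assert (Path(dst) / "_delta_log").is_dir(), "transaction log missing"

# After copying, register the table at the new location, e.g. in SQL:
#   CREATE TABLE catalog.schema.my_table USING DELTA LOCATION '<dst>';
```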