10-24-2022 07:56 AM
I only have 1,000 columns. Each column has 252 rows, so there are only 252,000 data points.
How can routing tasks for the best cached locality take 7 hours?
10-25-2022 02:32 AM
Hi @Cheuk Hin Christophe Poon, can you please run this command and check again?
%sql OPTIMIZE [table name]
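For example (a minimal sketch; my_table is a placeholder for your Delta table name), the same command can be run from a Python cell:
# Compact the table's small underlying files into larger ones
spark.sql("OPTIMIZE my_table")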
10-25-2022 06:50 AM
I tried the following, and it still took more than 10 hours before failing with "Fatal error: The Python kernel is unresponsive."
%sql
-- Enable auto optimization
set spark.databricks.delta.properties.defaults.autoOptimize.optimizeWrite = true;
set spark.databricks.delta.properties.defaults.autoOptimize.autoCompact = true;
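(Note that these session settings only set default properties for tables created afterwards; a rough per-table equivalent, with my_table as a placeholder, would be:
spark.sql("ALTER TABLE my_table SET TBLPROPERTIES ('delta.autoOptimize.optimizeWrite' = 'true', 'delta.autoOptimize.autoCompact' = 'true')")
Neither variant compacts files that already exist; that still requires a manual OPTIMIZE.)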
10-31-2022 11:52 AM
@Kaniz Fatma Is there any way to shorten the runtime of the "Determining the location of DBIO file fragments" step?
11-27-2022 06:17 AM
Hi @Cheuk Hin Christophe Poon,
Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
11-30-2022 07:01 AM
Hi @Cheuk Hin Christophe Poon, have you optimized your table at any time since its creation? If not, OPTIMIZE may take a while depending on the number of underlying files.
Please try running OPTIMIZE manually as described in the document below:
https://docs.databricks.com/sql/language-manual/delta-optimize.html
If this doesn't help, you can try disabling the DBIO cache by setting the following in your notebook:
spark.conf.set("spark.databricks.io.cache.enabled", "false")
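For instance, a minimal sketch for checking and toggling that setting from a Python cell:
# Check whether the DBIO disk cache is currently enabled
print(spark.conf.get("spark.databricks.io.cache.enabled"))
# Disable it for this session only; the cluster-level config is unaffected
spark.conf.set("spark.databricks.io.cache.enabled", "false")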