- 949 Views
- 1 replies
- 0 kudos
Do we have an analogous concept to package cells for Python notebooks?
Latest Reply
You can declare your classes in one cell and use them in the others. It is recommended to gather all your classes in one notebook and use %run in the other notebooks to "import" those classes. The one thing you cannot do is literally import a folder/...
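A minimal sketch of this pattern, assuming a helper notebook named `Classes` in the same folder (both the notebook name and the class are hypothetical):

```python
# Cell in the helper notebook (e.g. "Classes") -- declare shared classes here.
class Multiplier:
    """Hypothetical example class shared across notebooks via %run."""
    def __init__(self, factor):
        self.factor = factor

    def apply(self, x):
        return x * self.factor

# In the consuming notebook, a separate cell would contain only:
#   %run ./Classes
# after which the class is available in that notebook's namespace:
doubler = Multiplier(2)
print(doubler.apply(21))  # -> 42
```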
- 649 Views
- 1 replies
- 0 kudos
Is the %run magic command supported in R notebooks?
Latest Reply
The % magic commands are notebook commands and are not tied to any language, so R notebooks also support %run.
- 673 Views
- 1 replies
- 0 kudos
I have turned Photon on in my endpoint, but I don't know if it's actually being used in my queries. Is there some way I can see this other than manually testing queries with Photon turned on and off?
Latest Reply
@Trevor Bishop: if you go to the History tab in DBSQL, click on the specific query and look at the execution details. At the bottom, you will see "Task time in Photon".
- 2112 Views
- 2 replies
- 0 kudos
I have a case where garbage collection is taking a long time, and I want to optimize it for better performance.
Latest Reply
You can also tune the JVM's GC parameters directly, if you mean the pauses are too long. Set "spark.executor.extraJavaOptions", but this does require knowing a thing or two about how to tune for your performance goal.
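On Databricks, executor JVM options are typically set in the cluster's Spark config rather than at runtime. A hedged sketch of such a cluster spec fragment (the G1GC flags below are illustrative examples, not tuned recommendations):

```python
# Illustrative fragment of a cluster definition, e.g. for the Databricks
# Clusters API or the cluster UI's Spark config; the GC flags are examples.
cluster_spec = {
    "spark_conf": {
        # Ask the executor JVMs to use G1 and target shorter GC pauses.
        "spark.executor.extraJavaOptions":
            "-XX:+UseG1GC -XX:MaxGCPauseMillis=200",
    },
}

print(cluster_spec["spark_conf"]["spark.executor.extraJavaOptions"])
```

Whether G1 with a pause target helps depends on the workload; the executor GC logs are the place to verify any change.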
1 More Replies
- 884 Views
- 1 replies
- 0 kudos
Can I get an example of how to create a UDF in Python and use it in SQL?
Latest Reply
def squared(s):
    return s * s

spark.udf.register("squaredWithPython", squared)

You can optionally set the return type of your UDF. The default return type is StringType.

from pyspark.sql.types import LongType

def squared_typed(s):
    return s * s

spark...
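Once registered, the UDF can be called by name from SQL. A sketch assuming an active SparkSession named `spark`, as on Databricks (the Spark calls are shown commented; the plain-Python behavior is runnable anywhere):

```python
def squared(s):
    return s * s

# With an active SparkSession, registration and SQL usage look like:
#   spark.udf.register("squaredWithPython", squared)
#   spark.sql(
#       "SELECT id, squaredWithPython(id) AS id_squared FROM range(5)"
#   ).show()

# The underlying Python function itself behaves like:
print([squared(i) for i in range(5)])  # -> [0, 1, 4, 9, 16]
```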
- 645 Views
- 1 replies
- 1 kudos
Which IDEs are integrated with Databricks as of today?
Latest Reply
- Eclipse
- IntelliJ
- Jupyter
- PyCharm
- SBT
- sparklyr and RStudio Desktop
- SparkR and RStudio Desktop
- Visual Studio Code
- 1711 Views
- 1 replies
- 0 kudos
Cluster terminated. Reason: Spark Startup Failure: Spark was not able to start in time. This issue can be caused by a malfunctioning Hive metastore, invalid Spark configurations, or malfunctioning init scripts. Please refer to the Spark driver logs t...
Latest Reply
I think the container is unable to talk to the hosting instance or the DBFS storage account. It can be solved by adding a custom route to the subnets for the DBFS storage account, with the next hop being the public route.
- 1696 Views
- 1 replies
- 0 kudos
Cloud Provider Launch Failure: A cloud provider error was encountered while setting up the cluster. See the Azure Databricks guide for more information. Azure error code: AuthorizationFailed/InvalidResourceReference.
Latest Reply
Possible cause: the VNet or subnets no longer exist. Make sure the VNet and subnets exist.
- 1727 Views
- 3 replies
- 1 kudos
They both seem to package it. When should one use one over the other?
Latest Reply
One thing that is useful to point out for Databricks users is that you would typically not use MLflow Projects to describe the execution of a modeling run. You would just use MLflow directly in Databricks and use Databricks notebooks to manage code ...
2 More Replies
- 530 Views
- 1 replies
- 0 kudos
Do we have a template to create an Azure workspace with VNet injection?
Latest Reply
Yes, we have a community template available to create a workspace: https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/vnet-inject#--advanced-configuration-using-azure-resource-manager-templates
- 1015 Views
- 1 replies
- 0 kudos
I saw the default path for artifacts is DBFS, but I'm not sure if that's where everything else is stored. Can we modify it?
Latest Reply
Artifacts such as models, model metadata like the "MLmodel" file, input samples, and other logged artifacts like plots, configs, and network architectures are stored as files. While these could be simple local filesystem files when the tracking server is ru...
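If the goal is simply to change where artifacts land, note that a self-managed MLflow tracking server sets its artifact root at launch time. A hedged sketch of that configuration (the store URI and bucket path below are placeholders, not values from this thread):

```shell
# Hypothetical launch of a self-managed MLflow tracking server; the
# backend store URI and artifact root are placeholder values.
mlflow server \
  --backend-store-uri sqlite:///mlflow.db \
  --default-artifact-root s3://my-bucket/mlflow-artifacts
```

On Databricks-managed MLflow, a per-experiment artifact location can instead be supplied via the `artifact_location` argument of `mlflow.create_experiment`.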