When I use LightGBM (via SynapseML), I get the following error on the line below: 'str' object has no attribute 'getParam'. Is this because serverless cannot run the JAR files that SynapseML depends on?
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-b0d5f8ce-8426-443...
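For context, this is roughly the call pattern that hits the error; the table and column names below are placeholders, and this is only a sketch of the usual SynapseML LightGBM usage, not my exact notebook code.

from synapse.ml.lightgbm import LightGBMClassifier

# Placeholder training data; the table, "features" and "label" columns are assumptions.
train_df = spark.table("some_catalog.some_schema.train")

model = LightGBMClassifier(
    objective="binary",
    featuresCol="features",
    labelCol="label",
).fit(train_df)  # the error appears when the estimator's params are resolved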
I only have an AWS Access Key ID and Secret Access Key, and I want to use these credentials to access S3. However, the official documentation states that I need to set the AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID environment variables, but I cannot ...
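As a rough sketch of what I understand the usual notebook-scoped approach to be on classic compute (the secret scope, fs.s3a.* options and bucket path below are assumptions); I'm not sure whether serverless honors these settings.

# Hypothetical secret scope and key names.
access_key = dbutils.secrets.get(scope="my-scope", key="aws-access-key-id")
secret_key = dbutils.secrets.get(scope="my-scope", key="aws-secret-access-key")

# Per-session S3A credentials instead of the AWS_* environment variables.
spark.conf.set("fs.s3a.access.key", access_key)
spark.conf.set("fs.s3a.secret.key", secret_key)

df = spark.read.csv("s3a://my-bucket/path/data.csv", header=True)  # hypothetical bucket/path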
I can use pandas to read local files in a notebook, such as those located in /tmp. However, when I run two consecutive notebooks within the same job and read files with pandas in both, I encounter a permission error in the second notebook stating that ...
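To illustrate what I mean (the paths here are placeholders, not my real files):

import pandas as pd

# First notebook in the job: reading a local file under /tmp works.
df = pd.read_csv("/tmp/some_file.csv")

# Second notebook in the same job: the same call hits the permission error.
# A possible alternative I'm considering (an assumption, not verified) is a UC
# volume path that both notebooks can reach:
# df = pd.read_csv("/Volumes/my_catalog/my_schema/my_volume/some_file.csv")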
Can the default serverless compute in Databricks install Scala packages? I need to use the spark-sftp package, but serverless seems to be different from all-purpose compute, and it looks like I can only install Python packages. There is another question. I can use p...
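For context, this is roughly how spark-sftp is meant to be used (the host, credentials, and remote path below are placeholders); since the data source itself is a Scala/JVM package, installing a Python library alone would not be enough.

df = (
    spark.read.format("com.springml.spark.sftp")  # requires the Scala/Maven package on the compute
    .option("host", "sftp.example.com")           # hypothetical host
    .option("username", "user")                   # hypothetical credentials
    .option("password", "password")
    .option("fileType", "csv")
    .load("/remote/path/data.csv")                # hypothetical remote path
)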
An error occurred while converting a timestamp in the yyyyMMddHHmmssSSS format.

from pyspark.sql.functions import to_timestamp_ntz, col, lit

# Single-column DataFrame holding the raw timestamp string.
df = spark.createDataFrame([("20250730090833000",)], ["datetime"])

# This is the line that raises the error.
df2 = df.withColumn("dateformat", to_timestamp_ntz(col("datetime"), lit("yyyyMMddHHmmssSSS")))
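In case it helps to show what I've been experimenting with: one workaround I've seen suggested (not verified as the official fix) is to insert a separator before the milliseconds so the pattern parses under the default Spark 3 datetime parser.

from pyspark.sql.functions import col, concat_ws, substring, to_timestamp_ntz, lit

# Split "20250730090833000" into "20250730090833" + "000", rejoin with a dot,
# then parse with an explicit fractional-second separator in the pattern.
df3 = df.withColumn(
    "dateformat",
    to_timestamp_ntz(
        concat_ws(".", substring(col("datetime"), 1, 14), substring(col("datetime"), 15, 3)),
        lit("yyyyMMddHHmmss.SSS"),
    ),
)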
Oh, it seems I misunderstood something... I still haven't found a solution. However, since the notebook environment can run the original LightGBM package, I can use that for now. If anyone has any suggestions or ideas, your guidance would be greatly appreciated.
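For now, this is the kind of interim path I mean with the plain LightGBM package (the table name, pandas conversion, and "label" column are placeholders):

import lightgbm as lgb

# Pull a small enough Spark table into pandas and train with vanilla LightGBM.
pdf = spark.table("some_catalog.some_schema.train").toPandas()  # hypothetical table
X, y = pdf.drop(columns=["label"]), pdf["label"]                # "label" column is an assumption

model = lgb.LGBMClassifier(objective="binary")
model.fit(X, y)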
Sorry, I just found out while checking the official documentation that, starting from November, dependency JAR files are supported. I'll give it a try and see how well it works. If anyone has any insights, I would greatly appreciate your guidance.
@szymon_dybczak Sorry, I'm a newbie. Currently, I can only add S3 as external data through a role. Regarding your suggestion about using UC, can I turn the CSV file in S3 into a volume in UC using only the access key ID and secret key, or is there another method?
Hi @szymon_dybczak, thank you very much for your answer. But I can't use another cluster, I can only use serverless; can I set them for serverless? I also configured them at notebook scope, but they are not working properly at the moment. I have been tol...