- 1095 Views
- 0 replies
- 0 kudos
In my Spark application, I am using a set of Python libraries. I am submitting the Spark application as a Jar task, but I am not able to find any option to provide archive files. So, in order to handle Python dependencies, I am using this approach: create an archive file...
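A minimal sketch of the archive-file workaround described above, assuming the job's pure-Python dependencies have been copied into a local directory first (the directory name and the DBFS path in the comment are hypothetical):

```python
import shutil

def build_dependency_archive(src_dir: str, out_base: str) -> str:
    """Zip a directory of Python packages; returns the archive path."""
    return shutil.make_archive(out_base, "zip", src_dir)

# After uploading the archive (e.g. to DBFS), it can be distributed to
# the executors at runtime with PySpark's addPyFile:
#   spark.sparkContext.addPyFile("dbfs:/tmp/deps.zip")
```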
by Surajv • New Contributor III
- 1706 Views
- 0 replies
- 0 kudos
Hi community, I am getting the below warning when I try using PySpark code for some of my use cases with databricks-connect. Is this a critical warning, and any idea what it means? Logs: WARN DatabricksConnectConf: Could not parse /root/.databricks-c...
by Surajv • New Contributor III
- 13552 Views
- 1 replies
- 0 kudos
Hi community, when I use PySpark RDD-related functions in my environment using Databricks Connect, I get the below error (Databricks cluster version: 12.2): `RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot...
Latest Reply
Got it. As a side note, I tried the above methods, but the error persisted; upon reading the docs again, I found this statement: You must install Python 3 on your development machine, and the minor version of your client Python installation must be t...
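That requirement can be checked mechanically. A small sketch of the version check it implies, where the cluster's Python minor version (3.9) is taken from the error message in the question above:

```python
import sys

# Python minor version the cluster's workers run; 3.9 is taken from the
# "Python in worker has different version 3.9" error above.
CLUSTER_PYTHON = (3, 9)

def client_matches_cluster(client=sys.version_info, cluster=CLUSTER_PYTHON):
    """True when the local (driver) interpreter's major.minor matches the cluster's."""
    return (client[0], client[1]) == cluster
```

If this returns False, installing a matching interpreter for the databricks-connect client (e.g. via pyenv or conda) resolves the mismatch.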
- 2898 Views
- 0 replies
- 1 kudos
Hi, are there any plans to build a native Slack integration? I'm envisioning a one-time connector to Slack that would automatically populate all channels and users to select from, for example when configuring an alert notification. It does not seem ...
by ymt • New Contributor II
- 2989 Views
- 0 replies
- 1 kudos
Hi team, this is how I connect to Snowflake from a Jupyter notebook:
import snowflake.connector
snowflake_connection = snowflake.connector.connect(
authenticator='externalbrowser',
user='U1',
account='company1.us-east-1',
database='db1',...
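For reference, a self-contained sketch of the connection pattern in the truncated snippet above. The parameter values are copied from the post; the actual connect call is shown only in a comment, because the `externalbrowser` authenticator opens a browser window for SSO and therefore works only in interactive environments:

```python
# Connection parameters as in the post; "externalbrowser" triggers an
# interactive SSO login in the user's browser.
conn_params = {
    "authenticator": "externalbrowser",
    "user": "U1",
    "account": "company1.us-east-1",
    "database": "db1",
}

# With the snowflake-connector-python package installed, the dict is
# passed straight to the connector:
#   import snowflake.connector
#   conn = snowflake.connector.connect(**conn_params)
```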
- 6004 Views
- 2 replies
- 1 kudos
I have a scheduled task running in a workflow. Task 1 computes some parameters, then these are picked up by a dependent reporting task, Task 2. I want Task 2 to report "Failure" if Task 1 fails. Yet creating a dependency in workflows means that Task 2 wil...
Latest Reply
Hi @sharpbetty, any suggestions on how I can keep the parameter sharing and dependency from Task 1 to Task 2, yet also allow Task 2 to fire even on failure of Task 1? Setup: Task 2 dependent on Task 1. Challenge: to fire Task 2 even on Task 1 failure. Soluti...
1 More Replies
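One way to express "fire Task 2 even if Task 1 fails" while keeping the dependency is the `run_if` field on the dependent task in the Databricks Jobs API. A sketch of the job settings, where the task keys come from the thread, the notebook paths are hypothetical, and `ALL_DONE` means the task runs once all its dependencies finish, whether they succeeded or failed:

```json
{
  "tasks": [
    {
      "task_key": "Task1",
      "notebook_task": {"notebook_path": "/Jobs/compute_params"}
    },
    {
      "task_key": "Task2",
      "depends_on": [{"task_key": "Task1"}],
      "run_if": "ALL_DONE",
      "notebook_task": {"notebook_path": "/Jobs/report"}
    }
  ]
}
```

Because `depends_on` is preserved, task values set by Task 1 remain available to Task 2 when Task 1 succeeds.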