Dicer
Valued Contributor
since 06-20-2022
05-27-2024

User Stats

  • 34 Posts
  • 1 Solution
  • 14 Kudos given
  • 28 Kudos received

User Activity

I am using the distributed Pandas on Spark, not the single-node Pandas. But when I try to run the following code to transform a data frame with 652 x 729803 data points, df_ps_pct = df.pandas_api().pct_change().to_spark(), it returns this error: ...
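A minimal sketch of that transformation broken into steps, assuming df is a Spark DataFrame already loaded in the session (the 652 x 729803 frame from the post):

# Convert the Spark DataFrame to the distributed pandas-on-Spark API,
# compute the row-over-row percentage change, then convert back to Spark.
psdf = df.pandas_api()           # pyspark.pandas DataFrame, still distributed
psdf_pct = psdf.pct_change()     # percentage change along the index
df_ps_pct = psdf_pct.to_spark()  # back to a plain Spark DataFrame

Splitting the chained call this way also makes it easier to see which step actually raises the error.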
I want to import the ibapi python module in an Azure Databricks Notebook. Before this, I downloaded the TWS API folder from https://interactivebrokers.github.io/# I need to go through the following steps to install the API: Download and install TWS Ga...
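One possible way to get the downloaded API onto the cluster (a sketch only; the DBFS path below is illustrative, not from the post) is to copy the pythonclient folder from the TWS API download to DBFS and install it with %pip:

%pip install /dbfs/FileStore/twsapi/IBJts/source/pythonclient

# In a later cell, the package should then be importable:
from ibapi.client import EClient
from ibapi.wrapper import EWrapper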
I tried to VACUUM a Delta table, but there is a syntax error. Here is the code:
%sql
set spark.databricks.delta.retentionDurationCheck.enabled = False
VACUUM test_deltatable
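One likely cause is that the SET statement and the VACUUM statement sit in a single %sql cell without a separator. A sketch of the same two operations issued from Python instead, using the table name test_deltatable from the snippet:

# Disable the retention-duration safety check, then vacuum the Delta table.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
spark.sql("VACUUM test_deltatable")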
I only have 1000 columns. Each column has 252 rows, so there are only 252,000 data points. How come routing tasks for the best cached locality takes 7 hours?
Data types:
  • AAPL_Time: timestamp
  • AAPL_Close: float
Raw data (AAPL_Time, AAPL_Close):
  2015-05-11T08:00:00.000+0000  29.0344
  2015-05-11T08:30:00.000+0000  29.0187
  2015-05-11T09:00:00.000+0000  29.0346
  2015-05-11T09:3...
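For reference, a hedged sketch of that schema expressed as a Spark StructType (column names taken from the post; the variable name is illustrative):

from pyspark.sql.types import StructType, StructField, TimestampType, FloatType

# Schema matching the column types listed above.
aapl_schema = StructType([
    StructField("AAPL_Time", TimestampType(), True),
    StructField("AAPL_Close", FloatType(), True),
])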