Data Engineering

Forum Posts

amitdatabricksc
by New Contributor II
  • 5322 Views
  • 4 replies
  • 2 kudos

how to zip a dataframe

How do I zip a DataFrame so that I get a zipped CSV output file? Please share the command. Only one DataFrame is involved, not multiple.

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Writing to a local directory does not work. See this topic: https://community.databricks.com/s/feed/0D53f00001M7hNlCAJ
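
A minimal sketch of one common approach (output path and columns are made up for illustration): Spark's CSV writer can compress as it writes, so the output lands gzip-compressed rather than as a .zip archive, and the result is a folder containing a part file rather than a single flat file.

# Hypothetical example: write one DataFrame as a gzip-compressed CSV on DBFS.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

(df.coalesce(1)                         # force a single part file inside the output folder
   .write
   .option("header", "true")
   .option("compression", "gzip")       # produces part-*.csv.gz
   .mode("overwrite")
   .csv("dbfs:/tmp/zipped_csv_output"))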

3 More Replies
Michelle_-_Devp
by New Contributor III
  • 536 Views
  • 1 reply
  • 1 kudos

Resolved! How is brainstorming going?

Wondering if anyone is willing to share their project ideas here. It would be great to know how things are going and if anyone has a good open-source dataset they are willing to share.

Latest Reply
bayang
New Contributor III
  • 1 kudos

Good. Reading the docs gives a lot of useful info to sharpen ideas for this hackathon.

amitca71
by Contributor II
  • 1414 Views
  • 2 replies
  • 2 kudos

Resolved! sedona/shapely error Unknown WKB type 16

Hi, I stream data from PostGIS to S3 using Debezium (postgis -> debezium -> s3 -> spark (Databricks)). Once I read it, I decode it and I can see that the binary representation is similar to what I have in PostGIS, in a WKB-formatted column. Once I try to read it ei...

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @Amit Cahanovich​, The error message "Unknown WKB type 16" indicates that the WKB data you are trying to read has a geometry type that the library does not recognize. WKB type 16 is not valid in the Simple Feature Access (SFA) standard, the most w...
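
A hedged sketch of how one might inspect the raw bytes to see which geometry type the payload actually declares (the hex string below is a made-up example): PostGIS typically emits EWKB, whose type field carries extra flag bits (SRID, Z/M) that plain WKB readers may reject.

import struct

# Hypothetical EWKB payload as a hex string (a point with an SRID flag set).
wkb_hex = "0101000020E6100000000000000000F03F000000000000F03F"
wkb = bytes.fromhex(wkb_hex)

byte_order = "<" if wkb[0] == 1 else ">"            # 1 = little-endian, 0 = big-endian
(raw_type,) = struct.unpack(byte_order + "I", wkb[1:5])

print("raw geometry type field:", hex(raw_type))
print("base type (low bits):", raw_type & 0xFF)     # 1=Point, 2=LineString, 3=Polygon, ...
print("EWKB SRID flag set:", bool(raw_type & 0x20000000))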

1 More Replies
Bartek
by Contributor
  • 1906 Views
  • 1 reply
  • 1 kudos

Save Spark DataFrame to shape file (.shp format)

Hello, I know how to create a .shp file from a GeoPandas dataframe using code similar to this, also mentioned on SO: gpd_df = geopandas.GeoDataFrame(pandas_df, geometry='geom'); gpd_df.to_file("username/nh.shp"). However I have .parquet files that I can load...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Bartosz Maciejewski: Spark does not have native support for writing Shapefiles directly. However, you can use a third-party library such as GeoPandas or PyShp to write your Spark DataFrame to a Shapefile. Here's an example of how to use GeoPandas to...
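
Since the example above is cut off, here is a minimal sketch of the GeoPandas route it describes (the parquet path, the geometry column name "geom", the WKT encoding, and the CRS are all assumptions; collecting to pandas only makes sense for data that fits on the driver).

import geopandas as gpd
from shapely import wkt

# Assumed input: parquet with the geometry stored as WKT text in a column named "geom".
sdf = spark.read.parquet("dbfs:/mnt/data/geo_table")
pdf = sdf.toPandas()                                  # driver-side collect; small data only
pdf["geom"] = pdf["geom"].apply(wkt.loads)

gdf = gpd.GeoDataFrame(pdf, geometry="geom", crs="EPSG:4326")

# GeoPandas can't write straight to dbfs:/ URIs; write to local disk, then copy if needed.
gdf.to_file("/tmp/nh_out.shp")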

theSoyf
by New Contributor II
  • 2754 Views
  • 2 replies
  • 1 kudos

How to write to Salesforce object using Spark Salesforce Library

Hi, I'm facing an issue when writing to a Salesforce object. I'm using the springml/spark-salesforce library. I have the above libraries installed as recommended based on my research. I try to write like this: (_sqldf .write .format("com.springml.spar...
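
For context, a hedged sketch of what the full write call might look like with that library (the format string and option names are recalled from the springml/spark-salesforce documentation and should be verified there; all values are placeholders):

(_sqldf
    .write
    .format("com.springml.spark.salesforce")           # full name of the truncated format above, as recalled
    .option("username", "<sf-username>")               # placeholder; prefer a secret scope in practice
    .option("password", "<sf-password-plus-security-token>")
    .option("sfObject", "Contact")                     # hypothetical target object
    .save())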

Latest Reply
Gauthy
New Contributor II
  • 1 kudos

I'm facing the same issue while trying to write to Salesforce. If you have found a resolution, could you please share it?

1 More Replies
Vijaykumarj
by New Contributor III
  • 2726 Views
  • 5 replies
  • 3 kudos

Generate sha2 hash key while loading files to a Delta table

I have files in Azure Data Lake. I am using Auto Loader to read the incremental files. The files don't have a primary key, so I want to use some columns to generate a hash key and use it as the primary key to apply changes. In this case I want to ...
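
A hedged sketch of the usual pattern (source path, file format, column names, and the 256-bit length are assumptions): concatenate the business columns and hash them with sha2 while Auto Loader reads the stream.

from pyspark.sql.functions import sha2, concat_ws, col

# Assumed Auto Loader source; adjust cloudFiles.format and the path to your landing zone.
df = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "csv")
          .option("cloudFiles.schemaLocation", "dbfs:/tmp/hashkey_demo/schema")
          .load("abfss://container@account.dfs.core.windows.net/landing/"))

# Derive a surrogate key by hashing a stable set of columns (SHA-256 here).
df_with_key = df.withColumn(
    "hash_key",
    sha2(concat_ws("||", col("col_a"), col("col_b"), col("col_c")), 256),
)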

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @Vijay Kumar J, we haven't heard from you since the last response from @Debayan Mukherjee and @Jordan Fox, and I was checking back to see if their suggestions helped you. Or else, if you have any solution, please do share that with the...

4 More Replies
farbodr
by New Contributor II
  • 2388 Views
  • 4 replies
  • 1 kudos

Shapley Progressbar

The SHAP progress bar, or tqdm progress bars in general, don't show in notebooks. Do I need to set something special to get this or any other similar widgets to work?

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Fred Rahmanian, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and whether you would be happy to share the solution or mark an answer as best. Otherwise, please let us know if you need more help. We'd love to hear from you. Tha...

3 More Replies
siva_thiru
by Contributor
  • 556 Views
  • 0 replies
  • 6 kudos

Happy to share that #WAVICLE was able to do a hands-on workshop on #Databricks notebook, #Databricks SQL, and #Databricks cluster fundamentals with KCT College, Coimbatore, India.

jimnaik
by New Contributor III
  • 16399 Views
  • 2 replies
  • 1 kudos

Resolved! How to execute .sh and .py file in the workspace?

I want to execute a shell script which runs a .py file. May I know how to run .sh and .py files in a Databricks workspace?

Latest Reply
jimnaik
New Contributor III
  • 1 kudos

I tried executing like this and it worked: %sh /dbfs/***/***/***.sh
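
For the .py part, a hedged sketch (the script path is hypothetical): from a Python cell you can shell out to the interpreter, which is essentially what a %sh cell does with "python /dbfs/.../script.py".

import subprocess

# Hypothetical script path; paths under /dbfs/ are the DBFS mount visible on the driver.
result = subprocess.run(
    ["python", "/dbfs/scripts/my_job.py"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)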

1 More Replies
User16869510359
by Esteemed Contributor
  • 900 Views
  • 1 reply
  • 0 kudos
Latest Reply
aladda
Honored Contributor II
  • 0 kudos

You could potentially do this through a Global Init Script - https://docs.databricks.com/clusters/init-scripts.html

ArielHerrera
by New Contributor II
  • 13665 Views
  • 5 replies
  • 2 kudos

Resolved! How to display SHAP plots?

I am looking to display SHAP plots. Here is the code:

import xgboost
import shap
shap.initjs()  # load JS visualization code to notebook
X, y = shap.datasets.boston()
# train XGBoost model
model = xgboost.train({"learning_rate": 0.01}, xgboost.DMatri...

Latest Reply
lrnzcig
New Contributor II
  • 2 kudos

As @Vinh dqvinh87 noted, the accepted solution only works for force_plot. For other plots, the following trick works for me:

import matplotlib.pyplot as plt
p = shap.summary_plot(shap_values, test_df, show=False)
display(p)

4 More Replies