Hello, I know how to create a .shp file from a GeoPandas dataframe using code similar to this, also mentioned on SO:

gpd_df = geopandas.GeoDataFrame(pandas_df, geometry='geom')
gpd_df.to_file("username/nh.shp")

However, I have .parquet files that I can load...
@Bartosz Maciejewski: Spark does not have native support for writing Shapefiles directly. However, you can use a third-party library such as GeoPandas or PyShp to write your Spark DataFrame to a Shapefile. Here's an example of how to use GeoPandas to...
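The example above was cut off; here is a minimal sketch of the usual approach, assuming a Spark DataFrame spark_df whose geometry column geom holds WKT strings (both names are placeholders), and assuming the data is small enough to collect to the driver:

import geopandas
from shapely import wkt

# Collect the distributed DataFrame to pandas on the driver --
# only viable when the result fits in driver memory.
pandas_df = spark_df.toPandas()

# Parse the WKT strings into shapely geometry objects.
pandas_df['geom'] = pandas_df['geom'].apply(wkt.loads)

# Wrap in a GeoDataFrame and write out the Shapefile.
gpd_df = geopandas.GeoDataFrame(pandas_df, geometry='geom')
gpd_df.to_file('/dbfs/tmp/nh.shp')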
Use Case: Copy data from a SharePoint List to Blob Storage using Power Automate
Short Description: To access the Blob Storage account from Power Automate, there are three authentication types:
1. Access Key
2. Service Principal
3. Azure AD Integrated
Which authentication...
@KVNARK: It's recommended to use the Azure AD Integrated authentication type. This authentication type allows you to use Azure Active Directory (AD) to authenticate and manage access to Blob Storage resources at the folder or container level using...
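Outside Power Automate, the same Azure AD pattern can be seen in a short Python sketch, assuming the azure-identity and azure-storage-blob packages and placeholder account/container names:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder account URL -- substitute your storage account name.
account_url = 'https://<storage-account>.blob.core.windows.net'

# DefaultAzureCredential resolves an Azure AD identity (managed identity,
# service principal, or local az login) -- no access keys embedded anywhere.
service = BlobServiceClient(account_url=account_url, credential=DefaultAzureCredential())

# List blobs in a container the identity has been granted a role on.
for blob in service.get_container_client('my-container').list_blobs():
    print(blob.name)

The advantage of Azure AD Integrated over an access key is that access is governed by role assignments you can scope and revoke, rather than a shared secret.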
We will host the first Databricks Bay Area User Group meeting at the Databricks Mountain View office on March 14, 2:30-5:00pm PT. We'll have Dave Mariani - CTO & Founder at AtScale, and Riley Phillips - Enterprise Solution Engineer at Matillion to share...
I would like to download a file in DBFS using the FileStore endpoint. If the file or folder name contains multibyte characters, the file path cannot be specified due to URL encoding and an error occurs. Question 1: If a file or folder name contains multibyte...
Hi, the Databricks CLI can be used to download a file from DBFS: https://docs.databricks.com/dev-tools/cli/index.html
Also, you can refer to https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine , which ...
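As an alternative to the FileStore endpoint, the DBFS REST API takes the path as a query parameter, so the HTTP client handles URL encoding of multibyte names for you. A hedged sketch using the /api/2.0/dbfs/read endpoint (host, token, and file names are placeholders):

import base64
import requests

HOST = 'https://<workspace>.cloud.databricks.com'   # placeholder workspace URL
TOKEN = '<personal-access-token>'                   # placeholder PAT
headers = {'Authorization': f'Bearer {TOKEN}'}

def download_dbfs_file(dbfs_path, local_path, chunk=1024 * 1024):
    # The read endpoint returns at most 1 MB of base64 data per call,
    # so loop until a short read signals end of file.
    offset = 0
    with open(local_path, 'wb') as f:
        while True:
            resp = requests.get(
                f'{HOST}/api/2.0/dbfs/read',
                headers=headers,
                params={'path': dbfs_path, 'offset': offset, 'length': chunk},
            )
            resp.raise_for_status()
            body = resp.json()
            f.write(base64.b64decode(body['data']))
            if body['bytes_read'] < chunk:
                break
            offset += body['bytes_read']

# requests percent-encodes the multibyte path automatically.
download_dbfs_file('/FileStore/日本語ファイル.csv', 'local_copy.csv')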
Databricks has introduced a new feature that allows users to send SQL statements to their SQL warehouses via REST API. Users can easily integrate this feature with any tool by simply posting their queries to the /api/2.0/sql/statements/ endpoint. With this...
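A minimal sketch of posting a statement to the endpoint, assuming a personal access token and an existing SQL warehouse (host, token, and warehouse ID are placeholders):

import requests

HOST = 'https://<workspace>.cloud.databricks.com'  # placeholder
TOKEN = '<personal-access-token>'                  # placeholder

resp = requests.post(
    f'{HOST}/api/2.0/sql/statements/',
    headers={'Authorization': f'Bearer {TOKEN}'},
    json={
        'statement': 'SELECT * FROM samples.nyctaxi.trips LIMIT 10',
        'warehouse_id': '<warehouse-id>',          # placeholder
    },
)
resp.raise_for_status()
result = resp.json()
print(result['status']['state'])  # e.g. PENDING, RUNNING, SUCCEEDED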
Lakehouse architectures seem enticing, especially from the standpoint of querying the data lake directly as it sits (as opposed to first migrating the data to an external data warehouse). While documentation and support seem pretty clear regarding ...
Goal: To use Python 3.10.4+
Why: We have Python repos we want to use that are not backward compatible.
What: I have created an image from Databricks' example experimental containers, already with Ubuntu 22.04 (2 major versions newer than the current...
After searching for an hour, I realized what I needed to look for: importing Iterable from collections, which was removed in Python 3.10 (it lives in collections.abc now). I guess Databricks hasn't migrated its code yet, in which case I'm at a crossroads: Databricks on 3.9, local...
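For reference, the one-line fix in your own code, which works on both 3.9 and 3.10+:

# Fails on Python 3.10+, where the ABC aliases were removed from collections:
#   from collections import Iterable
# Portable form -- collections.abc has provided these ABCs since Python 3.3:
from collections.abc import Iterable

print(isinstance([1, 2, 3], Iterable))  # True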
I have a Hive table in Delta format with over 1B rows. When I check the Data Explorer in the SQL section of Databricks, it notes that the table size is 139.3 GiB with 401 files, but when I check the S3 bucket where the files are located (dbfs:/user/hive...
When you run updates, deletes, etc. on a Delta table, new files are created. However, the old files are not automatically deleted. This is to allow for features like time travel on Delta tables. In order to delete older files for a Delta table, you...
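The command this reply is heading toward is VACUUM. A minimal sketch as run from a Databricks notebook (where spark is predefined); the table name is a placeholder and 168 hours is the default 7-day retention:

# Remove files no longer referenced by the table and older than the retention window.
spark.sql('VACUUM my_delta_table RETAIN 168 HOURS')

# Preview which files would be deleted, without deleting anything:
spark.sql('VACUUM my_delta_table RETAIN 168 HOURS DRY RUN')

Note that vacuuming removes the files time travel depends on past the retention window, so shorten it with care.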
Exciting news for Databricks users! #databricks launched a new feature that allows users to run job workflows continuously. Setting up a continuous job workflow is straightforward: create a job and select the continuous trigger option in the scheduling...
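The same setup can be scripted. A hedged sketch against the Jobs 2.1 API -- the continuous block takes the place of a cron schedule, and the host, token, notebook path, and cluster ID are all placeholders:

import requests

HOST = 'https://<workspace>.cloud.databricks.com'  # placeholder
TOKEN = '<personal-access-token>'                  # placeholder

job_spec = {
    'name': 'continuous-ingest',
    # Run continuously instead of on a schedule; a new run starts when the previous one ends.
    'continuous': {'pause_status': 'UNPAUSED'},
    'tasks': [
        {
            'task_key': 'ingest',
            'notebook_task': {'notebook_path': '/Repos/team/ingest'},  # placeholder
            'existing_cluster_id': '<cluster-id>',                     # placeholder
        }
    ],
}

resp = requests.post(
    f'{HOST}/api/2.1/jobs/create',
    headers={'Authorization': f'Bearer {TOKEN}'},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json()['job_id'])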
Weekly Release Notes Recap
Here's a quick recap of the latest release notes updates from the past week.
Databricks platform release notes, February 21 - 28, 2023
Ray on Databricks (Public Preview): With Databricks Runtime 12.0 and above, you can create ...
Hi all, after some time working with DevOps and Repos and getting used to the convenience, our SSL cert situation got jacked up somehow. While not ideal, I'd like to be able to temporarily bypass cert verification. There are ways to do this in the shell...
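For Python code specifically, one common (and explicitly insecure) temporary workaround is to turn off verification per request and silence the resulting warning; this is a stopgap, not a substitute for fixing the cert chain:

import requests
import urllib3

# Suppress the InsecureRequestWarning that verify=False triggers.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Placeholder URL; verify=False skips certificate validation for this call only.
resp = requests.get('https://internal-server.example.com', verify=False)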
Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...
Yes, it does. Here is the syntax for watermarking: https://docs.databricks.com/sql/language-manual/sql-ref-syntax-qry-select-watermark.html
And here is the syntax for windowing: https://docs.databricks.com/sql/language-manual/sql-ref-window-functions.html
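For comparison, the equivalent in PySpark Structured Streaming, assuming a streaming DataFrame events with an event_time timestamp column (both names are placeholders):

from pyspark.sql.functions import window

counts = (
    events
    .withWatermark('event_time', '10 minutes')   # tolerate up to 10 minutes of late data
    .groupBy(window('event_time', '5 minutes'))  # 5-minute tumbling windows
    .count()
)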
Hi @Youssef Mrini, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback...
With Databricks Runtime 12.0 and above, you can create a Ray cluster and run Ray applications in Databricks with the Ray on Spark API.Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a ...
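A hedged sketch of the Ray on Spark API on Databricks Runtime 12.0+, with the worker count as a placeholder sized to your cluster:

import ray
from ray.util.spark import setup_ray_cluster, shutdown_ray_cluster

# Start Ray worker processes across the Spark cluster's nodes.
setup_ray_cluster(num_worker_nodes=2)  # placeholder sizing

ray.init()  # connect to the Ray cluster that was just started

@ray.remote
def square(x):
    return x * x

# Fan the work out across the Ray cluster and collect the results.
print(ray.get([square.remote(i) for i in range(4)]))  # [0, 1, 4, 9]

shutdown_ray_cluster()  # release the resources back to Spark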
Hi @Youssef Mrini, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...