05-23-2022 10:47 PM
I am trying to write to my Google Sheet from Databricks, but when it comes to reading the JSON file containing the credentials, I get an error that no such file or directory exists.
import gspread
gc = gspread.service_account(filename='dbfs:/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json')
sh = gc.open_by_key('1********_g3kkhhA_9vMyp9piw')
worksheet = sh.sheet1
worksheet.update_cell(1,1,103032)
I get the following error when I run the above code:
FileNotFoundError: [Errno 2] No such file or directory: 'dbfs:/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json'
Accepted Solutions
06-01-2022 08:57 PM
Hi @Devarsh Shah, the issue is not with the JSON file but with the location you are specifying when reading it. As suggested by @Werner Stinckens, please use the Spark API to read the JSON file, as below:
spark.read.format("json").load("testjson")
If you still face the same issue, please check the path you are providing in the read command.
https://docs.databricks.com/data/data-sources/read-json.html
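To go from that Spark read to gspread, one option is sketched below: load the single-object credentials file with the multiLine option and convert the resulting row into a plain dict, which can then be passed to gspread.service_account_from_dict (the function used further down in this thread). This is a sketch only; the path is a placeholder for wherever the file was uploaded in DBFS.

import gspread

# Sketch: read the service-account JSON with the Spark API.
# A credentials file is a single multi-line JSON object, so multiLine is needed.
creds_path = "dbfs:/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json"
creds_df = spark.read.format("json").option("multiLine", "true").load(creds_path)

# Turn the single row into a plain Python dict and hand it to gspread.
credentials = creds_df.first().asDict()
gc = gspread.service_account_from_dict(credentials)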
05-25-2022 12:14 AM
So open_by_key is a function of gspread, which is a Python library.
Pure Python (not PySpark) will only read from local files;
see this topic as well. It is about databricks-connect and pandas, but the same principles apply.
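For reference, DBFS is also exposed on the driver as a local FUSE mount under /dbfs, so a pure-Python library such as gspread can read the uploaded credentials if the dbfs:/ URI prefix is replaced with the /dbfs/ local path. A minimal sketch, assuming that mount is available on the cluster:

import gspread

# Sketch: gspread only understands local filesystem paths, so use the
# /dbfs FUSE mount instead of the dbfs:/ URI scheme.
local_path = "/dbfs/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json"
gc = gspread.service_account(filename=local_path)
sh = gc.open_by_key("1********_g3kkhhA_9vMyp9piw")
sh.sheet1.update_cell(1, 1, 103032)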
05-25-2022 02:23 AM
Yes @Werner Stinckens, I tried different ways to read the JSON file from the path where it was stored but was unable to do so. Instead, I copied the contents of the JSON file into a variable and passed it to the following function:
credentials = {"type":"service_account","project_id":"myapi-.................}
gc = gspread.service_account_from_dict(credentials)
So now there is no need to read from any JSON file.
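End to end, the workaround looks roughly like the sketch below. The field names are the standard ones in a downloaded service-account file, and every value here is a placeholder rather than the original credentials.

import gspread

# Sketch of the full workaround: credentials pasted as a dict instead of
# being read from a file (all values below are placeholders).
credentials = {
    "type": "service_account",
    "project_id": "myapi-...",
    "private_key_id": "...",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "...@myapi-....iam.gserviceaccount.com",
    "client_id": "...",
    "token_uri": "https://oauth2.googleapis.com/token",
}

gc = gspread.service_account_from_dict(credentials)
sh = gc.open_by_key("1********_g3kkhhA_9vMyp9piw")
sh.sheet1.update_cell(1, 1, 103032)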