Data Engineering
Getting the error 'No such file or directory', when trying to access the json file

Devarsh
Contributor

I am trying to write to my Google Sheet from Databricks, but when it comes to reading the JSON file containing the credentials, I get the error that no such file or directory exists.

import gspread

# Load the service-account credentials uploaded to DBFS
gc = gspread.service_account(filename='dbfs:/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json')

sh = gc.open_by_key('1********_g3kkhhA_9vMyp9piw')
worksheet = sh.sheet1
worksheet.update_cell(1, 1, 103032)

I get the following error when I run the code above:

FileNotFoundError: [Errno 2] No such file or directory: 'dbfs:/FileStore/shared_uploads/abc@gmail.com/myapi_3115465_2ed05-3.json'

1 ACCEPTED SOLUTION

Noopur_Nigam
Valued Contributor II

Hi @Devarsh Shah​ The issue is not with the JSON file but with the location you are specifying when reading it. As suggested by @Werner Stinckens​, please use the Spark API to read the JSON file, as below:

spark.read.format("json").load("testjson")

If you still face the same issue, please double-check the path you are passing to the read command.

https://docs.databricks.com/data/data-sources/read-json.html
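If you want to keep using gspread's file-based loader instead, note that on Databricks DBFS is also exposed through a FUSE mount at /dbfs, so a dbfs:/ URI can be rewritten as a plain local path that ordinary Python I/O understands. A minimal sketch (the helper name and the shortened file name are my own, not part of any library):

```python
def to_fuse_path(dbfs_uri: str) -> str:
    """Translate a dbfs:/ URI into the /dbfs FUSE path that plain Python can open."""
    prefix = "dbfs:/"
    if dbfs_uri.startswith(prefix):
        return "/dbfs/" + dbfs_uri[len(prefix):]
    return dbfs_uri

# e.g. gspread.service_account(filename=to_fuse_path("dbfs:/FileStore/.../creds.json"))
print(to_fuse_path("dbfs:/FileStore/shared_uploads/abc@gmail.com/creds.json"))
# -> /dbfs/FileStore/shared_uploads/abc@gmail.com/creds.json
```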



-werners-
Esteemed Contributor III

open_by_key is a function of gspread, which is a plain Python library.

Pure Python (not PySpark) will only read from local files.

See this topic as well. It is about databricks-connect and pandas, but the same principles apply.
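To illustrate the point, a small sketch: Python's built-in open() treats dbfs:/... as a literal (relative) path, so it fails exactly as in the original traceback. The file name here is a placeholder:

```python
# Plain Python file I/O does not understand the dbfs:/ URI scheme;
# it looks for a literal directory named "dbfs:" and fails.
try:
    open("dbfs:/FileStore/shared_uploads/example/creds.json")
except FileNotFoundError as err:
    print(err)  # [Errno 2] No such file or directory: 'dbfs:/FileStore/...'
```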

Yes @Werner Stinckens​, I tried different ways to read the JSON file from the path where it was stored, but was unable to do so. Instead, I copied the contents of the JSON file into a variable and passed that to the following function:

credentials = {"type":"service_account","project_id":"myapi-.................}

gc = gspread.service_account_from_dict(credentials)

So now there is no need to read from a JSON file at all.
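For completeness, a minimal sketch of that workaround. The credential values below are placeholders; in practice you would paste the full service-account JSON (or, better, fetch it from a secret scope rather than hard-coding it in a notebook):

```python
import json

# Placeholder service-account payload; the real JSON has more fields
raw = '{"type": "service_account", "project_id": "myapi-placeholder"}'
credentials = json.loads(raw)

# gspread can build a client directly from the dict, so no file path is needed:
# import gspread
# gc = gspread.service_account_from_dict(credentials)
print(credentials["type"])  # service_account
```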

