11-11-2021 07:34 AM
Hi,
How is it possible to disable SSL certificate verification?
When calling the Databricks API I get this error:
SSLCertVerificationError
SSLCertVerificationError: ("hostname 'https' doesn't match either of '*.numericable.fr', 'numericable.fr'",)
MaxRetryError: HTTPSConnectionPool(host='https', port=443): Max retries exceeded with url: //adb-XXXXXXXXXXXXXX.azuredatabricks.net/api/2.0/workspace/mkdirs (Caused by SSLError(SSLCertVerificationError("hostname 'https' doesn't match either of '*.numericable.fr', 'numericable.fr'")))
when executing the following script
import requests

# DOMAIN is the workspace host (adb-XXXXXXXXXXXXXX.azuredatabricks.net), TOKEN a personal access token
response = requests.post(
    'https://{}/api/2.0/workspace/mkdirs'.format(DOMAIN),
    headers={'Authorization': 'Bearer {}'.format(TOKEN)},
    json={"path": "/Shared/released_notebooks/modules"}
)
Thanks
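For the literal question: requests can skip certificate verification with verify=False. A minimal sketch reusing the DOMAIN and TOKEN above (not recommended outside testing, and note that host='https' in the traceback usually points to the scheme ending up in the URL twice rather than a genuine certificate problem):

import requests
import urllib3

# Skips certificate validation entirely; silence the resulting warning
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

response = requests.post(
    'https://{}/api/2.0/workspace/mkdirs'.format(DOMAIN),
    headers={'Authorization': 'Bearer {}'.format(TOKEN)},
    json={"path": "/Shared/released_notebooks/modules"},
    verify=False  # disables SSL certificate verification
)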
11-12-2021 12:33 AM
Hi @Bertrand BURCKER, I tried your code and ended up with a different error. To create a directory, you can try the code below; it will create the folder for you.
import requests

# Pull the workspace URL and an API token from the notebook context
API_URL = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiUrl().getOrElse(None)
TOKEN = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiToken().getOrElse(None)

response = requests.post(
    API_URL + '/api/2.0/workspace/mkdirs',
    headers={"Authorization": "Bearer " + TOKEN},
    json={"path": "/Users/prabakarxxxxxxxx/Test_dir1/"}
)
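To confirm the folder was created, you can check the response and query the Workspace API's get-status endpoint; a small sketch reusing the same API_URL, TOKEN and example path from above:

# mkdirs returns HTTP 200 with an empty JSON body on success
print(response.status_code, response.text)

# get-status should report object_type DIRECTORY for the new folder
status = requests.get(
    API_URL + '/api/2.0/workspace/get-status',
    headers={"Authorization": "Bearer " + TOKEN},
    params={"path": "/Users/prabakarxxxxxxxx/Test_dir1/"}
)
print(status.json())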
11-12-2021 03:37 AM
Hi, thanks for your answer. Your code was a bit different from mine, and comparing the two I noticed an error in my code (I had "https://" twice...).
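For anyone hitting the same SSLCertVerificationError with host='https': it usually means the scheme appears twice in the URL, for example when DOMAIN already contains "https://". A small guard, assuming the DOMAIN and TOKEN variables from the first post:

import requests

# Strip any scheme already present in DOMAIN so "https://" is added exactly once
host = DOMAIN.replace('https://', '').replace('http://', '').rstrip('/')

response = requests.post(
    'https://{}/api/2.0/workspace/mkdirs'.format(host),
    headers={'Authorization': 'Bearer {}'.format(TOKEN)},
    json={"path": "/Shared/released_notebooks/modules"}
)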
11-17-2021 08:21 AM
Hi @Bertrand BURCKER, I believe you were able to resolve the issue. Would you be happy to mark the answer as best so that others can quickly find the solution in the future?
11-12-2021 08:29 AM
@Bertrand BURCKER - Thanks for letting us know your issue is resolved. If @Prabakar Ammeappin's answer solved the problem, would you be happy to mark his answer as best so others can more easily find an answer for this?