09-17-2024 08:33 AM
Hi, I have enrolled in the Databricks Apache Spark SQL for Data Analysts course on Coursera.
I am in Module 3 and imported the file through the URL given in the course material. In 3.2 Basic Queries I ran the first block, which contains the command "%run ../Includes/Classroom-Setup".
That runs fine, but when I try to run the next cell, i.e. the CREATE TABLE command, I get an error. I am new to all this, so please guide me to resolve the error ASAP. Attaching a screenshot for reference. Thank you!
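For context, the failing cell creates a table on top of the course's sample parquet data. A minimal sketch of that kind of cell, where the table name and options are my assumptions (the exact statement comes from the course notebook):
# Hypothetical sketch of the failing CREATE TABLE cell (table name/options are assumptions)
spark.sql("""
  CREATE TABLE IF NOT EXISTS people10m
  USING parquet
  OPTIONS (path '/mnt/training/dataframes/people-10m.parquet')
""")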
09-17-2024 09:49 AM
Hi @nir777 ,
Run "%fs ls /mnt/training/dataframes/" to see whether you see the file in the folder.
Run "%fs head /mnt/training/dataframes/people-10m.parquet" to see whether you can access the content.
First, we need to make sure that the file is in that location and that you have permission to access it.
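If the %fs magics are not convenient, the same checks can be done from Python with dbutils; a minimal sketch, using the path from the course:
# List the folder and peek at the file to confirm it exists and is readable
display(dbutils.fs.ls("/mnt/training/dataframes/"))
print(dbutils.fs.head("/mnt/training/dataframes/people-10m.parquet"))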
09-17-2024 10:48 AM
Hi @filipniziol, thanks for your answer. I tried the commands but it's showing an
AccessDeniedException. I don't know how to get permissions.
09-17-2024 11:08 AM - edited 09-17-2024 11:08 AM
Hi @nir777 ,
Have you mounted your S3 bucket correctly?
Could you just run %fs ls /mnt/training and see whether you can see the files or folders?
What could be happening (the sketch below helps tell these apart):
1. The path is incorrect: /mnt/training is mounted correctly, but the file is in a different folder.
2. The file is missing.
3. The S3 bucket is mounted incorrectly.
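To check which case you are in, you can inspect the current mounts; a minimal sketch:
# Print every mount under /mnt/training and the S3 source it points to
for m in dbutils.fs.mounts():
    if m.mountPoint.startswith("/mnt/training"):
        print(m.mountPoint, "->", m.source)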
09-17-2024 11:53 AM
Hi @filipniziol, sharing a screenshot as I am not sure what exactly it means.
09-17-2024 12:06 PM
The first line is incorrect; it should be %fs ls /mnt/training instead of %python ls /mnt/training.
So you have this mounted:
Are you able to run the code below and see the contents?
# List the contents of /mnt/training
display(dbutils.fs.ls("/mnt/training"))
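If that call fails, the exception text usually tells you whether it is a permission problem or a missing path. A minimal sketch (the string checks are assumptions about the error message, not an exact API):
# Distinguish "access denied" from "path does not exist"
try:
    display(dbutils.fs.ls("/mnt/training"))
except Exception as e:
    if "AccessDenied" in str(e):
        print("Mounted, but the credentials lack permission to read the bucket.")
    elif "FileNotFound" in str(e):
        print("The mount point or path does not exist.")
    else:
        raise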
09-17-2024 12:12 PM - edited 09-17-2024 12:17 PM
If you get AccessDeniedException when trying to list "/mnt/training", then I would:
1. unmount the directory:
dbutils.fs.unmount("/mnt/training")
2. mount the directory once again - check the docs:
access_key = "<aws-access-key>"          # AWS access key ID
secret_key = "<aws-secret-key>"          # AWS secret access key
encoded_secret_key = secret_key.replace("/", "%2F")  # URL-encode any "/" in the secret
aws_bucket_name = "<aws-bucket-name>"    # bucket that holds the training data
mount_name = "<mount-name>"              # e.g. "training"
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")
display(dbutils.fs.ls(f"/mnt/{mount_name}"))  # verify the mount by listing it
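Rather than hardcoding keys in a notebook, you can read them from a Databricks secret scope; a minimal sketch, assuming a scope named "aws" with keys "access-key" and "secret-key" already exists (the scope and key names are assumptions):
# Read credentials from a secret scope instead of pasting them into the notebook
access_key = dbutils.secrets.get(scope="aws", key="access-key")
secret_key = dbutils.secrets.get(scope="aws", key="secret-key")
encoded_secret_key = secret_key.replace("/", "%2F")
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")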
10-03-2024 12:02 PM
Hi, I enrolled in the same Coursera Databricks course and ran into the same issue.
I unmounted and re-ran the classroom setup as follows:
However, when I went on to list the mounted directory, the same issue occurred.
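For reference, what I tried is roughly the sketch below: unmount any stale mount, re-run the course setup in its own cell, then confirm what /mnt/training points to (the %run line has to stay in a separate cell):
# Remove a stale /mnt/training mount if one exists
if any(m.mountPoint == "/mnt/training" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/training")
# In a separate cell, re-run the course setup:
#   %run ../Includes/Classroom-Setup
# Then confirm where /mnt/training points:
print([m.source for m in dbutils.fs.mounts() if m.mountPoint == "/mnt/training"])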