Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Forum Posts

Siddalinga
by New Contributor
  • 522 Views
  • 1 reply
  • 0 kudos

Getting error while creating external delta table in Databricks

I am getting the below error while creating an external delta table in Databricks, even though there is an external location created. [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] No parent external location was found for path 'abfss://destination@datalakeprojectsid.dfs.co...

Latest Reply
Takuya-Omi
Valued Contributor II
  • 0 kudos

@Siddalinga If the path specified during table creation is outside the scope of the external location, you may encounter the [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] error. Is the external location correctly defined to scope the directory, such as abfss...

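The scoping rule described in the reply can be illustrated with a small path check. This is a sketch using hypothetical container and account names, not the poster's actual paths: a table path must resolve to somewhere underneath a registered external location's URL, or Unity Catalog raises [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH].

```python
from urllib.parse import urlparse

def is_under_location(location_url: str, table_path: str) -> bool:
    """Return True if table_path falls inside location_url: same scheme,
    same container/storage account, and a sub-path of the location path."""
    loc, tbl = urlparse(location_url), urlparse(table_path)
    loc_path = loc.path.rstrip("/")
    return (
        (loc.scheme, loc.netloc) == (tbl.scheme, tbl.netloc)
        and (tbl.path.rstrip("/") == loc_path or tbl.path.startswith(loc_path + "/"))
    )

# Hypothetical external location registered in Unity Catalog:
location = "abfss://data@myaccount.dfs.core.windows.net/landing"

# A table path inside the location is fine; one outside it triggers the error.
print(is_under_location(location, "abfss://data@myaccount.dfs.core.windows.net/landing/sales"))   # True
print(is_under_location(location, "abfss://other@myaccount.dfs.core.windows.net/landing/sales"))  # False
```

If the second case matches your setup, either register an external location that covers the parent path or move the table path under an existing one.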
grisma56giga
by New Contributor
  • 302 Views
  • 1 reply
  • 0 kudos

an issue when trying to connect to my AWS S3 bucket from my local Python environment

Hi everyone, I’m having an issue when trying to connect to my AWS S3 bucket from my local Python environment using the boto3 library.

import boto3
s3 = boto3.client('s3')
response = s3.list_objects_v2(Bucket='my-bucket-name')
print(response)

I keep gettin...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @grisma56giga, the error typically indicates that your Python environment does not have SSL support enabled. Can you run this to validate:

import ssl
print(ssl.OPENSSL_VERSION)

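The suggested check, expanded slightly: if the interpreter was built without SSL support, `import ssl` itself fails, which is also why boto3 cannot open an HTTPS connection to S3. A minimal diagnostic sketch:

```python
try:
    import ssl
    # A working build reports the linked OpenSSL version string.
    print("SSL support available:", ssl.OPENSSL_VERSION)
except ImportError:
    # A Python built without OpenSSL cannot talk HTTPS, so boto3 calls fail.
    print("SSL support missing: reinstall Python with OpenSSL available")
```

If the import fails, the fix is at the interpreter level (reinstall or rebuild Python with OpenSSL), not in the boto3 code.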
ChristopherAlan
by New Contributor II
  • 476 Views
  • 1 reply
  • 0 kudos

Essential-PySpark-for-Scalable-Data-Analytics "wordcount-sql.ipynb"

I'm working through the code at the following, but getting an error: https://github.com/PacktPublishing/Essential-PySpark-for-Scalable-Data-Analytics/blob/main/Chapter01/wordcount-sql.ipynb

Code:
%sql
DROP TABLE IF EXISTS word_counts;
CREATE TABLE word_...

Latest Reply
ChristopherAlan
New Contributor II
  • 0 kudos

Correction: The error message from the screenshot is when I tried to add the dbms: prefix to the URL. The error message without that prefix is the following: UnityCatalogServiceException: [RequestId=dbda5aee-b855-9ed9-abf8-3ee0e0dcc938 ErrorClass=IN...

dmptiprabhakar
by New Contributor II
  • 480 Views
  • 2 replies
  • 0 kudos

Unable to login community.cloud.databricks.com site getting "User is not a member of this workspace"

Unable to log in to the community.cloud.databricks.com site; getting "User is not a member of this workspace". Email ID: dmpti.prabhakar@gmail.com

Latest Reply
dmptiprabhakar
New Contributor II
  • 0 kudos

@Rishabh-Pandey I tried both ways, but it is not working.

1 More Replies
nima30frenkline
by New Contributor
  • 573 Views
  • 0 replies
  • 0 kudos

set up my AWS credentials and configure boto3, but I’m facing an issue

Hi everyone, I’m working on a project where I need to upload files from my local machine to an AWS S3 bucket using the boto3 library in Python. I’ve followed the official documentation to set up my AWS credentials and configure boto3, but I’m facing a...

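This post has no replies yet, but a common cause of boto3 upload failures from a local machine is credential configuration. boto3 reads the standard AWS shared credentials file; the sketch below writes and re-reads that file format using only the standard library (the key values are placeholders, not real credentials):

```python
import configparser
import io

# The shared-credentials format boto3 looks for in ~/.aws/credentials.
credentials = configparser.ConfigParser()
credentials["default"] = {
    "aws_access_key_id": "AKIA-PLACEHOLDER",        # placeholder, not a real key
    "aws_secret_access_key": "SECRET-PLACEHOLDER",  # placeholder, not a real key
}

buf = io.StringIO()
credentials.write(buf)
print(buf.getvalue())

# With this file in place, boto3.client("s3") picks up the "default" profile
# automatically, so no keys need to be hard-coded in the upload script.
```

Checking that this file exists, parses, and contains the profile you expect is a quick first step before debugging the upload call itself.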
dona168lee
by New Contributor
  • 1170 Views
  • 1 reply
  • 0 kudos

I get an error

Hi everyone, I'm trying to connect my local Python environment to an AWS S3 bucket using the boto3 library. However, when I try to run the code to list the files in my S3 bucket, I encounter an authentication error:

import boto3
s3 = boto3.client('s3')
r...

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @dona168lee, can you share additional details on how your environment is set up and which DBR version you are using? Also, how did you install the boto3 package, and which version of boto3 is installed?

nathan45shafer
by New Contributor
  • 357 Views
  • 1 reply
  • 0 kudos

queries are running extremely slow

Hello everyone, I’m encountering an issue when querying large Parquet files in Databricks, particularly with files exceeding 1 GB in size. The queries are running extremely slow, and at times, they even time out. I’ve tried optimizing the file size an...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @nathan45shafer, thanks for your question. You can refer to https://www.databricks.com/discover/pages/optimize-data-workloads-guide, which covers good practices and actions to optimize your workflow. Please let me know if you have questions.

ash1127
by New Contributor II
  • 494 Views
  • 2 replies
  • 0 kudos

Access denied for Course

I can't access this course: https://www.databricks.com/training/catalog/advanced-machine-learning-operations-3508. When I register for this course, it displays the below error:

Access denied
You do not have permission to access this page, please contact your admin...

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello, @ash1127! It looks like this post duplicates the one you recently posted. The original post has already been answered. I recommend continuing the discussion there to keep the conversation focused and organized. Let me know if you have any furth...

1 More Replies
violeta482yee
by New Contributor
  • 294 Views
  • 1 reply
  • 0 kudos

when attempting to load a large 800 MB CSV file

Hello everyone, I’m facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar or have any suggestions on how to...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @violeta482yee, have you checked the resource availability of the compute attached to the notebook? One solution could be to use the cloudFiles.maxBytesPerTrigger option: this option allows you to control the maximum number of bytes processed in each t...

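The option mentioned in the reply is an Auto Loader setting passed as a stream reader option. A sketch of the option map is below; the 128 MB value is an illustrative choice, not a recommendation, and in a notebook these would be applied via spark.readStream.format("cloudFiles").options(...).load(path):

```python
# Hypothetical Auto Loader configuration limiting bytes per micro-batch so a
# single large CSV does not overwhelm the cluster in one trigger.
autoloader_options = {
    "cloudFiles.format": "csv",
    "cloudFiles.maxBytesPerTrigger": "128m",  # cap data processed per trigger
    "header": "true",
}

# In a Databricks notebook (not runnable locally):
# df = (spark.readStream.format("cloudFiles")
#         .options(**autoloader_options)
#         .load("abfss://<container>@<account>.dfs.core.windows.net/<path>"))

print(autoloader_options["cloudFiles.maxBytesPerTrigger"])
```

Splitting the load into smaller micro-batches this way trades end-to-end latency for a bounded memory footprint per trigger.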
charles898cabal
by New Contributor
  • 506 Views
  • 1 reply
  • 0 kudos

please enter your credentials...

Hi everyone, I'm trying to connect my local Jupyter Notebook environment to Databricks Community Edition, and I’ve followed the setup guide on the Databricks website. I’m attempting to use the mlflow library, but I’m facing an issue where the authenti...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

What is the process you are following to create the PAT token? In Community Edition the PAT token is not allowed; instead you need to use an OAuth token: https://docs.databricks.com/en/dev-tools/auth/oauth-u2m.html

Rajeshwar_Reddy
by New Contributor II
  • 677 Views
  • 2 replies
  • 0 kudos

ODBC connection issue Simba 64 bit

Hello all, I am getting the below error when trying to create an ODBC DSN (Simba 64-bit) on my local system to connect to the Databricks server using a token, with SSL (System Trust Store) and Thrift Transport: HTTP enabled. Any helping hand is really appreciated. [Simba][ThriftE...

Latest Reply
Rajeshwar_Reddy
New Contributor II
  • 0 kudos

Hi Alberto_Umana, my connection string is perfect. What do you want me to cover here?

1 More Replies
arjunraghavdev
by New Contributor II
  • 690 Views
  • 2 replies
  • 1 kudos

I was about to sign up for Community Edition but mistakenly signed up for the regular edition

I was about to sign up for Community Edition but mistakenly signed up for the regular edition. I want to convert my subscription into Community Edition; how do I do it? When I try to log in, I get an error that we are not able to find a Community Edition workspac...

Latest Reply
khtrivedi84
New Contributor II
  • 1 kudos

Same issue here. Couldn't use Community Edition with any other email either.

1 More Replies
RajeshRK
by Contributor II
  • 2022 Views
  • 11 replies
  • 1 kudos

Resolved! Generate a temporary table credential fails

Hi Team, I am trying to execute the below API, and it is failing.

API:
curl -v -X POST "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxxxxx" -d '{"table_id":"exte...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Your endpoint also seems to be incorrect. As per the doc https://docs.databricks.com/api/gcp/workspace/temporarytablecredentials/generatetemporarytablecredentials, it has to be /api/2.0/unity-catalog/temporary-table-credentials, and you are using /api/2.0...

10 More Replies
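The endpoint correction can be made concrete by building the URL. Note that the original curl also used an "Authentication" header where HTTP bearer auth expects "Authorization". This is a sketch with a placeholder workspace host and token, not a runnable request:

```python
# Placeholder workspace URL and token -- substitute your own values.
workspace = "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com"
token = "<personal-access-token>"

# Corrected endpoint: temporary-table-credentials,
# not temporary-stage-credentials as in the original post.
endpoint = f"{workspace}/api/2.0/unity-catalog/temporary-table-credentials"

# Standard bearer-auth header (the HTTP header name is "Authorization").
headers = {"Authorization": f"Bearer {token}"}

print(endpoint)
```

With the path and header fixed, the request body (table_id, operation) can be POSTed to this endpoint as in the documentation linked above.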
DanScho
by New Contributor III
  • 1565 Views
  • 10 replies
  • 0 kudos

Resolved! API access via python script

Hello, I am trying to access an API via a Python script. I have access using Postman, but if I try to access it via Python in Databricks, it will not work (timeout). Here is my code:

import requests
import socket
url = 'https://prod-api.xsp.cloud.corpintra.net/t...

Latest Reply
DanScho
New Contributor III
  • 0 kudos

Nevertheless: Thank you very much for helping me!

9 More Replies
SaikiranTamide
by New Contributor III
  • 1203 Views
  • 7 replies
  • 0 kudos

Databricks - Subscription is not Active

Hello Team, my Databricks free trial has been extended till 2030, but somehow my account is showing "Subscription is not Active". Currently, I have one workspace running in my account, and it is not letting me create another workspace. I am also unsure...

[Screenshot attachment: SaikiranTamide_0-1733995581622.png]
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

@SaikiranTamide - I pinged you with additional details.

6 More Replies