Databricks Free Edition Help
Engage in discussions about Databricks Free Edition within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and getting the most out of the Free Edition as you explore Databricks' capabilities.

Forum Posts

dona168lee
by New Contributor
  • 1850 Views
  • 1 reply
  • 0 kudos

I get an error when connecting boto3 to S3

Hi everyone, I'm trying to connect my local Python environment to an AWS S3 bucket using the boto3 library. However, when I try to run the code to list the files in my S3 bucket, I encounter an authentication error: import boto3; s3 = boto3.client('s3'); r...

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @dona168lee, can you share more details on how your environment is set up and which DBR version you are using? Also, how did you install the boto3 package, and which version of boto3 is installed?

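For the boto3 question above, the usual cause is that no AWS credentials are visible to the client. As a minimal sketch (not the original poster's code), this passes an access key pair explicitly; the key, region, and bucket name are placeholders:

```python
# Minimal sketch: pass credentials explicitly so the client does not depend on
# whatever the environment happens to provide. All values are placeholders.
import boto3
from botocore.exceptions import ClientError, NoCredentialsError

s3 = boto3.client(
    "s3",
    aws_access_key_id="<access-key-id>",
    aws_secret_access_key="<secret-access-key>",
    region_name="us-east-1",
)

try:
    response = s3.list_objects_v2(Bucket="<bucket-name>")
    for obj in response.get("Contents", []):
        print(obj["Key"])
except (NoCredentialsError, ClientError) as e:
    print(f"Authentication failed: {e}")
```

In practice you would normally keep the keys out of code (environment variables, a shared credentials file, or an instance profile) and only hard-code them for a quick local test.
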
nathan45shafer
by New Contributor
  • 1531 Views
  • 1 reply
  • 0 kudos

Queries on large Parquet files are running extremely slow

Hello everyone, I'm encountering an issue when querying large Parquet files in Databricks, particularly with files exceeding 1 GB in size. The queries are running extremely slow, and at times, they even time out. I've tried optimizing the file size an...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @nathan45shafer, thanks for your question. You can refer to https://www.databricks.com/discover/pages/optimize-data-workloads-guide, which covers good practices and actions to optimize your workloads. Please let me know if you have questions.

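Following on from the guide linked above, a common first step it describes is converting the Parquet data to Delta and compacting it. A minimal sketch of that idea; the path, table name, and Z-ORDER column are placeholders, and the right column is whichever one you filter on most:

```python
# Minimal sketch: load the Parquet files, write them out as a Delta table, then
# compact and co-locate the data. All names below are placeholders.
df = spark.read.parquet("/mnt/<path-to-parquet>/")

df.write.format("delta").mode("overwrite").saveAsTable("my_catalog.my_schema.events")

# Compact small files and cluster rows by a frequently filtered column.
spark.sql("OPTIMIZE my_catalog.my_schema.events ZORDER BY (event_date)")
```
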
ash1127
by New Contributor II
  • 1737 Views
  • 2 replies
  • 1 kudos

Access denied for Course

I can't access this course: https://www.databricks.com/training/catalog/advanced-machine-learning-operations-3508. When I register for this course, it displays the error below: Access denied. You do not have permission to access this page, please contact your admin...

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello, @ash1127! It looks like this post duplicates the one you recently posted. The original post has already been answered. I recommend continuing the discussion there to keep the conversation focused and organized. Let me know if you have any furth...

1 More Replies
violeta482yee
by New Contributor
  • 737 Views
  • 1 reply
  • 0 kudos

Notebook crashes when attempting to load a large 800 MB CSV file

Hello everyone, I'm facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar or have any suggestions on how to...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @violeta482yee, have you checked the resource availability of the compute attached to the notebook? One solution could be to use the cloudFiles.maxBytesPerTrigger option: it allows you to control the maximum number of bytes processed in each t...

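To illustrate the option mentioned above, here is a minimal Auto Loader sketch that caps how much data each micro-batch pulls in. The ADLS path, schema/checkpoint locations, target table, and the 128 MB cap are all placeholders or assumptions to adapt to your setup:

```python
# Minimal sketch: limit how many bytes Auto Loader processes per micro-batch so
# a single large CSV does not overwhelm a small cluster. Paths and names are
# placeholders.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/csv_load")   # placeholder
    .option("cloudFiles.maxBytesPerTrigger", "128m")                # soft cap per micro-batch
    .option("header", "true")
    .load("abfss://<container>@<storage-account>.dfs.core.windows.net/<path>/")
)

(
    df.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/csv_load")      # placeholder
    .trigger(availableNow=True)
    .toTable("my_catalog.my_schema.raw_csv")
)
```
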
charles898cabal
by New Contributor
  • 1278 Views
  • 1 reply
  • 0 kudos

please enter your credentials...

Hi everyone, I'm trying to connect my local Jupyter Notebook environment to Databricks Community Edition, and I've followed the setup guide on the Databricks website. I'm attempting to use the mlflow library, but I'm facing an issue where the authenti...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

What process are you following to create the PAT token? In Community Edition, PAT tokens are not allowed; instead, you need to use an OAuth token: https://docs.databricks.com/en/dev-tools/auth/oauth-u2m.html

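As a rough illustration of the U2M OAuth flow, here is a minimal sketch using the Databricks SDK for Python instead of a PAT. The host URL is a placeholder, and the "external-browser" auth type is an assumption based on the SDK's unified authentication options; check the linked doc for what your edition supports:

```python
# Minimal sketch: authenticate with OAuth U2M through the Databricks SDK.
# "external-browser" opens a browser window to sign in and caches the token
# locally for later calls. The host is a placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://<your-workspace-host>",
    auth_type="external-browser",
)

# Quick sanity check that the session is authenticated.
print(w.current_user.me().user_name)
```
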
arjunraghavdev
by New Contributor II
  • 1977 Views
  • 2 replies
  • 1 kudos

I was about to sign up for Community Edition but mistakenly signed up for the regular edition

I was about to sign up for Community Edition but mistakenly signed up for the regular edition. I want to convert my subscription to Community Edition; how do I do it? When I try to log in, I get an error that we are not able to find a Community Edition workspac...

Latest Reply
khtrivedi84
New Contributor II
  • 1 kudos

Same issue here. I couldn't use Community Edition with any other email either.

1 More Replies
RajeshRK
by Contributor II
  • 5699 Views
  • 11 replies
  • 1 kudos

Resolved! Generate a temporary table credential fails

Hi Team, I am trying to execute the below API, and it is failing. API: curl -v -X POST "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxxxxx" -d '{"table_id":"exte...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Your endpoint also seems to be incorrect. Per the doc https://docs.databricks.com/api/gcp/workspace/temporarytablecredentials/generatetemporarytablecredentials, it has to be /api/2.0/unity-catalog/temporary-table-credentials, and you are using /api/2.0...

10 More Replies
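Putting the corrections together, here is a minimal sketch of the same call from Python. The host, token, and table_id are placeholders; the "READ" operation value and the Authorization header name (the original curl used "Authentication") are based on the API doc linked above:

```python
# Minimal sketch: request a temporary table credential via the Unity Catalog
# REST API. Host, token, and table_id are placeholders.
import requests

host = "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com"

resp = requests.post(
    f"{host}/api/2.0/unity-catalog/temporary-table-credentials",
    headers={"Authorization": "Bearer <your-token>"},  # Authorization, not Authentication
    json={"table_id": "<table-uuid>", "operation": "READ"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```
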
DanScho
by New Contributor III
  • 4329 Views
  • 10 replies
  • 0 kudos

Resolved! API access via Python script

Hello, I try to access an API via a Python script. I have access using Postman, but if I try to access it via Python in Databricks, it will not work (timeout). Here is my code: import requests; import socket; url = 'https://prod-api.xsp.cloud.corpintra.net/t...

Latest Reply
DanScho
New Contributor III
  • 0 kudos

Nevertheless: Thank you very much for helping me!

9 More Replies
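Since the request works from Postman on a corporate machine but times out from Databricks, the cluster likely cannot reach what appears to be an internal endpoint. A minimal sketch for narrowing that down; the URL path and proxy address are placeholders, and the proxy itself is only an assumption about the network setup:

```python
# Minimal sketch: first check raw TCP reachability, then retry the request
# through an explicit proxy. The path and proxy URL are placeholders.
import socket
import requests

host = "prod-api.xsp.cloud.corpintra.net"
url = f"https://{host}/<path>"  # the real path was truncated in the post

# 1. Can this cluster resolve and reach the host at all?
try:
    socket.create_connection((host, 443), timeout=5).close()
    print("TCP connection OK")
except OSError as e:
    print(f"Cannot reach {host} from this cluster: {e}")

# 2. If the API is only reachable via a corporate proxy, pass it explicitly.
proxies = {"https": "http://<proxy-host>:<port>"}  # placeholder
resp = requests.get(url, proxies=proxies, timeout=30)
print(resp.status_code)
```
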
SaikiranTamide
by New Contributor III
  • 2376 Views
  • 7 replies
  • 0 kudos

Databricks - Subscription is not Active

Hello Team, my Databricks free trial has been extended till 2030, but somehow my account is showing "Subscription is not Active". Currently, I have one workspace running in my account, and it is not letting me create another workspace. I am also unsure...

SaikiranTamide_0-1733995581622.png
Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

@SaikiranTamide - I pinged you with additional details.

6 More Replies
mskulkarni1610
by New Contributor II
  • 1420 Views
  • 2 replies
  • 0 kudos

Community Edition isn't supporting importing .dbc files

Hello Databricks Community Support Team, I am using Databricks Community Edition to explore Databricks features. It has been observed that importing a .dbc file is no longer supported in Community Edition. The snippet below shows the error message we a...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

I was informed that this feature of importing workspace files does not work in Community Edition and that this is expected.

1 More Replies
RashidQuamarCog
by New Contributor III
  • 2798 Views
  • 9 replies
  • 0 kudos

Upload dbc file from another workspace

Hi team, I have a .dbc file from a different workspace. When I try to upload that .dbc into another workspace, it says the folder does not exist. Please suggest a solution.

Latest Reply
RashidQuamarCog
New Contributor III
  • 0 kudos

Hi, I found a workaround. Step 1: If you are using Azure or AWS, create an instance of a Databricks workspace. Step 2: Once the workspace is ready, import your .dbc (Databricks Archive) file. Step 3: This will show all the files within the .dbc. Step 4: Ex...

8 More Replies
JissMathew
by Valued Contributor
  • 885 Views
  • 1 reply
  • 0 kudos

Any quota limit in Auto Loader?

Hi all, I encountered an issue while trying to load an 800 MB CSV file from ADLS using Auto Loader. The notebook crashed during the process. Could anyone assist me with resolving this?

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@JissMathew What is the error that you are getting?

harold07douglas
by New Contributor II
  • 1680 Views
  • 1 reply
  • 0 kudos

I suspect that my environment variables are misconfigured

Hello, I've been trying to set up my local MLflow client to work with Databricks Community Edition, but I'm running into issues with authentication. I followed the official setup guide for integrating MLflow with Databricks, but when I try to run any ...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@harold07douglas Can you follow this doc?  https://docs.databricks.com/en/mlflow/access-hosted-tracking-server.html

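For reference, a minimal sketch of pointing a local MLflow client at a Databricks workspace via environment variables, in the spirit of the doc linked above. The host, token, and experiment path are placeholders, and whether a plain token works depends on your edition (see the OAuth discussion earlier on this board):

```python
# Minimal sketch: configure the MLflow client to log to Databricks with
# environment variables, then record a test run. All values are placeholders.
import os
import mlflow

os.environ["DATABRICKS_HOST"] = "https://<your-workspace-host>"
os.environ["DATABRICKS_TOKEN"] = "<your-token>"

mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Users/<your-user>/my-experiment")

with mlflow.start_run():
    mlflow.log_param("test_param", 1)
    mlflow.log_metric("test_metric", 0.5)
```
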
manojpatil04
by New Contributor III
  • 2551 Views
  • 1 reply
  • 1 kudos

Resolved! GCP Databricks 14-day free trial

I am using the GCP Databricks 14-day free trial. What happens after the 14-day trial is over? What happens to the already created workspaces?

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

After the 14-day free trial of Databricks on Google Cloud Platform (GCP), your account will transition depending on whether you have provided billing information or not. Here are the key outcomes: Transition to Paid Subscription: If you have provided...

mban-mondo
by New Contributor II
  • 2504 Views
  • 2 replies
  • 1 kudos

Resolved! Notebook Path Errors in Community Edition

I have the following notebook code in the Databricks UI: dbutils.entry_point.getDbutils().notebook().getContext().toJson(); notebook_path = dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get(); print(f"Current notebook path: {not...

Latest Reply
mban-mondo
New Contributor II
  • 1 kudos

Many thanks. It would be helpful if the error message said 'No access has been granted to this resource' instead of 'Resource not found'.

1 More Replies
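For anyone landing on this thread, here is a cleaned-up version of the snippet from the question, as a minimal sketch. It relies on the undocumented dbutils.notebook.entry_point context, so treat its behaviour across editions as an assumption, and note that dbutils is only defined inside a Databricks notebook:

```python
# Minimal sketch: read the current notebook's path and full context from inside
# a Databricks notebook (dbutils is provided by the notebook runtime).
import json

ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

notebook_path = ctx.notebookPath().get()
print(f"Current notebook path: {notebook_path}")

# The whole context is also available as JSON if other fields are needed.
print(json.dumps(json.loads(ctx.toJson()), indent=2))
```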