Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Forum Posts

arjunraghavdev
by New Contributor II
  • 1697 Views
  • 2 replies
  • 1 kudos

I was about to sign up for Community Edition but mistakenly signed up for the regular edition

I was about to sign up for Community Edition but mistakenly signed up for the regular edition. I want to convert my subscription to Community Edition; how do I do it? When I try to log in, I get an error that we are not able to find the Community Edition workspac...

Latest Reply
khtrivedi84
New Contributor II
  • 1 kudos

Same issue here. I couldn't access Community Edition with any other email either.

1 More Replies
RajeshRK
by Contributor II
  • 5071 Views
  • 11 replies
  • 1 kudos

Resolved! Generating a temporary table credential fails

Hi Team, I am trying to execute the below API, and it is failing. API: curl -v -X POST "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxxxxx" -d '{"table_id":"exte...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Your endpoint also seems to be incorrect. As per the doc https://docs.databricks.com/api/gcp/workspace/temporarytablecredentials/generatetemporarytablecredentials, it has to be /api/2.0/unity-catalog/temporary-table-credentials, and you are using /api/2.0...
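For reference, a minimal sketch in Python of the corrected request described in this reply, using the temporary-table-credentials endpoint from the linked doc; the host, token, and table_id below are placeholders, not values from this thread:

import requests

HOST = "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                         # placeholder token

# POST to the temporary-table-credentials endpoint mentioned in the reply,
# using an Authorization header (the header Databricks REST calls expect).
resp = requests.post(
    f"{HOST}/api/2.0/unity-catalog/temporary-table-credentials",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"table_id": "<table-uuid>", "operation": "READ"},  # placeholder values
)
resp.raise_for_status()
print(resp.json())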

10 More Replies
DanScho
by New Contributor III
  • 3638 Views
  • 10 replies
  • 0 kudos

Resolved! API access via Python script

Hello, I try to access an API via a Python script. I have access using Postman, but if I try to access it via Python in Databricks, it will not work (timeout). Here is my code: import requests; import socket; url = 'https://prod-api.xsp.cloud.corpintra.net/t...
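One way to narrow down a timeout like this is to call the endpoint from the notebook with an explicit timeout so the failure surfaces quickly; a minimal sketch, with a hypothetical URL in place of the poster's internal one:

import requests

url = "https://internal-api.example.com/health"  # hypothetical endpoint, not the poster's

try:
    resp = requests.get(url, timeout=10)  # fail fast instead of hanging
    print(resp.status_code, resp.text[:200])
except requests.exceptions.RequestException as err:
    # A timeout from the cluster while Postman works from a laptop usually points
    # to network egress/firewall rules rather than the Python code itself.
    print(f"Could not reach the endpoint from this cluster: {err}")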

Latest Reply
DanScho
New Contributor III
  • 0 kudos

Nevertheless: Thank you very much for helping me!

9 More Replies
SaikiranTamide
by New Contributor III
  • 2004 Views
  • 7 replies
  • 0 kudos

Databricks - Subscription is not Active

Hello Team, my Databricks free trial has been extended till 2030, but somehow my account is showing "Subscription is not Active". Currently, I have one workspace running in my account, and it is not letting me create another workspace. I am also unsure...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

@SaikiranTamide - I pinged you with additional details.

6 More Replies
mskulkarni1610
by New Contributor II
  • 1165 Views
  • 2 replies
  • 0 kudos

Community Edition no longer supports importing .dbc files

Hello Databricks Community Support Team, I am using Databricks Community Edition to explore Databricks features. It has been observed that importing a .DBC file is no longer supported in Community Edition. The below snippet shows the error message we a...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

I was informed that this feature of importing workspace files does not work, and that this is expected behavior for Community Edition.

1 More Replies
RashidQuamarCog
by New Contributor III
  • 2290 Views
  • 9 replies
  • 0 kudos

Upload dbc file from another workspace

Hi team, I have a dbc file from a different workspace. When I try to upload that dbc into another workspace, it says the folder does not exist. Please suggest a solution.

Latest Reply
RashidQuamarCog
New Contributor III
  • 0 kudos

Hi, I found a workaround: Step 1: If you are using Azure or AWS, create an instance of a Databricks workspace. Step 2: Once the workspace is ready, import your dbc (Databricks Archive) file. Step 3: This will surely show all the files within the dbc. Step 4: Ex...
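Assuming the truncated Step 4 exports the notebooks back out of that temporary workspace, one way to do it is the Workspace export API; a minimal sketch with placeholder host, token, and notebook path:

import base64
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

# Export a single notebook as plain source so it can be re-imported elsewhere.
resp = requests.get(
    f"{HOST}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": "/Users/me@example.com/my_notebook", "format": "SOURCE"},  # placeholder path
)
resp.raise_for_status()
print(base64.b64decode(resp.json()["content"]).decode("utf-8"))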

8 More Replies
JissMathew
by Valued Contributor
  • 747 Views
  • 1 replies
  • 0 kudos

Any quota limit in Auto Loader?

Hi all, I encountered an issue while trying to load an 800 MB CSV file from ADLS using Auto Loader. The notebook crashed during the process. Could anyone assist me with resolving this?
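For context, a minimal sketch of how such a load is typically set up with Auto Loader, using placeholder paths; the cloudFiles.maxBytesPerTrigger option caps how much data each micro-batch pulls in, which can help when a single large file overwhelms a small trial cluster:

# Runs inside a Databricks notebook where `spark` is already defined.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/tmp/autoloader_demo/schema")  # placeholder
    .option("cloudFiles.maxBytesPerTrigger", "100m")                     # cap per micro-batch
    .option("header", "true")
    .load("abfss://container@account.dfs.core.windows.net/input/")       # placeholder ADLS path
)

(
    df.writeStream
    .option("checkpointLocation", "/tmp/autoloader_demo/checkpoint")     # placeholder
    .trigger(availableNow=True)
    .toTable("bronze_csv_demo")                                          # placeholder table name
)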

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@JissMathew What is the error that you are getting?

harold07douglas
by New Contributor II
  • 1399 Views
  • 1 replies
  • 0 kudos

I suspect that my environment variables

Hello, I've been trying to set up my local MLflow client to work with Databricks Community Edition, but I'm running into issues with authentication. I followed the official setup guide for integrating MLflow with Databricks, but when I try to run any ...

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@harold07douglas Can you follow this doc?  https://docs.databricks.com/en/mlflow/access-hosted-tracking-server.html
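For reference, a minimal sketch of the Databricks-hosted tracking setup that doc describes, with placeholder host, token, and experiment path (Community Edition credentials may differ from a standard personal access token):

import os
import mlflow

os.environ["DATABRICKS_HOST"] = "https://<your-workspace>.cloud.databricks.com"  # placeholder
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"                        # placeholder

mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Users/<your-email>/mlflow-demo")  # placeholder workspace path

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.87)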

manojpatil04
by New Contributor III
  • 2300 Views
  • 1 replies
  • 1 kudos

Resolved! GCP Databricks 14-day free trial

I am using the GCP Databricks 14-day free trial. What would happen after the 14-day trial is over? What happens to the already created workspaces?

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

After the 14-day free trial of Databricks on Google Cloud Platform (GCP), your account will transition depending on whether you have provided billing information or not. Here are the key outcomes: Transition to Paid Subscription: If you have provided...

mban-mondo
by New Contributor II
  • 2224 Views
  • 2 replies
  • 1 kudos

Resolved! Notebook Paths Errors in Community Edition

I have the following notebook in the Databricks UI: dbutils.entry_point.getDbutils().notebook().getContext().toJson(); notebook_path = dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get(); print(f"Current notebook path: {not...
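For readability, the snippet quoted above, reformatted as it would run inside a Databricks notebook (dbutils is only available in the notebook environment):

notebook_path = (
    dbutils.notebook.entry_point.getDbutils()
    .notebook()
    .getContext()
    .notebookPath()
    .get()
)
print(f"Current notebook path: {notebook_path}")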

Latest Reply
mban-mondo
New Contributor II
  • 1 kudos

Many thanks. It would be helpful if the error message said 'No access has been granted to this resource' instead of 'Resource not found'.

1 More Replies
JissMathew
by Valued Contributor
  • 1709 Views
  • 3 replies
  • 2 kudos

Multi Node Cluster

Hi All, using a free trial subscription, can we create a multi-node cluster? Is it possible in any region?

Latest Reply
michelle653burk
New Contributor III
  • 2 kudos

@JissMathew wrote: "Hi All, using a free trial subscription, can we create a multi-node cluster? Is it possible in any region?" Hi there! Unfortunately, with a free trial subscription, you can only create a single-node cluster. The free trial typically pr...

2 More Replies
vickybhoir23
by New Contributor
  • 923 Views
  • 1 replies
  • 0 kudos

Request to provide the Databricks help desk number

Hi team, due to my mother's health issues, I didn't attend the Databricks Professional exam which was scheduled on 25th Nov 2024. So I want to attend the exam again. If there is any solution or way for that, please let me know. Request you to provide the help de...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

File a ticket at https://help.databricks.com/s/contact-us?ReqType=training and @Cert_TeamOPS will assist you.

harold07douglas
by New Contributor II
  • 1115 Views
  • 1 replies
  • 1 kudos

When I try to run any MLflow command, I encounter the following error

Hello, I'm trying to connect my local MLflow client to Databricks Community Edition, but I'm running into issues with authentication. I followed the setup guide for MLflow integration with Databricks, but when I try to run any MLflow command, I encounte...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Can you share the guide you are referring to? Also, please let us know what credentials you are using.

fjdslelsj
by New Contributor II
  • 1580 Views
  • 3 replies
  • 0 kudos

DBFS Explorer displays outdated data

Hey, I use Databricks 15.4 ML LTS (and before that, Databricks 12 LTS). When clicking on Data Ingestion, Upload files to DBFS -> DBFS and then clicking on the 'mnt' folder, I see outdated content. When I however use the web terminal of Databricks or the tools d...
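One way to cross-check what DBFS actually contains, independent of the upload UI, is to list the mount from a notebook; a minimal sketch with a placeholder mount path:

# Lists the files Databricks itself sees under the mount, for comparison with the UI view.
for f in dbutils.fs.ls("/mnt/<your-mount>"):
    print(f.path, f.size)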

Latest Reply
fjdslelsj
New Contributor II
  • 0 kudos

The problem persists. Is this the correct channel to report this issue, which we assume to be a bug in Databricks? Where else can one report such an issue?

2 More Replies