- 1637 Views
- 10 replies
- 0 kudos
Resolved! API access via python script
Hello, I am trying to access an API via a Python script. I have access using Postman, but when I try to access it via Python in Databricks, it does not work (timeout). Here is my code:
import requests
import socket
url = 'https://prod-api.xsp.cloud.corpintra.net/t...
Nevertheless: Thank you very much for helping me!
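For anyone hitting the same timeout, here is a minimal sketch of calling an internal API from a Databricks notebook with an explicit timeout and proxy. The URL and proxy address below are placeholders, not the poster's actual values; the usual cause is that the cluster has no network route to a corporate-internal endpoint that Postman reaches through the workstation's proxy.

# Minimal sketch - URL and proxy are hypothetical placeholders.
# Postman often succeeds because it routes through the workstation's corporate
# proxy; a Databricks cluster needs network access (or the proxy configured).
import requests

url = "https://internal-api.example.com/resource"      # placeholder endpoint
proxies = {"https": "http://proxy.example.com:8080"}   # hypothetical proxy

try:
    resp = requests.get(url, proxies=proxies, timeout=30)
    resp.raise_for_status()
    print(resp.json())
except requests.exceptions.Timeout:
    print("Timed out - the cluster likely has no network route to this endpoint")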
- 1223 Views
- 7 replies
- 0 kudos
Databricks - Subscription is not Active
Hello Team, My Databricks free trial has been extended till 2030, but somehow my account is showing "Subscription is not Active". Currently, I have one workspace running in my account and it is not letting me create another workspace. I am also unsure...
@SaikiranTamide - I pinged you with additional details.
- 704 Views
- 2 replies
- 0 kudos
Community Edition doesn't support importing .dbc files
Hello Databricks Community Support Team, I am using Databricks Community Edition for exploring Databricks features. It has been observed that importing a .dbc file is no longer supported in Community Edition. The below snippet shows the error message we a...
I was informed that importing workspace files does not work in Community Edition and that this is expected.
- 1264 Views
- 9 replies
- 0 kudos
Upload dbc file from another workspace
Hi team, I have a dbc file from a different workspace. When I try to upload that dbc into another workspace, it says the folder does not exist. Please suggest a solution.
Hi, I found a workaround:
Step 1: If you are using Azure or AWS, create an instance of a Databricks workspace.
Step 2: Once the workspace is ready, import your dbc (Databricks Archive) file.
Step 3: This will show all the files within the dbc.
Step 4: Ex...
- 384 Views
- 1 replies
- 0 kudos
Any quota limit in Auto Loader?
Hi all, I encountered an issue while trying to load an 800 MB CSV file from ADLS using Auto Loader. The notebook crashed during the process. Could anyone assist me with resolving this?
@JissMathew What is the error that you are getting?
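For reference, a minimal Auto Loader read of a CSV folder looks roughly like the sketch below; the ADLS paths and table name are placeholders. An 800 MB CSV is well within normal limits, so the crash is more likely schema inference or driver memory than a quota.

# Minimal Auto Loader sketch - paths and table name are placeholders.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "abfss://container@account.dfs.core.windows.net/_schemas/demo")
    .option("header", "true")
    .load("abfss://container@account.dfs.core.windows.net/landing/demo/")
)

(df.writeStream
   .option("checkpointLocation", "abfss://container@account.dfs.core.windows.net/_checkpoints/demo")
   .trigger(availableNow=True)
   .toTable("demo_bronze"))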
- 547 Views
- 1 replies
- 0 kudos
I suspect that my environment variables
Hello, I've been trying to set up my local MLflow client to work with Databricks Community Edition, but I'm running into issues with authentication. I followed the official setup guide for integrating MLflow with Databricks, but when I try to run any ...
@harold07douglas Can you follow this doc? https://docs.databricks.com/en/mlflow/access-hosted-tracking-server.html
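As a rough sketch of what that doc describes, a local MLflow client is usually pointed at Databricks like the example below; the host, token, and experiment path are placeholders, and Community Edition has additional restrictions on token-based access.

# Sketch only - host, token and experiment path are placeholders.
import os
import mlflow

os.environ["DATABRICKS_HOST"] = "https://<your-workspace-url>"
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"

mlflow.set_tracking_uri("databricks")
mlflow.set_experiment("/Users/you@example.com/demo-experiment")

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.42)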
- 836 Views
- 1 replies
- 1 kudos
Resolved! GCP Databricks 14-day free trial
I am using the GCP Databricks 14-day free trial. What will happen after the 14-day trial is over? What happens to the already created workspaces?
After the 14-day free trial of Databricks on Google Cloud Platform (GCP), your account will transition depending on whether you have provided billing information or not. Here are the key outcomes: Transition to Paid Subscription: If you have provided...
- 907 Views
- 2 replies
- 1 kudos
Resolved! Notebook Paths Errors in Community Edition
I have the following Notebook in Databricks UI:
dbutils.entry_point.getDbutils().notebook().getContext().toJson()
notebook_path = dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get()
print(f"Current notebook path: {not...
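For readability, the truncated snippet above expands to roughly the following; this is a sketch of the same calls, and behavior in Community Edition may differ.

# Sketch of the snippet above: read the notebook context and its path.
context = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
print(context.toJson())                       # full notebook context as JSON

notebook_path = context.notebookPath().get()  # e.g. /Users/<user>/<notebook>
print(f"Current notebook path: {notebook_path}")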
Many thanks. It would be helpful if the error message said 'No access has been granted to this resource' instead of 'Resource not found'.
- 825 Views
- 3 replies
- 2 kudos
Multi Node Cluster
Hi All, Using a free trial subscription, can we create a multi-node cluster? Is it possible in any region?
@JissMathew wrote: "Hi All, Using a free trial subscription, can we create a multi-node cluster? Is it possible in any region?"
Hi there! Unfortunately, with a free trial subscription, you can only create a single-node cluster. The free trial typically pr...
- 478 Views
- 1 replies
- 0 kudos
Request you to provide the help desk number of Databricks
Hi team, Due to my mother's health issues, I didn't attend the Databricks Professional exam which was scheduled on 25th Nov 2024. So I want to attend the exam again. If there is any solution or way for that, please let me know. Request you to provide the help de...
File a ticket at https://help.databricks.com/s/contact-us?ReqType=training and @Cert_TeamOPS will assist you.
- 670 Views
- 1 replies
- 1 kudos
When I try to run any MLflow command, I encounter the following error
Hello, I'm trying to connect my local MLflow client to Databricks Community Edition, but I'm running into issues with authentication. I followed the setup guide for MLflow integration with Databricks, but when I try to run any MLflow command, I encounte...
Can you share the guide you are referring to? Also, please let us know what credentials you are using.
- 412 Views
- 0 replies
- 0 kudos
input() ignores newlines, truncates after 200 characters
Is there any setting to respect newlines and not truncate input? Also, the input text overflows outside the box when the browser is narrow, which is not desirable.
- 819 Views
- 3 replies
- 0 kudos
DBFS Explorer displays outdated data
Hey, I use Databricks 15.4 ML LTS (and before that, Databricks 12 LTS). When clicking on Data Ingestion, Upload files to DBFS -> DBFS and then clicking on the 'mnt' folder, I see outdated content. When I instead use the web terminal of Databricks or the tools d...
The problem persists. Is this the correct channel to report this issue, which we assume to be a bug in Databricks? Where else can one report such an issue?
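As a quick cross-check (a sketch; the mount name is a placeholder), listing the mount from a notebook and comparing it with what the DBFS file browser shows helps confirm whether the staleness is only in the UI.

# Sketch - replace the mount name with the real one.
for f in dbutils.fs.ls("/mnt/your-mount/"):
    print(f.path, f.size)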
- 398 Views
- 1 replies
- 1 kudos
I am on the free trial and the Databricks system schema gives an error for lineage
I was trying to access lineage info for my workspace using admin creds. This is what I got.
I was able to get 'Lineage' schema enabled on my paid version in System tables. So this appears to be a limitation of the free version. I'm guessing that's because there is a cost associated with querying these schemas...
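For completeness, once the 'access' system schema is enabled, lineage can be queried roughly as in the sketch below (the column selection is an assumption based on the documented system table layout).

# Sketch - requires the system.access schema to be enabled on the metastore.
lineage = spark.sql("""
    SELECT source_table_full_name, target_table_full_name, event_time
    FROM system.access.table_lineage
    ORDER BY event_time DESC
    LIMIT 20
""")
lineage.show(truncate=False)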
- 1304 Views
- 4 replies
- 0 kudos
saveAsTable sometimes works, sometimes doesn't
I have the following Spark (Save As Table) example. Sometimes it works fine, sometimes it fails. Code below with the file listed in the "/temp" directory. This has worked fine as it is, but when I have to create a new cluster, as I am using the community edit...
This seems to work, along with explicitly dropping the database and re-running all code within the notebook. Thank you.
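A minimal sketch of that pattern (database, table, and file path are placeholders): drop the table explicitly and overwrite, so re-runs on a freshly created Community Edition cluster stay repeatable.

# Sketch - database, table and path are placeholders.
df = spark.read.option("header", "true").csv("dbfs:/temp/sample.csv")

spark.sql("CREATE DATABASE IF NOT EXISTS demo_db")
spark.sql("DROP TABLE IF EXISTS demo_db.sample_table")

df.write.mode("overwrite").saveAsTable("demo_db.sample_table")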