- 3284 Views
- 3 replies
- 0 kudos
Unable to create folder in dbfs view
When trying to create a folder in the DBFS view, the folder doesn't get created. I am using the Community Edition to practice. Can somebody help? Is this due to the Community Edition, or is it intended behavior?
- 0 kudos
@Lakshay_leo I understand. I also tried it, and I couldn’t create the folder either. To create a new folder, you can specify the desired folder name in the "DBFS Target Directory" field when uploading a file to DBFS. For example, as shown in the image...
- 1217 Views
- 1 replies
- 0 kudos
an issue when trying to connect to my AWS S3 bucket from my local Python environment
Hi everyone, I’m having an issue when trying to connect to my AWS S3 bucket from my local Python environment using the boto3 library:

import boto3
s3 = boto3.client('s3')
response = s3.list_objects_v2(Bucket='my-bucket-name')
print(response)

I keep gettin...
- 0 kudos
Hi @grisma56giga, the error typically indicates that your Python environment does not have SSL support enabled. Can you run the following to validate?

import ssl
print(ssl.OPENSSL_VERSION)
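Expanding on that check, here is a minimal, standard-library-only sketch of what an SSL-enabled Python build should report; the helper name is illustrative, not part of boto3 or the ssl module:

```python
import ssl

def ssl_support_available():
    """Return True if this Python build exposes an OpenSSL version string."""
    # On a build without SSL support, `import ssl` itself raises
    # ImportError, so reaching this point already implies basic support.
    return bool(getattr(ssl, "OPENSSL_VERSION", ""))

# A healthy build prints something like "OpenSSL 3.0.x ...":
print(ssl.OPENSSL_VERSION)
print(ssl_support_available())  # True
```

If `import ssl` fails outright, the interpreter was built without OpenSSL, and boto3's HTTPS calls cannot work until Python is reinstalled with SSL support.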
- 2165 Views
- 1 replies
- 0 kudos
Essential-PySpark-for-Scalable-Data-Analytics "wordcount-sql.ipynb"
I'm working through the code at the following, but getting an error: https://github.com/PacktPublishing/Essential-PySpark-for-Scalable-Data-Analytics/blob/main/Chapter01/wordcount-sql.ipynb

Code:
%sql
DROP TABLE IF EXISTS word_counts;
CREATE TABLE word_...
- 0 kudos
Correction: The error message from the screenshot is from when I tried to add the dbms: prefix to the URL. The error message without that prefix is the following:
UnityCatalogServiceException: [RequestId=dbda5aee-b855-9ed9-abf8-3ee0e0dcc938 ErrorClass=IN...
- 2097 Views
- 1 replies
- 0 kudos
I get an error connecting to AWS S3 with boto3
Hi everyone, I'm trying to connect my local Python environment to an AWS S3 bucket using the boto3 library. However, when I try to run the code to list the files in my S3 bucket, I encounter an authentication error:

import boto3
s3 = boto3.client('s3')
r...
- 0 kudos
Hi @dona168lee, can you share additional details on how your environment is set up and which DBR version you are using? Also, how did you install the boto3 package, and which version of boto3 is installed?
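Since the error is about authentication, a useful first step is checking which credentials boto3 can even see. A rough, stdlib-only sketch: the environment-variable names are the standard AWS SDK ones, but the helper itself is illustrative, not part of boto3:

```python
import os

# boto3 resolves credentials in order: explicit client arguments,
# environment variables, the shared ~/.aws/credentials file, then
# instance/role providers. This sketch only checks the env-var step.
REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def missing_aws_env(env=None):
    """Return the required AWS variables that are absent or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# With an empty environment, both variables are reported missing:
print(missing_aws_env({}))  # ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY']
```

If both variables are missing and there is no `~/.aws/credentials` file, `boto3.client('s3')` will fail to authenticate exactly as described in the question.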
- 2259 Views
- 1 replies
- 0 kudos
queries are running extremely slow
Hello everyone, I’m encountering an issue when querying large Parquet files in Databricks, particularly with files exceeding 1 GB in size. The queries are running extremely slow, and at times, they even time out. I’ve tried optimizing the file size an...
- 0 kudos
Hello @nathan45shafer, thanks for your question. You can refer to https://www.databricks.com/discover/pages/optimize-data-workloads-guide, which covers good practices and actions to optimize your workloads. Please let me know if you have questions.
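One practice that guide covers is file compaction: rewriting data so individual files land in a moderate size range rather than a single multi-GB file. A back-of-the-envelope sketch, assuming a 128 MB target size (a common rule of thumb, not a fixed rule):

```python
import math

def compaction_file_count(total_bytes, target_file_bytes=128 * 1024 ** 2):
    """How many roughly target-sized files a dataset should be split into."""
    return max(1, math.ceil(total_bytes / target_file_bytes))

# A 1 GB Parquet dataset at a 128 MB target comes out to 8 files:
print(compaction_file_count(1024 ** 3))  # 8
```

Smaller, evenly sized files let Spark parallelize reads and skip irrelevant data more effectively than one oversized file.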
- 2194 Views
- 2 replies
- 1 kudos
Access denied for Course
I can't access this course: https://www.databricks.com/training/catalog/advanced-machine-learning-operations-3508. When I register for this course, it displays the error below:
Access denied
You do not have permission to access this page, please contact your admin...
- 1 kudos
Hello, @ash1127! It looks like this post duplicates the one you recently posted. The original post has already been answered. I recommend continuing the discussion there to keep the conversation focused and organized. Let me know if you have any furth...
- 1032 Views
- 1 replies
- 0 kudos
when attempting to load a large 800 MB CSV file
Hello everyone, I’m facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar, or have any suggestions on how to...
- 0 kudos
Hi @violeta482yee, have you checked the resource availability of the compute attached to the notebook? One solution could be to use the cloudFiles.maxBytesPerTrigger option: this option allows you to control the maximum number of bytes processed in each t...
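For reference, a minimal sketch of how that option might be wired up. `cloudFiles.format` and `cloudFiles.maxBytesPerTrigger` are real Auto Loader option names, but the values below are illustrative, not recommendations tuned for this workload:

```python
# Illustrative option map for Auto Loader; in a notebook it would feed
# spark.readStream.format("cloudFiles").options(**autoloader_options).load(path)
autoloader_options = {
    "cloudFiles.format": "csv",
    # Cap the bytes ingested per micro-batch so a single 800 MB file is
    # processed across several triggers instead of in one large batch:
    "cloudFiles.maxBytesPerTrigger": "100m",
}
print(autoloader_options["cloudFiles.maxBytesPerTrigger"])  # 100m
```

Note that `maxBytesPerTrigger` is a soft cap per micro-batch, so the right value depends on the driver and executor memory of the attached compute.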
- 1630 Views
- 1 replies
- 0 kudos
please enter your credentials...
Hi everyone, I'm trying to connect my local Jupyter Notebook environment to Databricks Community Edition, and I’ve followed the setup guide on the Databricks website. I’m attempting to use the mlflow library, but I’m facing an issue where the authenti...
- 0 kudos
What is the process you are following to create the PAT token? In Community Edition, PAT tokens are not allowed; instead, you need to use an OAuth token: https://docs.databricks.com/en/dev-tools/auth/oauth-u2m.html
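As a quick local sanity check before pointing mlflow at Databricks, you can verify that the standard client settings are present. `DATABRICKS_HOST` is the conventional environment-variable name read by the Databricks clients; the helper below is only an illustrative sketch, and the URL is a placeholder:

```python
import os

def databricks_host(env=None):
    """Return the configured workspace host without a trailing slash, or None."""
    env = os.environ if env is None else env
    host = env.get("DATABRICKS_HOST", "")
    return host.rstrip("/") or None

# Placeholder value, not a working workspace URL:
print(databricks_host({"DATABRICKS_HOST": "https://example.cloud.databricks.com/"}))
```

With OAuth U2M, the token itself comes from the browser-based flow described in the linked doc rather than from a PAT, so only the host needs to be configured up front.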
- 2383 Views
- 2 replies
- 1 kudos
I was about to sign up for Community Edition but mistakenly signed up for the regular edition
I was about to sign up for the Community Edition but mistakenly signed up for the regular edition. I want to convert my subscription into the Community Edition; how do I do it? When I try to log in, I get an error that we are not able to find a Community Edition workspac...
- 1 kudos
Same issue here. I couldn't use the Community Edition with any other email either.
- 7188 Views
- 11 replies
- 1 kudos
Resolved! Generate a temporary table credential fails
Hi Team, I am trying to execute the below API, and it is failing.

API:
curl -v -X POST "https://dbc-xxxxxxxx-xxxx.cloud.databricks.com/api/2.0/unity-catalog/temporary-stage-credentials" -H "Authentication: Bearer xxxxxxxxxxxxxxxx" -d '{"table_id":"exte...
- 1 kudos
Your endpoint also seems to be incorrect: per the doc https://docs.databricks.com/api/gcp/workspace/temporarytablecredentials/generatetemporarytablecredentials, it has to be /api/2.0/unity-catalog/temporary-table-credentials, and you are using /api/2.0...
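A tiny sketch that encodes the corrected path from that reply; the workspace URL is a placeholder. Note also that the standard HTTP header for bearer tokens is `Authorization`, whereas the curl command in the question sends `Authentication`, which the server will ignore:

```python
# Builds the corrected Unity Catalog endpoint named in the reply above;
# the workspace URL below is a placeholder, not a real workspace.
ENDPOINT_PATH = "/api/2.0/unity-catalog/temporary-table-credentials"

def temporary_table_credentials_url(workspace_url):
    """Join a workspace base URL with the corrected endpoint path."""
    return workspace_url.rstrip("/") + ENDPOINT_PATH

print(temporary_table_credentials_url("https://dbc-example.cloud.databricks.com"))
```

The request would then be sent with `-H "Authorization: Bearer <token>"` and the same JSON body.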
- 5729 Views
- 10 replies
- 0 kudos
Resolved! API access via python script
Hello, I am trying to access an API via a Python script. I have access using Postman, but if I try to access it via Python in Databricks, it will not work (timeout). Here is my code:

import requests
import socket
url = 'https://prod-api.xsp.cloud.corpintra.net/t...
- 0 kudos
Nevertheless: Thank you very much for helping me!
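For timeouts like the one in this question, a common first diagnostic from the Databricks side is whether the hostname resolves at all: corpintra.net looks like an internal corporate domain, which a cloud workspace typically cannot reach without network peering. A stdlib-only sketch (the function name is illustrative):

```python
import socket

def can_resolve(host):
    """Return True if DNS resolution succeeds for `host`."""
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

# "localhost" resolves everywhere; a reserved .invalid name never does:
print(can_resolve("localhost"))             # True
print(can_resolve("no-such-host.invalid"))  # False
```

If the hostname does not resolve from a Databricks notebook, no amount of `requests` configuration will help; the fix is on the networking side (peering, private link, or a reachable proxy).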
- 3134 Views
- 7 replies
- 0 kudos
Databricks - Subscription is not Active
Hello Team, my Databricks free trial has been extended till 2030, but somehow my account is showing "Subscription is not Active". Currently, I have one workspace running in my account, and it is not letting me create another workspace. I am also unsure...
- 0 kudos
@SaikiranTamide - I pinged you with additional details.
- 1855 Views
- 2 replies
- 0 kudos
Community Edition isn't supporting importing .dbc files
Hello Databricks Community Support Team, I am using Databricks Community Edition to explore Databricks features. It has been observed that importing a .DBC file is no longer supported in the Community Edition. The snippet below shows the error message we a...
- 0 kudos
I was informed that this feature of importing workspace files does not work, and that this is expected for the Community Edition.
- 3965 Views
- 9 replies
- 0 kudos
Upload dbc file from another workspace
Hi team, I have a dbc file from a different workspace. When I try to upload that dbc into another workspace, it says the folder does not exist. Please suggest a solution.
- 0 kudos
Hi, I found a workaround:
Step 1: If you are using Azure or AWS, create an instance of a Databricks workspace.
Step 2: Once the workspace is ready, import your dbc (Databricks Archive) file.
Step 3: This will show all the files within the dbc.
Step 4: Ex...
- 1149 Views
- 1 replies
- 0 kudos
Any quota limit in Auto Loader?
Hi all, I encountered an issue while trying to load an 800 MB CSV file from ADLS using Auto Loader. The notebook crashed during the process. Could anyone assist me with resolving this?
- 0 kudos
@JissMathew, what is the error that you are getting?
- Access Controls (1)
- ADLS Gen2 Using ABFSS (1)
- AML (1)
- Apache spark (1)
- Api Calls (1)
- App (1)
- Autoloader (1)
- AWSDatabricksCluster (1)
- Azure databricks (3)
- Azure Delta Lake (1)
- BI Integrations (1)
- Billing (1)
- Billing and Cost Management (1)
- Cluster (3)
- Cluster Creation (1)
- ClusterCreation (1)
- Community Edition (4)
- Community Edition Account (1)
- Community Edition Login Issues (2)
- community workspace login (1)
- Compute (3)
- Compute Instances (2)
- Continue Community Edition (1)
- databricks (1)
- Databricks Community Edition Account (2)
- Databricks Free Edition (1)
- Databricks Issue (1)
- Databricks Notebooks (1)
- databricks one (1)
- Databricks Support (1)
- databricksapps (1)
- DB Notebook (1)
- DBFS (1)
- Delta Tables (1)
- documentation (1)
- financial data market (1)
- Free Databricks (1)
- Free Edition (1)
- Free trial (1)
- Genie (1)
- Google cloud (1)
- Hubert Dudek (1)
- link for labs (1)
- Login Issue (2)
- mcp (1)
- MlFlow (1)
- ow (1)
- Serverless (1)
- Sign Up Issues (2)
- Software Development (1)
- someone is trying to help you (1)
- Spark (1)
- URGENT (2)
- Web Application (1)