- 366 Views
- 15 replies
- 1 kudos
Unable to use community edition
I am getting this error when I log in to Community Edition: "We were not able to find a Community Edition workspace with this email. Please login to accounts.cloud.databricks.com to find non-community-edition workspaces you may have access to. For help, please see Comm...
@Bhaskar1 - Hello Bhaskar, my email is rahulmarathe9891@gmail.com and my workspace ID is 1082140285761526. Thank you for your help.
- 13 Views
- 0 replies
- 0 kudos
The table path has '=' in the name, and the Delta table in Flink automatically picks this up as a partition field.
I want to disable this behaviour so that an '=' in the path is not treated as a partition field. Is there any config to do this?
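For context, the '=' handling comes from the Hive-style path convention, where any `key=value` directory segment is read back as a partition column. A minimal PySpark sketch (an illustration under that assumption, not the Flink Delta connector itself) showing the convention:

```python
# Illustration only: Hive-style partition discovery.
# Any "key=value" directory segment under a table root is surfaced as a
# partition column on read, which is why '=' in a path is special-cased.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

base = "/tmp/partition_demo"  # hypothetical scratch path

df = spark.createDataFrame([(1, "east"), (2, "west")], ["id", "region"])

# partitionBy produces directories like .../region=east/
df.write.mode("overwrite").partitionBy("region").parquet(base)

# On read, 'region' is inferred back from the path segments.
spark.read.parquet(base).printSchema()
```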
- 62 Views
- 1 reply
- 0 kudos
How to resolve the error below
We were not able to find a Community Edition workspace with this email. Please log in to accounts.cloud.databricks.com to find non-community-edition workspaces you may have access to. For help, see...
Hi @Paulovasconcelo, Please check this out and submit a case if needed: https://community.databricks.com/t5/support-faqs/databricks-community-sso-august-3rd-2024/ta-p/78459
- 60 Views
- 1 reply
- 0 kudos
Please see: https://docs.databricks.com/en/compute/configure.html
- 79 Views
- 0 replies
- 0 kudos
How to check the types of databricks-connect objects
While using `databricks-sdk` in my code, I've found that checking PySpark object types is no longer reliable. I used to do the following: `from pyspark.sql import Column, DataFrame, SparkSession`, then `isinstance(spark, SparkSession)` and `isinstance(a_df...
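For reference, under Databricks Connect the session and DataFrame objects come from the `pyspark.sql.connect` package rather than the classic classes, so `isinstance` checks against `pyspark.sql.DataFrame` can fail. A hedged sketch (assuming a recent pyspark release where the Spark Connect modules are available) that accepts either flavour:

```python
# Sketch: type checks that tolerate both classic PySpark and Spark Connect
# (databricks-connect) objects. Module paths exist in recent pyspark
# releases; adjust if your version differs.
from pyspark.sql import DataFrame as ClassicDataFrame, SparkSession as ClassicSparkSession

try:
    from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
    from pyspark.sql.connect.session import SparkSession as ConnectSparkSession
    DATAFRAME_TYPES = (ClassicDataFrame, ConnectDataFrame)
    SPARKSESSION_TYPES = (ClassicSparkSession, ConnectSparkSession)
except ImportError:  # older pyspark without Spark Connect support
    DATAFRAME_TYPES = (ClassicDataFrame,)
    SPARKSESSION_TYPES = (ClassicSparkSession,)

def is_dataframe(obj) -> bool:
    """True for either a classic or a Spark Connect DataFrame."""
    return isinstance(obj, DATAFRAME_TYPES)

def is_spark_session(obj) -> bool:
    """True for either a classic or a Spark Connect SparkSession."""
    return isinstance(obj, SPARKSESSION_TYPES)
```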
- 8350 Views
- 32 replies
- 25 kudos
community edition : "User is not a member of this workspace"
Something very strange has happened when trying to log in to my Databricks Community Edition account. I'm getting the email with my verification code, but after entering that code, I'm getting the error message: "User is not a member of this workspac...
Hi @Walter_C, I am also facing the same issue when trying to log in to my Databricks Community Edition account. I'm getting the email with my verification code, but after entering that code, I'm getting the error message: "User i...
- 115 Views
- 1 reply
- 0 kudos
How do you import training notebooks when following the data engineer course?
In the Databricks Community Edition, I am trying to follow the course; it pulls variables and other setup through notebooks supplied for training purposes. I can't seem to import any.
Hi @Lakshay_leo, Databricks Community Edition is limited in terms of functionality. If you are following a course on Databricks Academy, it's better to use a Databricks premium account; you can start a free trial on Azure.
- 97 Views
- 1 reply
- 0 kudos
I’m having trouble connecting to my AWS S3 bucket
Hello, I'm having trouble connecting to my AWS S3 bucket from my local Python environment using boto3. When I try to run the following code: `import boto3; s3 = boto3.client('s3'); response = s3.list_objects_v2(Bucket='my-bucket-name'); print(response)` error m...
Can you check your firewall rules in AWS to confirm they are not blocking outgoing requests?
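A minimal sketch for narrowing this down (bucket name and region are placeholders, not from the thread): setting short timeouts makes a blocked outbound connection fail quickly instead of hanging.

```python
# Sketch: fail fast if outbound HTTPS to S3 is blocked by a firewall/proxy.
# Bucket name and region are placeholders.
import boto3
from botocore.config import Config

config = Config(connect_timeout=5, read_timeout=10, retries={"max_attempts": 1})
s3 = boto3.client("s3", region_name="us-east-1", config=config)

try:
    response = s3.list_objects_v2(Bucket="my-bucket-name", MaxKeys=5)
    print([obj["Key"] for obj in response.get("Contents", [])])
except Exception as exc:  # connection and credential errors surface here
    print(f"S3 call failed: {exc}")
```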
- 378 Views
- 3 replies
- 0 kudos
Unable to create folder in dbfs view
When trying to create a folder in the DBFS view, the folder doesn't get created. I am using the Community Edition to practice. Can somebody help? Is this due to the Community Edition, or is it the intended behaviour?
@Lakshay_leo I understand. I also tried it, and I couldn't create the folder either. To create a new folder, you can specify the desired folder name in the "DBFS Target Directory" field when uploading a file to DBFS. For example, as shown in the image...
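As an alternative to the upload dialog, a folder can also be created programmatically from a notebook cell; a short sketch (the folder name is a placeholder):

```python
# Sketch: create a DBFS folder from a notebook cell. `dbutils` and `display`
# are provided automatically in Databricks notebooks; the folder name is a placeholder.
dbutils.fs.mkdirs("dbfs:/FileStore/my_training_data")

# Verify the new folder shows up.
display(dbutils.fs.ls("dbfs:/FileStore/"))
```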
- 264 Views
- 1 reply
- 0 kudos
Getting error while creating external delta table in Databricks
I am getting the below error while creating an external Delta table in Databricks, even though an external location has been created. [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] No parent external location was found for path 'abfss://destination@datalakeprojectsid.dfs.co...
@Siddalinga If the path specified during table creation is outside the scope of the external location, you may encounter the [NO_PARENT_EXTERNAL_LOCATION_FOR_PATH] error. Is the external location correctly defined to scope the directory, such as abfss...
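A hedged sketch of the usual fix (all names, credentials, and paths below are placeholders): make sure the table's LOCATION falls under the URL of an external location that already exists and that you can access.

```python
# Sketch only: placeholder names, credential, and paths.
# The table's LOCATION must sit under the URL of an existing external location.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # in a notebook, `spark` already exists

spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
    URL 'abfss://<container>@<storage-account>.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")

spark.sql("""
    CREATE TABLE my_catalog.my_schema.my_external_table (id INT, name STRING)
    USING DELTA
    LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/tables/my_external_table'
""")
```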
- 183 Views
- 1 reply
- 0 kudos
An issue when trying to connect to my AWS S3 bucket from my local Python environment
Hi everyone, I'm having an issue when trying to connect to my AWS S3 bucket from my local Python environment using the boto3 library. `import boto3; s3 = boto3.client('s3'); response = s3.list_objects_v2(Bucket='my-bucket-name'); print(response)`. I keep gettin...
Hi @grisma56giga, the error typically indicates that your Python environment does not have SSL support enabled. Can you run the following to validate: `import ssl; print(ssl.OPENSSL_VERSION)`
- 245 Views
- 1 reply
- 0 kudos
Essential-PySpark-for-Scalable-Data-Analytics "wordcount-sql.ipynb"
I'm working through the code at the following link, but getting an error: https://github.com/PacktPublishing/Essential-PySpark-for-Scalable-Data-Analytics/blob/main/Chapter01/wordcount-sql.ipynb Code: `%sql DROP TABLE IF EXISTS word_counts; CREATE TABLE word_...
Correction: the error message from the screenshot is from when I tried to add the dbms: prefix to the URL. The error message without that prefix is the following: UnityCatalogServiceException: [RequestId=dbda5aee-b855-9ed9-abf8-3ee0e0dcc938 ErrorClass=IN...
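Not from the thread, but if the failure comes from a path-based CREATE TABLE in a Unity Catalog workspace, one hedged workaround is to read the source file with the DataFrame API and save a managed table instead (the input path and table name below are placeholders):

```python
# Sketch: word count without a path-based CREATE TABLE, which can trip
# Unity Catalog restrictions. Input path and table name are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split, lower, col

spark = SparkSession.builder.getOrCreate()  # in a notebook, `spark` already exists

words = (
    spark.read.text("/databricks-datasets/README.md")        # placeholder input file
    .select(explode(split(lower(col("value")), r"\s+")).alias("word"))
    .where(col("word") != "")
)

word_counts = words.groupBy("word").count().orderBy(col("count").desc())
word_counts.write.mode("overwrite").saveAsTable("word_counts")  # managed table
```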
- 156 Views
- 1 reply
- 0 kudos
Unable to log in to the community.cloud.databricks.com site; getting "User is not a member of this workspace"
Unable to log in to the community.cloud.databricks.com site; getting "User is not a member of this workspace". Email ID: dmpti.prabhakar@gmail.com
@Rishabh-Pandey I tried both ways, but it's still not working.
- 291 Views
- 0 replies
- 0 kudos
Set up my AWS credentials and configured boto3, but I'm facing an issue
Hi everyone, I'm working on a project where I need to upload files from my local machine to an AWS S3 bucket using the boto3 library in Python. I've followed the official documentation to set up my AWS credentials and configure boto3, but I'm facing a...
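A minimal upload sketch (file, bucket, and key names are placeholders), assuming credentials are configured via `aws configure` or environment variables:

```python
# Sketch: upload a local file to S3 with boto3. All names are placeholders;
# credentials are expected from the standard AWS credential chain.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

try:
    s3.upload_file(
        Filename="data/report.csv",    # local file (placeholder)
        Bucket="my-bucket-name",       # target bucket (placeholder)
        Key="uploads/report.csv",      # object key (placeholder)
    )
    print("Upload succeeded")
except ClientError as exc:
    print(f"Upload failed: {exc}")
```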
- 789 Views
- 1 reply
- 0 kudos
I get an error
Hi everyone, I'm trying to connect my local Python environment to an AWS S3 bucket using the boto3 library. However, when I try to run the code to list the files in my S3 bucket, I encounter an authentication error: `import boto3; s3 = boto3.client('s3'); r...
Hi @dona168lee, can you share additional details on how your environment is set up and which DBR version you are using? Also, how did you install the boto3 package, and which version of boto3 is installed?
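A quick identity check can also separate credential problems from bucket permissions; a sketch assuming boto3 is installed locally:

```python
# Sketch: confirm which AWS identity boto3 is actually picking up before
# debugging S3 permissions. Prints the caller's account and ARN.
import boto3

print("boto3 version:", boto3.__version__)

sts = boto3.client("sts")
identity = sts.get_caller_identity()
print("Account:", identity["Account"])
print("ARN:", identity["Arn"])
```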
- Access Controls (1)
- ADLS Gen2 Using ABFSS (1)
- AML (1)
- Api Calls (1)
- Api Requests (1)
- Azure Delta Lake (1)
- Community Edition (1)
- Community Edition Account (1)
- Community Edition Login Issues (2)
- Databricks Community Edition Account (1)
- DB Notebook (1)
- Error (1)
- Help (1)
- MlFlow (1)
- Partner (1)
- paulo (1)
- Syn (1)