DBFS problem
Hi All, when trying to get to my DBFS in Databricks Community Edition I get the following error. I have already disabled and re-enabled the DBFS file browser, but it did not help.
I've encountered a significant issue while using the VSCode extension for Databricks, particularly when working with a cluster configured with a Docker image. Here's a detailed description of the problem. Problem description: when attempting to upload a...
Hi @Retired_mod, thanks for such a quick response. Actually, I am using the Dockerfile from the Databricks runtime example here: https://github.com/databricks/containers/blob/master/ubuntu/minimal/Dockerfile . The configuration with the VSCode extensio...
################################################
New Code as per UC
################################################
def parse_json(df, *cols, clean=True):
    res = df
    for i in cols:
        if clean:
            res = (
                res.wit...
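Since the snippet above is cut off, here is a minimal, self-contained sketch of what a parse_json helper of this shape typically does in PySpark: infer a schema from a sample value and parse each JSON string column into a struct. The schema inference and the clean-up filter are assumptions for illustration, not taken from the original post.

from pyspark.sql import SparkSession, functions as F

def parse_json(df, *cols, clean=True):
    # Parse string columns containing JSON text into struct columns.
    res = df
    for c in cols:
        # Infer a schema from the first non-null value (assumption: all rows share it).
        sample = res.select(c).filter(F.col(c).isNotNull()).first()[0]
        schema = F.schema_of_json(F.lit(sample))
        res = res.withColumn(c, F.from_json(F.col(c), schema))
        if clean:
            # Drop rows where parsing failed; from_json returns null on bad JSON.
            res = res.filter(F.col(c).isNotNull())
    return res

# Example usage:
# spark = SparkSession.builder.getOrCreate()
# df = spark.createDataFrame([('{"a": 1, "b": "x"}',)], ["payload"])
# parse_json(df, "payload").printSchema()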
Hi @sambgp, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback...
If we want to read all the data from the Databricks tables at a single time, how can we do it?
Hi @Krishna2110, here it is, it should work now:

tables = spark.sql("SHOW TABLES IN ewt_edp_prod.crm_raw").collect()
for row in tables:
    table_name = f"ewt_edp_prod.{row[0]}.{row[1]}"
    try:
        df = spark.table(table_name)
        count = df...
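Because the reply above is truncated, here is a minimal sketch of how such a loop is usually completed, assuming the goal is simply to count rows per table and skip tables that cannot be read; the print statements and exception handling are assumptions, not part of the original reply.

# spark is the SparkSession pre-defined in a Databricks notebook.
tables = spark.sql("SHOW TABLES IN ewt_edp_prod.crm_raw").collect()
for row in tables:
    # SHOW TABLES returns (database, tableName, isTemporary)
    table_name = f"ewt_edp_prod.{row[0]}.{row[1]}"
    try:
        df = spark.table(table_name)
        print(f"{table_name}: {df.count()} rows")
    except Exception as e:
        # Skip tables the current user cannot read (permissions, dropped tables, ...)
        print(f"Skipping {table_name}: {e}")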
Hi, I am using Databricks SQL and I am converting an integer field which is in the format ('20240719' or 'yyyyMMdd'). I am able to convert it to a date type using to_date(forecastedHorizonPeriodDate,'yyyyMMdd'). I then tried to change the format of this dat...
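For context, a minimal sketch of the usual pattern: to_date parses the yyyyMMdd integer into a DATE, and date_format turns that date back into a string in the desired display layout (a date value itself has no display format). The table name and the dd-MM-yyyy output format below are assumptions for illustration.

# spark is the SparkSession pre-defined in a Databricks notebook.
df = spark.sql("""
    SELECT
        to_date(CAST(forecastedHorizonPeriodDate AS STRING), 'yyyyMMdd') AS horizon_date,
        date_format(
            to_date(CAST(forecastedHorizonPeriodDate AS STRING), 'yyyyMMdd'),
            'dd-MM-yyyy') AS horizon_date_str
    FROM forecast_table
""")
df.show()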
Hi, I am getting the error message below: "We were not able to find a Community Edition workspace with this email. Please login to non-community-edition workspaces you may have access to."
Hi @virendra1212, take a look at the thread below. Maybe some of the suggestions will help in your case: Community Edition Login Issues Below is a list of ... - Page 4 - Databricks Community - 26926. You signed up before May 14, 2022 - please confirm that th...
Hi, are there any cost implications for automatic statistics collection? Or does Databricks provide it as a feature at no cost on my cluster?
What if I have a lot of empty shuffle partitions due to data skew? Secondly, what if the shuffle partition size is 128 MB and the size of a key's partition is 700 MB?
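A common answer to both situations is Spark's adaptive query execution, which coalesces small or empty shuffle partitions and splits oversized skewed ones at runtime. The sketch below shows the relevant settings; the 128 MB advisory size mirrors the number in the question, and the exact values are assumptions to be tuned per workload.

# spark is the SparkSession pre-defined in a Databricks notebook.
# Enable adaptive query execution so empty/small shuffle partitions are coalesced.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")
# Let AQE split a skewed partition (e.g. a 700 MB key) into smaller chunks at join time.
spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")
spark.conf.set("spark.sql.adaptive.advisoryPartitionSizeInBytes", "128MB")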
Is there any way to visualize logging such as execution runtime or memory usage of a Spark job graphically, just like the image below, by utilizing Databricks Partner Connect for free? Also, I'd appreciate knowing any other ways to visualize the logging o...
Hi. I am struggling to add a description by script to the comment column within the catalog view in Databricks, particularly for foreign/external tables sourced from Azure SQL. I have no issue doing that for Delta tables. Also for information schema columns f...
@Retired_mod Sorry, I did read this way back but forgot to reply. Thanks for the information. I found all those steps as well, and for "3. Programmatic Approach (For Azure SQL)", if you are adding any description to SQL that way, I don't think it will app...
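For reference, this is the scripted pattern that works for Delta tables as mentioned above; whether these comments surface for foreign tables backed by Azure SQL is exactly the open question in this thread. The catalog, schema, table, and column names are placeholders.

# spark is the SparkSession pre-defined in a Databricks notebook.
# Set a table-level comment (appears as the description in Catalog Explorer).
spark.sql("COMMENT ON TABLE my_catalog.my_schema.my_table IS 'Customer master data'")
# Set a column-level comment.
spark.sql(
    "ALTER TABLE my_catalog.my_schema.my_table "
    "ALTER COLUMN customer_id COMMENT 'Primary key from the source system'"
)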
I am trying to create an online table in Unity Catalog. However, I get a GET 403 error: DataPlaneException: Failed to start the DLT service on cluster . Please check the stack trace below or driver logs for more details. com.databricks.pipelines....
I figured it out. It was because of the Network Connectivity Configurations. I did not have one set up with a private endpoint connection to the ADLS Gen2. I followed the instructions here: https://learn.microsoft.com/en-us/azure/databricks/security/...
Hi, I have recently started the Splunk integration with Databricks. Basically, I am trying to ingest data from Splunk into Databricks. I have gone through the documentation regarding the Splunk integration. There is some basic information about the integrat...
Hi @Arch_dbxlearner, did you get the integration with Splunk done? If yes, can you please help?
Hi, I'm using CE and trying to upload a JAR library of about 45 MB into my workspace so I can use it from PySpark, but I am getting the error "No API found for 'POST /workspace-files". Any thoughts?
Hello, I am a newbie on this platform. Can anyone please tell me how I can enroll in the courses that we are supposed to complete to get a voucher for the exams? I came to know about the Databricks Learning Festival.
Hi @Nikhilkamode, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your fe...
Float data from a file is never getting copied to the temp_table created, though the schema matches. As a workaround, I am using CREATE TABLE [USING], which is able to insert the file data into a temp_table. Is this a known issue? The COPY INTO issue is explained w...
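For context, here is a minimal sketch of the COPY INTO pattern being described, assuming a CSV source and a FLOAT column; the table name, path, and options are placeholders, not taken from the post.

# spark is the SparkSession pre-defined in a Databricks notebook.
# Target table with a FLOAT column (placeholder names).
spark.sql("CREATE TABLE IF NOT EXISTS temp_table (id INT, price FLOAT)")
# Load the file data; the path and FORMAT_OPTIONS values are illustrative only.
spark.sql("""
    COPY INTO temp_table
    FROM '/tmp/source_data/'
    FILEFORMAT = CSV
    FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
    COPY_OPTIONS ('mergeSchema' = 'false')
""")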
Hi @inagar, Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback...