Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Olfa_Kamli
by New Contributor II
  • 470 Views
  • 1 reply
  • 0 kudos

Log delivery is not creating data in S3 bucket

Hi, does anyone have an idea about the typical duration for Databricks to create logs in an S3 bucket using the databricks_mws_log_delivery Terraform resource? I've implemented the code provided in the Databricks official documentation, but I've be...

Latest Reply
Olfa_Kamli
New Contributor II
  • 0 kudos

The issue has been resolved. There was no problem with the code or the API. However, it took over 12 hours for logs to start appearing in my bucket, despite the Databricks documentation indicating that logs should appear within one hour. Thank you!
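
Given that delivery can lag by hours, one practical check is to poll the bucket directly rather than waiting on the console. Below is a minimal sketch, assuming boto3 credentials with read access to the bucket; the bucket name and delivery prefix are hypothetical placeholders.

```python
# A minimal sketch: poll the S3 bucket for delivered log objects.
# Bucket name and prefix are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="my-log-bucket", Prefix="billable-usage/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["LastModified"])
```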

TheIceBrick
by New Contributor III
  • 3039 Views
  • 3 replies
  • 1 kudos

Is there a request size limit for the Databricks REST API SQL statements?

When inserting rows through the SQL API (/api/2.0/sql/statements/), once more than a certain number of records (about 25 records with 8 small columns) are included in the statement, the call fails with the error: "The request could not be processed by...

Community Discussions
REST API
SQL Statements
Latest Reply
ChrisCkx
New Contributor II
  • 1 kudos

@TheIceBrick did you find out anything else about this? I am experiencing exactly the same: I can insert up to 35 rows but it breaks at about 50 rows. The payload size is 42 KB, and I am passing parameters for each row. @Debayan This is nowhere near the 16MiB /...
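
One workaround while the exact limit is unclear is to batch the inserts into small chunks, keeping each request well below the size that fails. The sketch below is a rough illustration, not the documented API contract: the host, token, warehouse ID, and table name are hypothetical, and the naive literal escaping should be replaced with the API's parameters field for anything user-supplied.

```python
# A minimal sketch: send INSERTs through the Statement Execution API in
# small chunks so each request stays far below the size that errors out.
# Host, token, warehouse ID, and table name are hypothetical placeholders.
import requests

HOST = "https://<workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
WAREHOUSE_ID = "<warehouse-id>"

def sql_literal(value):
    # Naive literal formatting for this sketch only; prefer the API's
    # "parameters" field for untrusted input.
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    return "'" + str(value).replace("'", "''") + "'"

def insert_in_chunks(rows, chunk_size=20):
    for i in range(0, len(rows), chunk_size):
        chunk = rows[i:i + chunk_size]
        values = ", ".join(
            "(" + ", ".join(sql_literal(v) for v in row) + ")" for row in chunk
        )
        resp = requests.post(
            f"{HOST}/api/2.0/sql/statements/",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={
                "warehouse_id": WAREHOUSE_ID,
                "statement": f"INSERT INTO my_schema.my_table VALUES {values}",
                "wait_timeout": "30s",
            },
        )
        resp.raise_for_status()
```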

2 More Replies
Ruby8376
by Valued Contributor
  • 1282 Views
  • 7 replies
  • 1 kudos

Expose Delta table data to Salesforce - OData?

Hi, looking for suggestions to stream on-demand data from Databricks Delta tables to Salesforce. Is OData a good option?

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

I see. Is there a possibility in SF to define an external location/datasource? Just guessing here, as these types of packages are really good at isolating data, not integrating it.

6 More Replies
jenshumrich
by New Contributor III
  • 519 Views
  • 2 replies
  • 0 kudos

Long-running jobs get lost

Hello, I tried to schedule a long-running job, and surprisingly it seems to neither terminate (and thus does not let the cluster shut down) nor continue running, even though the state is still "Running". But the truth is that the job has miserably ...

Latest Reply
Lakshay
Esteemed Contributor
  • 0 kudos

Have you looked at the SQL plan to see what Spark job 72 was doing?

1 More Replies
chari
by Contributor
  • 466 Views
  • 3 replies
  • 0 kudos

Reading CSV file with Spark throws [insufficient privilege] error

Hello Community, I have some CSV files saved in the Databricks workspace and want to read them with Spark. I make use of the command df = spark.read.format('csv').load(r'filepath'). However, it throws the error: org.apache.spark.SparkSecurityException: [INSU...

Latest Reply
Lakshay
Esteemed Contributor
  • 0 kudos

If this is a UC-enabled workspace, you need to grant the right access.
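
For example, if the files live in a Unity Catalog volume, the read works once the volume privileges are granted. A minimal sketch, where the catalog, schema, volume, file, and user names are all hypothetical:

```python
# A minimal sketch, assuming a UC-enabled workspace; all names below are
# hypothetical placeholders.
spark.sql("""
    GRANT READ VOLUME ON VOLUME my_catalog.my_schema.my_volume
    TO `user@example.com`
""")
df = (spark.read.format("csv")
      .option("header", "true")
      .load("/Volumes/my_catalog/my_schema/my_volume/file.csv"))
```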

2 More Replies
thilanka02
by New Contributor II
  • 727 Views
  • 2 replies
  • 1 kudos

Resolved! Spark read CSV does not throw Exception if the file path is not available in Databricks 14.3

We were using this method and this was working as expected in Databricks 13.3.  def read_file(): try: df_temp_dlr_kpi = spark.read.load(raw_path,format="csv", schema=kpi_schema) return df_temp_dlr_kpi except Exce...

Latest Reply
thilanka02
New Contributor II
  • 1 kudos

Thank you @daniel_sahal for the reply.
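
Since spark.read can be lazy in newer runtimes and may not raise until an action runs, one hedge is to verify the path up front. A minimal sketch reusing the raw_path and kpi_schema names from the original post:

```python
# A minimal sketch: fail fast on a missing path instead of relying on
# spark.read to throw at read time. raw_path and kpi_schema are the
# names from the original post.
def read_file():
    try:
        dbutils.fs.ls(raw_path)  # raises if the path does not exist
    except Exception as e:
        raise FileNotFoundError(f"Path not found: {raw_path}") from e
    return spark.read.load(raw_path, format="csv", schema=kpi_schema)
```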

1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 1565 Views
  • 3 replies
  • 2 kudos

Resolved! Update regarding Community Reward Store

Hi Team, is there any update on the Community Reward Store? It has been discontinued from the old portal, and we still can't see the new portal for it. Is there an expected date when this will be available for community members?

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 2 kudos

Thanks for the update.

2 More Replies
anonymous_567
by New Contributor II
  • 580 Views
  • 3 replies
  • 0 kudos

Autoloader update table when new changes are made

Hello, every day a new file of the same name gets sent to my storage account with old and new data appended at the end. Columns may also be added during one of these file updates. This file does a complete overwrite of the previous file. Is it possibl...

Latest Reply
data-grassroots
New Contributor III
  • 0 kudos

This may be helpful - the bit on allow overwrites: https://docs.databricks.com/en/ingestion/auto-loader/faq.html
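
To make that concrete, here is a minimal Auto Loader sketch that reprocesses overwritten files and picks up newly added columns, per the behavior described in the FAQ above; the paths and table name are hypothetical placeholders.

```python
# A minimal sketch: Auto Loader with overwrite reprocessing and schema
# evolution. Paths and table name are hypothetical placeholders.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "csv")
      .option("cloudFiles.allowOverwrites", "true")             # re-ingest the rewritten file
      .option("cloudFiles.schemaLocation", "/tmp/_schema")
      .option("cloudFiles.schemaEvolutionMode", "addNewColumns")  # tolerate added columns
      .load("/path/to/landing/"))

(df.writeStream
   .option("checkpointLocation", "/tmp/_checkpoint")
   .trigger(availableNow=True)
   .toTable("my_catalog.my_schema.my_table"))
```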

2 More Replies
Alexandru
by New Contributor III
  • 797 Views
  • 3 replies
  • 0 kudos

Resolved! VS Code Python project for development

Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Also, linting is enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hi Alexandru, take a look at the VS Code extension for Databricks: https://marketplace.visualstudio.com/items?itemName=databricks.databricks

2 More Replies
Archana_Mathan
by New Contributor
  • 312 Views
  • 1 reply
  • 1 kudos

Maintaining Order Consistency: Table Creation in Databricks SQL vs. DLT Pipeline

I have a CTE table with the below names as values. My objective is to create another table by concatenating all the rows from the CTE table in ascending order, resulting in the final output sequence: "Abi, Rahul, ram, Siva". When executing the query ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

When writing, order is not guaranteed due to the nature of distributed processing. If you want the order to be guaranteed, you should order it when reading the data. Your query does not write any data; DLT does. That is the difference.
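
One way to make the result deterministic regardless of where it is written is to sort inside the aggregation itself rather than relying on row order at write time. A minimal sketch, with a hypothetical table and column name; note that sort_array is case-sensitive, so "ram" would sort after "Siva" unless you normalize case first:

```python
# A minimal sketch: sort inside the aggregation so the concatenated result
# does not depend on task ordering. Table and column names are hypothetical;
# sort_array is case-sensitive, so normalize case if needed.
result = spark.sql("""
    SELECT array_join(sort_array(collect_list(name)), ', ') AS names
    FROM my_names
""")
result.show(truncate=False)
```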

Hogan
by New Contributor II
  • 288 Views
  • 1 reply
  • 0 kudos

Can browse external storage, but cannot create a table from there - VNet, ADLS Gen2

Hi there! Hope somebody here can help me. We have created a new Databricks account on Azure with the ARM template for VNet injection. We have all the subnets etc., Unity Catalog active, and the connector for Databricks. I want now to create my first tab...

Latest Reply
Hogan
New Contributor II
  • 0 kudos

Hi, to solve this problem, the following Microsoft documentation can be used to configure the NCC to enable the connection between the private Azure storage and the serverless resources: https://learn.microsoft.com/en-us/azure/databricks/security/netwo...

sai_sathya
by New Contributor III
  • 795 Views
  • 6 replies
  • 1 kudos

DataFrame-to-CSV write has issues due to multiple commas inside a row value

Hi all, I am converting data containing JSON fields with embedded commas into CSV format. I am facing challenges due to the commas within the JSON being misinterpreted as column delimiters during the conversion process. I tried several methods to modify...

Latest Reply
artsheiko
Valued Contributor III
  • 1 kudos

Hi Sai, I assume that the problem comes not from PySpark, but from Excel. I tried to reproduce the error and didn't find a way - that's a good thing, right? Please try the following: df.write.format("csv").save("/Volumes/<my_catalog_name>/<m...
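
Expanding on that suggestion, since the full snippet is truncated: the usual fix is to quote every field so commas inside JSON values stay within a single column. A minimal sketch of the standard CSV writer options; the output path is hypothetical:

```python
# A minimal sketch: quote every field so commas inside JSON values are not
# treated as column delimiters. Output path is a hypothetical placeholder.
(df.write.format("csv")
   .option("header", "true")
   .option("quote", '"')        # wrap fields in double quotes
   .option("escape", '"')       # escape embedded quotes by doubling them
   .option("quoteAll", "true")  # quote even fields without special characters
   .save("/Volumes/my_catalog/my_schema/my_volume/output"))
```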

5 More Replies
Nithya_r
by New Contributor II
  • 319 Views
  • 1 reply
  • 0 kudos

Access Delta Sharing from Azure Data Factory

I recently got access to Delta Sharing and I am looking to access the data from the tables in the share through ADF. I used linked services such as REST API and HTTP and successfully established a connection using the credential file token and HTTP path, h...

Latest Reply
artsheiko
Valued Contributor III
  • 0 kudos

Hey, I think you'll need to use a Databricks activity instead of Copy. See: https://learn.microsoft.com/en-us/azure/data-factory/connector-overview#integrate-with-more-data-stores and https://learn.microsoft.com/en-us/azure/data-factory/transform-data-dat...
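
As a side note, the share itself can be sanity-checked outside ADF with the open-source delta-sharing Python connector. A minimal sketch, where the profile path and table coordinates are hypothetical:

```python
# A minimal sketch: read a shared table with the delta-sharing connector
# (pip install delta-sharing). Profile path and table name are hypothetical.
import delta_sharing

profile = "/path/to/config.share"  # the credential file from the provider
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())

df = delta_sharing.load_as_pandas(f"{profile}#my_share.my_schema.my_table")
```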

databird
by New Contributor II
  • 1103 Views
  • 4 replies
  • 1 kudos

Redefine ETL strategy with a PySpark approach

Hey everyone! I have some previous experience with data engineering, but I'm totally new to Databricks and Delta tables. Starting this thread hoping to ask some questions and ask for help on how to design a process. So I have essentially 2 Delta tables (sa...

Latest Reply
artsheiko
Valued Contributor III
  • 1 kudos

Hi @databird, you can review the code of each demo by opening the content via "View the Notebooks" or by exploring the following repo: https://github.com/databricks-demos (you can try to search for "merge" to see all the occurrences, for example). T...
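
For the two-table design described in the question, the usual building block is a Delta MERGE upsert from a staging table into the target. A minimal sketch, with hypothetical table names and join key:

```python
# A minimal sketch of a Delta MERGE upsert; table names and the join key
# are hypothetical placeholders.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "my_catalog.my_schema.target")
updates = spark.table("my_catalog.my_schema.staging")

(target.alias("t")
 .merge(updates.alias("s"), "t.id = s.id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```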

3 More Replies
vinay076
by New Contributor III
  • 743 Views
  • 2 replies
  • 0 kudos

There is no certification number in my Databricks certificate that I received after passing the

I enrolled for the Databricks Data Engineer certification recently, gave a shot at the exam, and I did clear it successfully. I have received the certificate in the form of a PDF file along with a URL in which I can see my certificate and ba...

Latest Reply
Cert-Team
Honored Contributor III
  • 0 kudos

Hi @vinay076, thanks for asking! Our support team can provide you with a credential ID. Please file a ticket with our support team, give them the email associated with your certification, and they can get you the credential ID.

1 More Replies