Data Engineering

Forum Posts

Phani1
by Valued Contributor
  • 1629 Views
  • 2 replies
  • 1 kudos

Resolved! Convert EBCDIC (Binary) file format to ASCII

Hi Team, how can we convert an EBCDIC (binary) format file to ASCII in Databricks? Do we have any libraries in Databricks for this?

Latest Reply
Vartika
Moderator
  • 1 kudos

Hi @Janga Reddy, hope all is well! Just wanted to check in to see whether you were able to resolve your issue, and whether you would be happy to share the solution or mark an answer as best. Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...

  • 1 kudos
1 More Replies
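
For reference, a minimal sketch of one way to approach this in plain Python, assuming a fixed-width EBCDIC file containing only character data (no packed-decimal/COMP-3 fields); the file paths, record length, and the cp037 code page are placeholders and depend on the source system. Mainframe files with COBOL copybooks usually need a dedicated library such as Cobrix instead.

    # Minimal sketch: decode a fixed-width EBCDIC file to UTF-8 text.
    # Assumes plain character data; cp037 is one common EBCDIC code page,
    # but the right one depends on the mainframe that produced the file.
    RECORD_LENGTH = 100  # placeholder record length in bytes

    with open("/dbfs/mnt/raw/input.ebcdic", "rb") as src, \
         open("/dbfs/mnt/raw/output_ascii.txt", "w", encoding="utf-8") as dst:
        while True:
            record = src.read(RECORD_LENGTH)
            if not record:
                break
            dst.write(record.decode("cp037") + "\n")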
PK225
by New Contributor III
  • 673 Views
  • 2 replies
  • 1 kudos
Latest Reply
Vartika
Moderator
  • 1 kudos

Hi @Pavan Kumar, hope you are well. Just wanted to see if you were able to find an answer to your question, and whether you would like to mark an answer as best. It would be really helpful for the other members too. Cheers!

  • 1 kudos
1 More Replies
James1100
by New Contributor II
  • 725 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks connect to GCS

Hi, I would like to ask if anyone knows how to connect to GCS - basically, read a CSV file from a GCS bucket. I have no issue connecting to Data Lake. Thank you so much in advance.

Latest Reply
Vartika
Moderator
  • 2 kudos

Hi @James C, just checking in. If @Kaniz Fatma's answer helped, would you let us know and mark the answer as best? If not, would you be happy to give us more information? We'd love to hear from you. Cheers!

  • 2 kudos
1 More Replies
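
For reference, a minimal sketch of reading a CSV from GCS, assuming the cluster is already configured with a GCS service-account key (the fs.gs.auth.service.account.* Spark settings); the bucket and object names below are placeholders.

    # Minimal sketch: read a CSV file from a GCS bucket over the gs:// scheme.
    # Assumes GCS credentials are already configured on the cluster.
    df = (spark.read
          .format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .load("gs://my-example-bucket/landing/sample.csv"))

    display(df)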
valskyyy
by New Contributor II
  • 2232 Views
  • 5 replies
  • 5 kudos

Resolved! Command skipped but no error message

Hi all! This is my first post here. I have a problem when I launch a "run all" on my notebook: at a certain point (always on the same cell), all the following cells are skipped. As you can see, command 38 is correctly executed, and in command 40 I ...

Latest Reply
Vartika
Moderator
  • 5 kudos

Hi @valskyyy valentin.lewandowski.partner, hope all is well! Just wanted to check in to see whether you were able to resolve your issue, and whether you would be happy to share the solution or mark an answer as best. Otherwise, please let us know if you need more help. We'd lo...

  • 5 kudos
4 More Replies
sanjay
by Valued Contributor II
  • 1927 Views
  • 3 replies
  • 2 kudos

Resolved! Autoloader maxFilesPerTrigger not working correctly

Hi, I am trying to apply a batch size in Auto Loader with the code below, but it is picking up all the changes in one go even though I have set maxFilesPerTrigger to 10. Appreciate any help. (spark.readStream.format("json").schema(streamSchema).option("cloudFiles.b...

Latest Reply
Lakshay
Esteemed Contributor
  • 2 kudos

Hi @Sanjay Jain, since you have set the trigger to once, maxFilesPerTrigger will not take effect here. With trigger once, all the files are read together. You need to change the trigger for this option to come into effect. Please refer ...

  • 2 kudos
2 More Replies
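
For reference, a minimal sketch of the change suggested above: with a processing-time (or availableNow) trigger, cloudFiles.maxFilesPerTrigger caps each micro-batch, whereas .trigger(once=True) reads everything in a single batch. The schema, paths, and target table below are placeholders.

    # Minimal sketch: Auto Loader with a per-batch file cap and a non-once trigger.
    from pyspark.sql.types import StructType, StructField, StringType

    streamSchema = StructType([          # placeholder schema
        StructField("id", StringType()),
        StructField("payload", StringType()),
    ])

    (spark.readStream
         .format("cloudFiles")
         .option("cloudFiles.format", "json")
         .option("cloudFiles.maxFilesPerTrigger", 10)
         .schema(streamSchema)
         .load("/mnt/raw/events")
         .writeStream
         .option("checkpointLocation", "/mnt/checkpoints/events")
         .trigger(processingTime="1 minute")   # not .trigger(once=True)
         .toTable("bronze_events"))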
_deepak_
by New Contributor II
  • 975 Views
  • 3 replies
  • 0 kudos

Databricks regression test suite

Hi, I am new to Databricks and am setting up the non-prod environment. I wanted to know: is there any way I can run a regression suite so that the existing setup does not break when a feature is added, and also how can I make available ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@deepak prasad: Yes, you can run regression tests to ensure that your changes do not break existing functionality. Databricks supports a number of testing frameworks, such as PyTest, which can be used to automate regression testing. You can write test c...

  • 0 kudos
2 More Replies
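
For reference, a minimal PyTest sketch along the lines of the reply above; the transformation function, column names, and expected values are hypothetical.

    # Minimal sketch: a regression test for a simple DataFrame transformation.
    import pytest
    from pyspark.sql import SparkSession, functions as F

    @pytest.fixture(scope="session")
    def spark():
        # Local SparkSession so the suite can also run outside Databricks (e.g. in CI)
        return SparkSession.builder.master("local[2]").appName("regression-tests").getOrCreate()

    def add_full_name(df):
        # Hypothetical transformation under test
        return df.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))

    def test_add_full_name(spark):
        df = spark.createDataFrame([("Ada", "Lovelace")], ["first_name", "last_name"])
        result = add_full_name(df).collect()[0]
        assert result["full_name"] == "Ada Lovelace"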
santhosh1
by New Contributor II
  • 1078 Views
  • 4 replies
  • 3 kudos

Can we share an exam voucher with another Databricks account?

Hi, I received a free voucher for the lakehouse webinar, and my friend also got a free voucher. By any chance, can I use my friend's voucher to schedule another exam for me?

Latest Reply
SUMI1
New Contributor III
  • 3 kudos

Hi guys, unfortunately it is not possible to share an exam voucher with another Databricks account. Exam vouchers are typically tied to specific accounts or individuals and cannot be transferred or shared.

  • 3 kudos
3 More Replies
tototox
by New Contributor III
  • 4566 Views
  • 3 replies
  • 2 kudos

how to check table size by partition?

I want to check the size of a Delta table by partition. As you can see, only the size of the whole table can be checked, not the size per partition.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@jin park: You can use the Databricks Delta Lake SHOW TABLE EXTENDED command to get the size of each partition of the table. Here's an example: %sql SHOW TABLE EXTENDED LIKE '<table_name>' PARTITION (<partition_column> = '<partition_value>') SELECT...

  • 2 kudos
2 More Replies
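
For reference, a minimal sketch of another way to estimate size per partition, assuming a runtime recent enough to expose the _metadata file-metadata column; the table and partition column names are placeholders. Note this only counts files in the current table snapshot.

    # Minimal sketch: sum data-file sizes per partition of a Delta table using the
    # _metadata column (deduplicated on file_path so each data file counts once).
    from pyspark.sql import functions as F

    sizes = (spark.read.table("my_schema.my_delta_table")
             .select("partition_col",
                     F.col("_metadata.file_path").alias("file_path"),
                     F.col("_metadata.file_size").alias("file_size"))
             .dropDuplicates(["file_path"])
             .groupBy("partition_col")
             .agg((F.sum("file_size") / 1024 / 1024).alias("size_mib")))

    display(sizes)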
Yash_542965
by New Contributor II
  • 1950 Views
  • 2 replies
  • 3 kudos

Resolved! Access Excel file in delta live pipeline

I'm having an issue accessing an Excel file through a DLT pipeline. The file is in ADLS, and I'm using pandas to read it. It seems pandas cannot understand the abfss protocol. Is there any way to read Excel with pandas in a DLT pipeline? I'm getting thi...

Latest Reply
Yash_542965
New Contributor II
  • 3 kudos

Thanks for the info. It works; I just needed to install an additional library using "%pip install openpyxl".

  • 3 kudos
1 More Replies
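
For reference, a minimal sketch of the approach resolved in this thread, assuming openpyxl has been installed with %pip in the pipeline notebook and the workbook is reachable through a FUSE-style path (pandas cannot resolve abfss:// URIs directly); the path and table name are placeholders.

    # Minimal sketch: read an Excel workbook with pandas inside a DLT pipeline
    # and expose it as a table. Requires `%pip install openpyxl` in the notebook.
    import dlt
    import pandas as pd

    @dlt.table(name="excel_bronze", comment="Rows loaded from an Excel workbook")
    def excel_bronze():
        pdf = pd.read_excel("/dbfs/mnt/raw/sample.xlsx", engine="openpyxl")
        return spark.createDataFrame(pdf)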
Inna_M
by New Contributor III
  • 927 Views
  • 1 replies
  • 1 kudos

Resolved! Is there any maintenance (patches, upgrades) from Databricks for the VMs created by Databricks on Azure?

We are using Databricks on Azure. Our infra team noticed we have some VMs created in the past for Databricks clusters running Linux (Ubuntu 18.04). Is any maintenance or upgrade planned for those? Are there any patches for the Azure objects created by...

Latest Reply
Inna_M
New Contributor III
  • 1 kudos

Finally, while I was posting this question, Azure Databricks upgraded the VMs to the supported version (Ubuntu 20), though not the latest (22). This was a week after the old version stopped being supported by Microsoft.

  • 1 kudos
CoopCoop
by New Contributor III
  • 2259 Views
  • 6 replies
  • 7 kudos

Resolved! PDF Attachment on an Alert

Currently my alert is an HTML table backed by data from a SQL query. I was wondering if it is possible to attach the resulting table from this SQL query as a PDF to the alert email. If anyone has successfully implemented this, please let me know! T...

Latest Reply
Atanu
Esteemed Contributor
  • 7 kudos

OK, understood the concern; so basically the issue is with PDF rendering, as far as I understand. Let me know if I am wrong. Let me see if there is any improvement from our engineering team on this front.

  • 7 kudos
5 More Replies
Louis_Databrick
by New Contributor II
  • 588 Views
  • 2 replies
  • 0 kudos

Registering a dataframe coming from a CDC data stream removes the CDC columns from the resulting temporary view, even when explicitly adding a copy of the column to the dataframe.

df_source_records.filter(F.col("_change_type").isin("delete", "insert", "update_postimage")) .withColumn("ROW_NUMBER", F.row_number().over(window)) .filter("ROW_NUMBE...

Latest Reply
Louis_Databrick
New Contributor II
  • 0 kudos

Seems to work now, actually. No idea what changed, as I tried multiple times in exactly this way and it did not work. from pyspark.sql.functions import expr from pyspark.sql.utils import AnalysisException import pyspark.sql.functions as f data = [(...

  • 0 kudos
1 More Replies
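
For reference, a minimal sketch of the pattern this thread is about: keep only the latest change per key from the change data feed and register the result as a temp view, with an explicit copy of _change_type. The table, key column, and version bounds are placeholders.

    # Minimal sketch: dedupe CDC rows per key and register them as a temp view.
    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    df_source_records = (spark.read.format("delta")
                         .option("readChangeFeed", "true")
                         .option("startingVersion", 1)      # placeholder version
                         .table("bronze.customers"))        # placeholder table

    window = Window.partitionBy("customer_id").orderBy(F.col("_commit_version").desc())

    latest_changes = (df_source_records
        .filter(F.col("_change_type").isin("delete", "insert", "update_postimage"))
        .withColumn("ROW_NUMBER", F.row_number().over(window))
        .filter("ROW_NUMBER = 1")
        .withColumn("change_type_copy", F.col("_change_type")))

    latest_changes.createOrReplaceTempView("latest_customer_changes")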
StuartKindness_
by New Contributor II
  • 853 Views
  • 4 replies
  • 2 kudos

How to replace the SSO certificate on our workspace?

We have Azure AD SSO set up on our workspace, but the three-year certificate is due to expire on Monday. I have logged into the Admin Console & Single Sign-on tab. All the options are greyed out and there are no update or edit buttons, as can be seen in ...

(screenshot attachment: Databricks_sso_replacep)
Latest Reply
StuartKindness_
New Contributor II
  • 2 kudos

@Debayan, our version is branch-3.96-1682169174-f2e2f130, if that helps?

  • 2 kudos
3 More Replies
harraz
by New Contributor III
  • 2187 Views
  • 1 replies
  • 0 kudos

Run result unavailable: run failed with error message Notebook not found:

I'm trying to create a workflow job that fetches the notebook from a remote Git repository (Bitbucket Cloud). I tried everything in the Path field and nothing is working. Note that the Bitbucket repo is already connected to Databricks and there are no issues che...

(screenshot attachment: Screen Shot 2023-05-31 at 6.45.47 PM)
Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi @harraz, could you please confirm whether Files in Repos has been enabled? https://docs.databricks.com/files/workspace.html#configure-support-for-files-in-repos. You can use the command %sh pwd in a notebook inside a repo to check if Files ...

  • 0 kudos
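
For reference, a minimal sketch of defining such a job through the Jobs API 2.1 with a remote Git source (an alternative to the UI Path field); the host, token, repo URL, branch, cluster id, and notebook path below are placeholders, and the notebook path is relative to the repo root without a file extension.

    # Minimal sketch: create a job whose notebook task is pulled from Bitbucket Cloud.
    import requests

    host = "https://<your-workspace>.azuredatabricks.net"   # placeholder
    token = "<personal-access-token>"                        # placeholder

    payload = {
        "name": "git-notebook-job",
        "git_source": {
            "git_url": "https://bitbucket.org/<workspace>/<repo>",
            "git_provider": "bitbucketCloud",
            "git_branch": "main",
        },
        "tasks": [{
            "task_key": "run_notebook",
            "notebook_task": {"notebook_path": "notebooks/etl_job", "source": "GIT"},
            "existing_cluster_id": "<cluster-id>",
        }],
    }

    resp = requests.post(f"{host}/api/2.1/jobs/create",
                         headers={"Authorization": f"Bearer {token}"},
                         json=payload)
    resp.raise_for_status()
    print(resp.json())   # e.g. {"job_id": ...}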