Data Engineering

Forum Posts

Vikad
by New Contributor II
  • 1169 Views
  • 5 replies
  • 2 kudos

Databricks certification voucher not received

Hi team, I attended the webinar on 21st Feb 2023 and also took the Lakehouse Fundamentals badge, yet I have not received any certification voucher from Databricks. Regards, Vikas

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Vikas Singh​ Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback ...

4 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 973 Views
  • 2 replies
  • 12 kudos

Databricks now supports event-driven workloads, especially for loading cloud files from external locations. This means you can save costs and resource...

Databricks now supports event-driven workloads, especially for loading cloud files from external locations. This means you can save costs and resources by triggering your Databricks jobs only when new files arrive in your cloud storage instead of mou...
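For context, a minimal sketch of an Auto Loader stream that such a file-arrival-triggered job could run (the paths, file format, and target table name below are placeholder assumptions, not from the post):

# Auto Loader source that only picks up files added since the last run
df = (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@<account>.dfs.core.windows.net/events/"))  # placeholder path

(df.writeStream
   .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder checkpoint
   .trigger(availableNow=True)   # process whatever has arrived, then stop
   .toTable("bronze_events"))    # placeholder target table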

Latest Reply
Vartika
Moderator
  • 12 kudos

Hi @Hubert Dudek​ We really appreciate you sharing this bit of information. Cheers!

1 More Replies
Naveen_KumarMad
by New Contributor III
  • 5377 Views
  • 13 replies
  • 14 kudos

Resolved! How to find the last modified date of a notebook?

I would like to find the notebooks that are not required and not being used, so I can review and delete them. If there is a way to find the last modified date of a notebook programmatically, then I can get a list of notebooks, which I can review and ...

Latest Reply
Amit_352107
New Contributor III
  • 14 kudos

Hi @Naveen Kumar Madas​ you can go through the code block below:
%sh
ls -lt /dbfs/
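A Python variant of the same idea, as a sketch (the path is a placeholder; whether your notebooks are visible on a filesystem path depends on your setup): sort entries under a directory by modification time.

import os
from datetime import datetime

path = "/dbfs/"  # placeholder; point at the directory you want to inspect
entries = sorted(os.scandir(path), key=lambda e: e.stat().st_mtime, reverse=True)
for e in entries:
    # print newest entries first, with their last-modified timestamp
    print(datetime.fromtimestamp(e.stat().st_mtime), e.path)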

12 More Replies
wyzer
by Contributor II
  • 28567 Views
  • 15 replies
  • 7 kudos

Resolved! What's the equivalent of "DECLARE..." in Databricks SQL ?

Hello everyone, I'm new to Databricks SQL, and I'm coming from SQL Server. I would like to know the equivalent of: DECLARE @P_Name varchar(50) = 'BackOffice'. I want to use it like this: CREATE DATABASE @P_Name. Thanks.

Latest Reply
Amit_352107
New Contributor III
  • 7 kudos

Hi @Salah K.​ you can go through this code block:
%python
P_Name = 'BackOffice'
spark.sql(f""" CREATE DATABASE {P_Name} """)
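For illustration, the same substitution pattern extends to any statement that cannot take a variable directly (a minimal sketch; the database name is just an example):

P_Name = 'BackOffice'
spark.sql(f"CREATE DATABASE IF NOT EXISTS {P_Name}")  # interpolate the Python variable into SQL
spark.sql(f"USE {P_Name}")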

14 More Replies
StephanieRivera
by Valued Contributor II
  • 2870 Views
  • 4 replies
  • 2 kudos
Latest Reply
Debayan
Esteemed Contributor III
  • 2 kudos

Hi, You can refer to https://docs.databricks.com/files/unzip-files.html. You can curl the file you want and then it can be unzipped as mentioned in the doc. Please let us know if this helps. Also, please tag @Debayan with your next update which will n...
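For reference, a Python sketch of the same download-then-unzip idea (the URL and paths below are placeholders, not from the thread); the extracted files can then be copied to DBFS or cloud storage.

import urllib.request, zipfile

# download the archive to local disk, then extract it
urllib.request.urlretrieve("https://example.com/data.zip", "/tmp/data.zip")
with zipfile.ZipFile("/tmp/data.zip") as z:
    z.extractall("/tmp/data")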

3 More Replies
RC
by Contributor
  • 717 Views
  • 2 replies
  • 2 kudos

Not able to create a unity metastore in a specified region

Hi Team, I'm not able to create a metastore in a region (us-east-1). It tells me: "This region already contains a metastore. Only a single metastore is allowed per region." But we don't have any metastore. Earlier we had one metastore, which we had deleted...

Latest Reply
karthik_p
Esteemed Contributor
  • 2 kudos

@Rajath C​ Can you please re-check whether it has been properly deleted and whether the old one is still tied to any of your workspaces? Also, try to delete that storage if no data exists, and then retry.

1 More Replies
repcak
by New Contributor III
  • 1456 Views
  • 1 replies
  • 2 kudos

Init Scripts with mounted azure data lake storage gen2

I'm trying to access an init script that is stored on Azure Data Lake Storage Gen2 mounted to DBFS. I mounted the storage to dbfs:/mnt/storage/container/script.sh and when I try to access it I get an error: Cluster-scoped init script dbfs:/mnt/storage/containe...

Latest Reply
User16752239289
Valued Contributor
  • 2 kudos

I do not think an init script saved under a mount point works, and we do not suggest that. If you specify abfss, then the cluster needs to be configured so that it can authenticate and access the ADLS Gen2 folder. Otherwise, the cluster will no...
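For illustration, a sketch of the relevant fragment of a cluster spec under those assumptions (every name below is a placeholder): the init script is referenced directly via abfss, and the cluster's Spark conf carries service principal credentials so the script can be fetched at startup.

# placeholder fragment of a Clusters API spec; storage account, secret scope, and tenant are assumptions
cluster_spec_fragment = {
    "init_scripts": [
        {"abfss": {"destination": "abfss://container@storageaccount.dfs.core.windows.net/scripts/script.sh"}}
    ],
    "spark_conf": {
        "fs.azure.account.auth.type.storageaccount.dfs.core.windows.net": "OAuth",
        "fs.azure.account.oauth.provider.type.storageaccount.dfs.core.windows.net":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id.storageaccount.dfs.core.windows.net": "<application-id>",
        "fs.azure.account.oauth2.client.secret.storageaccount.dfs.core.windows.net": "{{secrets/<scope>/<key>}}",
        "fs.azure.account.oauth2.client.endpoint.storageaccount.dfs.core.windows.net":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    },
}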

Erik_L
by Contributor II
  • 3063 Views
  • 1 replies
  • 0 kudos

How to merge parquets with different column types

Problem: I have a directory in S3 with a bunch of data files, like "data-20221101.parquet". They all have the same columns: timestamp, reading_a, reading_b, reading_c. In the earlier files, the readings are floats, but in the later ones they are double...

Latest Reply
mathan_pillai
Valued Contributor
  • 0 kudos

1) Can you let us know what the error message was when you don't set the schema and use mergeSchema? 2) What happens when you define the schema (with FloatType) and use mergeSchema? What error message do you get?
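Not the thread's resolution, but one common workaround as a sketch (paths and column names are placeholders): read the older float files and the newer double files separately, cast the readings to double, and union by name.

from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType

readings = ["reading_a", "reading_b", "reading_c"]

old = spark.read.parquet("s3://bucket/data/old/*.parquet")   # files where readings are float
new = spark.read.parquet("s3://bucket/data/new/*.parquet")   # files where readings are double

# normalize the older files to double, then combine the two sets
old = old.select("timestamp", *[F.col(c).cast(DoubleType()).alias(c) for c in readings])
merged = old.unionByName(new.select("timestamp", *readings))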

karthik_p
by Esteemed Contributor
  • 1787 Views
  • 9 replies
  • 8 kudos

Tool For Monitoring Security/Health of Databricks Workspace. For about a year we have been looking for a tool to monitor the health of a Databricks workspa...

Tool For Monitoring Security/Health of Databricks Workspace. For about a year we have been looking for a tool to monitor the health of a Databricks workspace in an automated way. We used to monitor a few things in the workspace manually: clusters, Jobs, Tables, ACL...
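As a sketch of the kind of check such a tool automates (the workspace URL and token below are placeholders): list clusters via the REST API and report their state.

import requests

host = "https://<workspace-url>"          # placeholder workspace URL
token = "<personal-access-token>"         # placeholder token

# list clusters in the workspace and print their current state
resp = requests.get(f"{host}/api/2.0/clusters/list",
                    headers={"Authorization": f"Bearer {token}"})
for c in resp.json().get("clusters", []):
    print(c["cluster_name"], c["state"])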

Latest Reply
karthik_p
Esteemed Contributor
  • 8 kudos

@Harish Koduru​ @Arnold Souza​ Are you still seeing issues?

8 More Replies
Sujitha
by Community Manager
  • 410 Views
  • 1 replies
  • 3 kudos

Data + AI Summit Virtual - Register Now!  This year’s free virtual experience will include access to live-streamed keynotes, select sessions designed ...

Data + AI Summit Virtual - Register Now! This year’s free virtual experience will include access to live-streamed keynotes, select sessions designed and led by data experts, as well as unlimited access to on-demand sessions soon after the live event....

Latest Reply
jose_gonzalez
Moderator
  • 3 kudos

Thank you for sharing @Sujitha Ramamoorthy​ !!

Hubert-Dudek
by Esteemed Contributor III
  • 738 Views
  • 1 replies
  • 6 kudos

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and...

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...

Latest Reply
jose_gonzalez
Moderator
  • 6 kudos

Thank you for sharing @Hubert Dudek​ !!!

Hubert-Dudek
by Esteemed Contributor III
  • 753 Views
  • 1 replies
  • 7 kudos

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function ...

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function takes an array column as input and returns a new row for each element in the array, offering new pos...
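For illustration, a minimal sketch on 12.2 LTS or later (the array values are arbitrary), using explode directly in the FROM clause:

# returns one row per array element
spark.sql("SELECT * FROM explode(array(10, 20, 30))").show()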

Latest Reply
jose_gonzalez
Moderator
  • 7 kudos

Thank you for sharing @Hubert Dudek​ 

oriole
by New Contributor III
  • 2359 Views
  • 5 replies
  • 2 kudos

Resolved! Spark Driver Crash Writing Large Text

I'm working with a large text variable, working it into single-line JSON that Spark can process beautifully. Using a single-node 256 GB, 32-core Standard_E32d_v4 "cluster", which should be plenty of memory for this dataset (haven't seen cluster memory u...

Latest Reply
pvignesh92
Honored Contributor
  • 2 kudos

@David Toft​ Hi, The current implementation of dbutils.fs is single-threaded, performs the initial listing on the driver and subsequently launches a Spark job to perform the per-file operations. So I guess the put operation is running on a single cor...
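One way to sidestep that, as a sketch (the variable and output path are placeholders): split the text into lines and let Spark perform the write in parallel instead of calling dbutils.fs.put with one huge string. The list of lines still materializes on the driver, but the write itself is distributed.

lines = big_text.splitlines()                      # big_text: the large JSON-lines string (assumed to exist)
df = spark.createDataFrame([(l,) for l in lines], ["value"])
df.write.mode("overwrite").text("dbfs:/tmp/json_lines_out")   # placeholder output path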

4 More Replies
andrew0117
by Contributor
  • 1036 Views
  • 3 replies
  • 2 kudos

Resolved! Will a table backed by a SQL server database table automatically get updated if the base table in SQL server database is updated?

If I create a table using the code below:
CREATE TABLE IF NOT EXISTS jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "sql_server_url",
  dbtable "sqlserverTable",
  user "username",
  password "password"
)
will jdbcTable always be automatically sync...

Latest Reply
pvignesh92
Honored Contributor
  • 2 kudos

Hi @andrew li​ There is a feature introduced in DBR 11 where you can directly ingest data into the table from a selected list of sources. As you are creating a table, I believe this command will create a managed table by loading the data from the...
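For contrast, a sketch of a straight JDBC read (connection details are placeholders), which queries SQL Server at execution time and therefore always reflects the current state of the base table:

# read the SQL Server table over JDBC at query time rather than copying it into a managed table
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")
      .option("dbtable", "dbo.sqlserverTable")
      .option("user", "<username>")
      .option("password", "<password>")
      .load())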

2 More Replies