Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

RC
by Contributor
  • 1513 Views
  • 2 replies
  • 2 kudos

Not able to create a unity metastore in a specified region

Hi Team, I'm not able to create a metastore in a region (us-east-1). It tells me that "This region already contains a metastore. Only a single metastore is allowed per region." But we don't have any metastore. Earlier we had one metastore we had deleted...

Latest Reply
karthik_p
Esteemed Contributor

@Rajath C​ Can you please re-check that the metastore was properly deleted, and whether the old one is still tied to any of your workspaces? Also try deleting that storage if no data exists, then retry.

1 More Replies
repcak
by New Contributor III
  • 2524 Views
  • 1 reply
  • 2 kudos

Init Scripts with mounted azure data lake storage gen2

I'm trying to access an init script which is stored on mounted Azure Data Lake Storage Gen2. I mounted the storage to dbfs:/mnt/storage/container/script.sh and when I try to access it I get an error: Cluster scoped init script dbfs:/mnt/storage/containe...

Latest Reply
User16752239289
Databricks Employee

I do not think an init script saved under a mount point works, and we do not suggest that. If you specify abfss, then the cluster needs to be configured so that it can authenticate to and access the ADLS Gen2 folder. Otherwise, the cluster will no...

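To make the reply concrete: for a cluster to read an init script directly from abfss://, the cluster's Spark config must carry the storage credentials. A minimal sketch using a service principal; the storage account name, tenant ID, application ID, and secret scope/key below are all placeholders, not values from the post:

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<secret-key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

With that in place, the init script path can be given as abfss://container@<storage-account>.dfs.core.windows.net/script.sh rather than through a DBFS mount.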
Erik_L
by Contributor II
  • 6941 Views
  • 1 reply
  • 0 kudos

How to merge parquets with different column types

Problem: I have a directory in S3 with a bunch of data files, like "data-20221101.parquet". They all have the same columns: timestamp, reading_a, reading_b, reading_c. In the earlier files, the readings are floats, but in the later ones they are double...

Latest Reply
mathan_pillai
Databricks Employee

1) Can you let us know what the error message was when you don't set the schema and use mergeSchema?
2) What happens when you define the schema (with FloatType) and use mergeSchema? What error message do you get?

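One way to sidestep the type conflict, sketched in SQL, is to read each vintage of files separately, cast the older FLOAT columns to DOUBLE, and union the results; the bucket name and file globs below are hypothetical:

```sql
-- Unify FLOAT and DOUBLE readings under a single DOUBLE schema.
CREATE OR REPLACE TEMP VIEW readings AS
SELECT timestamp,
       CAST(reading_a AS DOUBLE) AS reading_a,
       CAST(reading_b AS DOUBLE) AS reading_b,
       CAST(reading_c AS DOUBLE) AS reading_c
FROM parquet.`s3://my-bucket/data-202210*.parquet`   -- older files: FLOAT columns
UNION ALL
SELECT timestamp, reading_a, reading_b, reading_c
FROM parquet.`s3://my-bucket/data-202211*.parquet`;  -- newer files: DOUBLE columns
```

FLOAT widens to DOUBLE losslessly, so this direction of cast is safe; writing the view back out (e.g. to Delta) then gives one consistent schema going forward.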
karthik_p
by Esteemed Contributor
  • 3986 Views
  • 9 replies
  • 8 kudos

Tool For Monitoring Security/Health of Databricks Workspace

For about a year we have been looking for a tool to monitor the health of a Databricks workspace in an automated way. We used to monitor a few things in the workspace manually: clusters, jobs, tables, ACL...

Latest Reply
karthik_p
Esteemed Contributor

@Harish Koduru​ @Arnold Souza​ Are you still seeing issues?

8 More Replies
Sujitha
by Databricks Employee
  • 1088 Views
  • 1 reply
  • 3 kudos

Data + AI Summit Virtual - Register Now!

Data + AI Summit Virtual - Register Now! This year’s free virtual experience will include access to live-streamed keynotes, select sessions designed and led by data experts, as well as unlimited access to on-demand sessions soon after the live event....

Latest Reply
jose_gonzalez
Databricks Employee

Thank you for sharing @Sujitha Ramamoorthy​ !!

Hubert-Dudek
by Esteemed Contributor III
  • 1513 Views
  • 1 reply
  • 6 kudos

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...

Latest Reply
jose_gonzalez
Databricks Employee

Thank you for sharing @Hubert Dudek​ !!!

Hubert-Dudek
by Esteemed Contributor III
  • 1652 Views
  • 1 reply
  • 7 kudos

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways.

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function takes an array column as input and returns a new row for each element in the array, offering new pos...

Latest Reply
jose_gonzalez
Databricks Employee

Thank you for sharing @Hubert Dudek​ 

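A minimal sketch of the syntax described in the post (DBR 12.2 LTS and above); the `orders` table and `items` array column in the second query are hypothetical:

```sql
-- explode as a table-valued function directly in FROM;
-- the generated column is named `col`.
SELECT col
FROM explode(array(10, 20, 30));

-- Correlating with a table's array column via LATERAL:
-- one output row per element of t.items.
SELECT t.id, e.col AS item
FROM orders AS t,
     LATERAL explode(t.items) AS e;
```

The first query returns three rows (10, 20, 30); previously the same result required `LATERAL VIEW explode(...)` or the DataFrame API.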
thushar
by Contributor
  • 13797 Views
  • 2 replies
  • 2 kudos

Resolved! Explicit transaction blocks

I know Delta tables support the ACID properties, and my understanding is that MERGE, INSERT, DELETE, etc. run inside a transaction by default, and if any error occurs during these operations, that transaction will be rolled back. I hope this unders...

Latest Reply
pvignesh92
Honored Contributor

@Thushar R​ Yes, you are right. As a Delta table keeps a transaction log and maintains a version history of your data, it can easily roll back your transaction in case of a failure, i.e. once a transaction is successfully committed, that is when the ...

1 More Replies
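To make the answer concrete: Delta does not offer multi-statement BEGIN/COMMIT blocks, but each individual statement is atomic, and the transaction log also lets you restore an earlier version explicitly. A sketch; the table names and version number are hypothetical:

```sql
-- A MERGE is a single atomic transaction: if it fails midway,
-- the target table is left at its previous committed version.
MERGE INTO target AS t
USING updates AS u
ON t.id = u.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- If a committed change must be undone, roll back via time travel:
RESTORE TABLE target TO VERSION AS OF 12;
```

What Delta does not give you is a transaction spanning several such statements; each MERGE/INSERT/DELETE commits (or rolls back) on its own.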
oriole
by New Contributor III
  • 9250 Views
  • 5 replies
  • 2 kudos

Resolved! Spark Driver Crash Writing Large Text

I'm working with a large text variable, working it into single line JSON where Spark can process beautifully. Using a single node 256 GB 32 core Standard_E32d_v4 "cluster", which should be plenty memory for this dataset (haven't seen cluster memory u...

Latest Reply
pvignesh92
Honored Contributor

@David Toft​ Hi, The current implementation of dbutils.fs is single-threaded, performs the initial listing on the driver and subsequently launches a Spark job to perform the per-file operations. So I guess the put operation is running on a single cor...

4 More Replies
andrew0117
by Contributor
  • 2227 Views
  • 3 replies
  • 2 kudos

Resolved! Will a table backed by a SQL server database table automatically get updated if the base table in SQL server database is updated?

If I create a table using the code below: CREATE TABLE IF NOT EXISTS jdbcTable USING org.apache.spark.sql.jdbc OPTIONS ( url "sql_server_url", dbtable "sqlserverTable", user "username", password "password" ) will jdbcTable always be automatically sync...

Latest Reply
pvignesh92
Honored Contributor

Hi @andrew li​ There is a feature introduced in DBR 11 where you can directly ingest data into a table from a selected list of sources. As you are creating a table, I believe this command will create a managed table by loading the data from the...

2 More Replies
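The distinction the thread turns on can be sketched with the post's own connection options (placeholder values as in the question): a table defined with the JDBC data source is a pass-through whose reads hit SQL Server at query time, whereas a copy materialized into a local table is a one-time snapshot.

```sql
-- Pass-through definition: each query is sent to SQL Server at read
-- time, so it reflects the current state of the base table.
CREATE TABLE IF NOT EXISTS jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "sql_server_url",
  dbtable "sqlserverTable",
  user "username",
  password "password"
);

-- Materialized copy: data is loaded once at creation and will NOT
-- auto-sync with later changes in SQL Server.
CREATE TABLE jdbc_snapshot AS SELECT * FROM jdbcTable;
```

One caveat: caching the pass-through table (or a DataFrame over it) also freezes the data until the cache is refreshed.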
bd
by New Contributor III
  • 1836 Views
  • 2 replies
  • 3 kudos

Resolved! Documented Autoloader option not supported?

I have a function which is meant to use the `cloudFiles` source to stream file contents from s3. It is configured like this:```stream = ( spark.readStream.format("cloudFiles") .option("cloudFiles.format", "text") .option("cloudFiles.schemaLo...

Latest Reply
bd
New Contributor III

thanks, I see how I made that error.

1 More Replies
Moonmoon
by New Contributor III
  • 6790 Views
  • 16 replies
  • 1 kudos

Resolved! Certificate/ Badge delayed

Hi Databricks team, I completed my Databricks Certified Data Engineer Associate exam on March 17th, and after more than 48 hrs I have not received the certificate yet. It looks like many other folks are facing the same issue and I am not seeing any resolution pr...

Latest Reply
Anonymous
Not applicable

Hi @Moonmoon Mukherjee​ Great! Thanks for letting me know. Kindly mark it as the best answer; I would appreciate it. Regards

15 More Replies
harsh_12345
by New Contributor III
  • 3550 Views
  • 7 replies
  • 2 kudos

Resolved! Passed the Data Engineer Associate exam, but didn't receive any badge/certificate. Please help

Passed the Data Engineer Associate exam, but didn't receive any badge or certificate. Please help.

Latest Reply
sharukh_lodhi
New Contributor III

Hi, I took the Associate Data Engineer exam on 17 March, but I haven't received the certification. I got an email right after passing saying that I would receive my certificate after 48 hours. Would you please look into my issue? Thanks!...

6 More Replies
User16665996606
by Databricks Employee
  • 3325 Views
  • 4 replies
  • 2 kudos

How to access public URLs via Databricks notebooks

I am trying to run a web application integrated with Gradio on Databricks. However, currently, I have to first run on the local URL and then launch it on the public URL. Are there any potential solutions for them to deploy the app on the public URL o...

Latest Reply
Anonymous
Not applicable

Hi @Sixuan He​ Thank you for posting your question in our community! We are happy to assist you.To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

3 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group