Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Prannu
by New Contributor II
  • 1806 Views
  • 2 replies
  • 1 kudos

Location of files previously uploaded on DBFS

I uploaded a CSV data file and used it in a Spark job three months back. I am now running the same Spark job on a newly created cluster, and the program runs properly. I want to know where I can see the previously uploaded CSV data file.

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Pranay Gupta you can see it in the DBFS root directory, based on the path you provided in the job. Please check: go to Data Explorer and select the option shown in the screenshot below.
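
Files uploaded through the workspace UI typically land under /FileStore/tables on the DBFS root, so a quick way to find them is to list that directory from a notebook. A minimal sketch, assuming a UI upload (the customer_data.csv name is a hypothetical placeholder):

    # Files uploaded via the UI usually land under /FileStore/tables on the DBFS root
    files = dbutils.fs.ls("/FileStore/tables/")
    for f in files:
        print(f.path, f.size)

    # Hypothetical example: read the previously uploaded CSV back into a DataFrame
    df = spark.read.csv("/FileStore/tables/customer_data.csv", header=True)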

1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 25003 Views
  • 6 replies
  • 7 kudos

Resolved! What does "Determining location of DBIO file fragments..." mean, and how do I speed it up?

"Determining location of DBIO file fragments. This operation can take some time." What does this mean, and how do I prevent it from having to perform this apparently-expensive operation every time? This happens even when all the underlying tables are De...

Latest Reply
Christianben9
New Contributor II
  • 7 kudos

"Determining location of DBIO file fragments" is a message that may be displayed during the boot process of a computer running the NetApp Data ONTAP operating system. This message indicates that the system is currently in the process of identifying an...

5 More Replies
labromb
by Contributor
  • 7040 Views
  • 4 replies
  • 8 kudos

Resolved! Create Databricks tables dynamically

Hi, I would like to be able to do something like this...

create table if not exists table1
using parquet
location = '/mnt/somelocation'

...where somelocation needs to be a concatenation of static and code generated string. Documentation suggests that location onl...

Latest Reply
PrasanthM
New Contributor III
  • 8 kudos

Python f-strings can be used. Example: spark.sql(f"CREATE TABLE {table_name} (id INT, name STRING, value DOUBLE, state STRING)")
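
Extending that idea to the original question's dynamically built LOCATION, a minimal sketch, assuming a Spark session in a Databricks notebook (base_path, suffix, and table_name are hypothetical placeholders):

    from datetime import date

    # Hypothetical inputs: a static mount point plus a code-generated suffix
    base_path = "/mnt/somelocation"
    suffix = date.today().strftime("%Y%m%d")
    table_name = "table1"

    # Build the location string and create the table if it does not exist
    spark.sql(f"""
        CREATE TABLE IF NOT EXISTS {table_name}
        USING PARQUET
        LOCATION '{base_path}/{suffix}'
    """)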

3 More Replies
Mohit_m
by Valued Contributor II
  • 2728 Views
  • 1 reply
  • 5 kudos

How to find out the users who accessed Databricks and from which location

How to find out the users who accessed Databricks and from which location

Latest Reply
Mohit_m
Valued Contributor II
  • 5 kudos

You can use audit logs to fetch this data.

Query:

%sql
SELECT DISTINCT userIdentity.email, sourceIPAddress
FROM audit_logs
WHERE serviceName = "accounts" AND actionName LIKE "%login%"

Please find below the docs to analyse the audit logs: https://docs.databric...
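
For the query above to run, the delivered audit logs first need to be readable as a table or view. A minimal sketch, assuming JSON audit logs have been delivered to a storage path (the /mnt/audit-logs path is a hypothetical placeholder):

    # Hypothetical delivery location for JSON audit logs
    audit_df = spark.read.json("/mnt/audit-logs/")
    audit_df.createOrReplaceTempView("audit_logs")

    # Run the reply's query against the temp view
    spark.sql("""
        SELECT DISTINCT userIdentity.email, sourceIPAddress
        FROM audit_logs
        WHERE serviceName = 'accounts' AND actionName LIKE '%login%'
    """).show(truncate=False)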

Chris_Shehu
by Valued Contributor III
  • 1700 Views
  • 2 replies
  • 2 kudos
Latest Reply
Prabakar
Databricks Employee
  • 2 kudos

Hi @Christopher Shehu, did @Piper Wilson's response help you solve your question? If so, would you be happy to mark her answer as best so that others can quickly find the solution in the future?

1 More Replies
FemiAnthony
by New Contributor III
  • 4000 Views
  • 4 replies
  • 3 kudos

Resolved! Location of customer_t1 dataset

Can anyone tell me how I can access the customer_t1 dataset that is referenced in the book "Delta Lake: The Definitive Guide"? I am trying to follow along with one of the examples.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Some files are visualized here: https://github.com/vinijaiswal/delta_time_travel/blob/main/Delta%20Time%20Travel.ipynb, but it is quite strange that there is no source in the repository. I think the only way is to write to Vini Jaiswal on GitHub.

3 More Replies
User16826992666
by Valued Contributor
  • 1649 Views
  • 1 reply
  • 0 kudos

What is the default location where dataframes are written if I don't specify a location?

If I save a dataframe without specifying a location, where will it end up?

Latest Reply
brickster_2018
Databricks Employee
  • 0 kudos

You can't save a DataFrame without specifying a location. If you are using the saveAsTable API, the table will be created in the Hive warehouse location. The default location is /user/hive/warehouse.
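
A minimal sketch to confirm this from a notebook (demo_tbl is a hypothetical table name; on Databricks the warehouse typically resolves to dbfs:/user/hive/warehouse):

    # Inspect the configured warehouse directory
    print(spark.conf.get("spark.sql.warehouse.dir"))

    # Save a DataFrame as a managed table and check where it landed
    df = spark.range(5)
    df.write.mode("overwrite").saveAsTable("demo_tbl")
    spark.sql("DESCRIBE TABLE EXTENDED demo_tbl") \
        .filter("col_name = 'Location'") \
        .show(truncate=False)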
