Data Engineering
Forum Posts

Prannu
by New Contributor II
  • 1058 Views
  • 2 replies
  • 1 kudos

Location of files previously uploaded on DBFS

I uploaded a CSV data file and used it in a Spark job about three months ago. I am now running the same Spark job on a newly created cluster, and the program runs properly. I want to know where I can see the previously uploaded CSV data file.

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Pranay Gupta​ You can see it in the DBFS root directory, based on the path you provided in the job. Please go to Data Explorer and select the option shown in the screenshot below.
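As a minimal sketch of the convention the reply describes: files uploaded through the Databricks "Upload data" UI typically land under dbfs:/FileStore/tables/. The filename below is a hypothetical stand-in for the poster's CSV.

```python
# Hypothetical sketch: files uploaded via the Databricks UI typically land
# under dbfs:/FileStore/tables/. The filename here is an assumption.
def uploaded_file_path(filename: str, base: str = "dbfs:/FileStore/tables") -> str:
    """Return the conventional DBFS path for a UI-uploaded file."""
    return f"{base}/{filename}"

path = uploaded_file_path("my_data.csv")
print(path)  # dbfs:/FileStore/tables/my_data.csv

# On a cluster you could then verify the file exists and read it back:
#   dbutils.fs.ls("dbfs:/FileStore/tables/")
#   spark.read.csv(path, header=True)
```

The `dbutils.fs.ls` and `spark.read.csv` calls in the comments only work inside a Databricks notebook, which is why they are not executed here.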

  • 1 kudos
1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 17836 Views
  • 6 replies
  • 7 kudos

Resolved! What does "Determining location of DBIO file fragments..." mean, and how do I speed it up?

"Determining location of DBIO file fragments. This operation can take some time." What does this mean, and how do I prevent it from having to perform this apparently expensive operation every time? This happens even when all the underlying tables are De...

Latest Reply
Christianben9
New Contributor II
  • 7 kudos

"Determining location of DBIO file fragments" is a message that may be displayed during the boot process of a computer running the NetApp Data ONTAP operating system. This message indicates that the system is currently in the process of identifying an...

  • 7 kudos
5 More Replies
labromb
by Contributor
  • 4090 Views
  • 6 replies
  • 8 kudos

Resolved! Create Databricks tables dynamically

Hi, I would like to be able to do something like this:

create table if not exists table1
using parquet
location = '/mnt/somelocation'

Some location needs to be a concatenation of a static and a code-generated string. Documentation suggests that location onl...
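A minimal sketch of one common approach: build the DDL string in Python and pass it to spark.sql. The table name, mount point, and generated suffix below are hypothetical placeholders, not values from the thread.

```python
# Sketch: compose a CREATE TABLE statement with a dynamic LOCATION clause.
# All names here (table1, /mnt/somelocation, the date suffix) are assumptions.
def create_table_sql(table: str, location: str) -> str:
    """Compose a CREATE TABLE statement with a dynamic LOCATION clause."""
    return (
        f"CREATE TABLE IF NOT EXISTS {table} "
        f"USING PARQUET "
        f"LOCATION '{location}'"
    )

base = "/mnt/somelocation"
suffix = "2023/01"  # the code-generated part of the path
sql = create_table_sql("table1", f"{base}/{suffix}")
print(sql)
# In a Databricks notebook you would then run: spark.sql(sql)
```

Because the statement is just a string, any concatenation of static and generated parts works; the same pattern applies to Delta tables by swapping `USING PARQUET` for `USING DELTA`.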

Latest Reply
Kaniz
Community Manager
  • 8 kudos

Hi @Brian Labrom​, we haven't heard from you since the last response from @Prasanth Mathesh​ and @Pat Sienkiewicz​, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community, as it can...

  • 8 kudos
5 More Replies
Mohit_m
by Valued Contributor II
  • 1567 Views
  • 1 replies
  • 5 kudos

How to find out the users who accessed Databricks and from which location

How to find out the users who accessed Databricks and from which location

Latest Reply
Mohit_m
Valued Contributor II
  • 5 kudos

You can use audit logs to fetch this data.

Query:

%sql
SELECT DISTINCT userIdentity.email, sourceIPAddress
FROM audit_logs
WHERE serviceName = "accounts" AND actionName LIKE "%login%"

Please find below the docs to analyse the audit logs: https://docs.databric...

  • 5 kudos
Chris_Shehu
by Valued Contributor III
  • 976 Views
  • 2 replies
  • 2 kudos
Latest Reply
Prabakar
Esteemed Contributor III
  • 2 kudos

Hi @Christopher Shehu​, did @Piper Wilson​'s response help you solve your question? If so, would you be happy to mark her answer as best so that others can quickly find the solution in the future?

  • 2 kudos
1 More Replies
FemiAnthony
by New Contributor III
  • 2008 Views
  • 5 replies
  • 3 kudos

Resolved! Location of customer_t1 dataset

Can anyone tell me how I can access the customer_t1 dataset that is referenced in the book "Delta Lake: The Definitive Guide"? I am trying to follow along with one of the examples.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Some files are visualized here: https://github.com/vinijaiswal/delta_time_travel/blob/main/Delta%20Time%20Travel.ipynb, but it is quite strange that the source is not in the repository. I think the only way is to write to Vini Jaiswal on GitHub.

  • 3 kudos
4 More Replies
User16826992666
by Valued Contributor
  • 1059 Views
  • 1 replies
  • 0 kudos

What is the default location where dataframes are written if I don't specify a location?

If I save a dataframe without specifying a location, where will it end up?

Latest Reply
User16869510359
Esteemed Contributor
  • 0 kudos

You can't save a dataframe without specifying a location. If you are using the saveAsTable API, the table will be created in the Hive warehouse location, which defaults to /user/hive/warehouse.
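A sketch of the conventional managed-table layout the reply refers to. The warehouse root below is the classic Hive default; actual deployments may override it via spark.sql.warehouse.dir, and the table and database names are hypothetical.

```python
# Sketch of the conventional managed-table layout under the Hive warehouse.
# The warehouse root is the classic default; real deployments may override it
# via spark.sql.warehouse.dir. Table/database names here are assumptions.
def managed_table_path(table: str, database: str = "default",
                       warehouse: str = "/user/hive/warehouse") -> str:
    """Return where saveAsTable would conventionally place a managed table."""
    if database == "default":
        # The default database maps directly to the warehouse root.
        return f"{warehouse}/{table}"
    return f"{warehouse}/{database}.db/{table}"

print(managed_table_path("events"))               # /user/hive/warehouse/events
print(managed_table_path("events", "analytics"))  # /user/hive/warehouse/analytics.db/events
```

In a notebook, `df.write.saveAsTable("events")` would create the table's files under the first path; `DESCRIBE DETAIL events` shows the actual location.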

  • 0 kudos