by
Prannu
• New Contributor II
- 2060 Views
- 2 replies
- 1 kudos
I uploaded a CSV data file and used it in a Spark job three months ago. I am now running the same Spark job on a newly created cluster, and the program runs properly. I want to know where I can see the previously uploaded CSV data file.
Latest Reply
@Pranay Gupta You can see it in the DBFS root directory, based on the path you provided in the job. Please check there, or go to Data Explorer and select the option shown in the screenshot.
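A minimal sketch of checking this from a notebook, assuming the file was uploaded through the UI (which lands under /FileStore/tables by default; the exact path is whatever the job referenced) and that dbutils is available, as it is in Databricks notebooks:

# List files under the default UI upload path on DBFS.
# /FileStore/tables is an assumption; substitute the path your job used.
files = dbutils.fs.ls("/FileStore/tables")
for f in files:
    print(f.path, f.size)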
1 More Replies
- 27325 Views
- 6 replies
- 7 kudos
"Determining location of DBIO file fragments. This operation can take some time." What does this mean, and how do I prevent it from having to perform this apparently-expensive operation every time? This happens even when all the underlying tables are De...
Latest Reply
"Determining location of DBIO file fragments" is a message that may be displayed during the boot process of a computer running the NetApp Data ONTAP operating system. This message indicates that the system is currently in the process of identifying an...
5 More Replies
- 7993 Views
- 4 replies
- 8 kudos
Hi, I would like to be able to do something like this... create table if not exists table1 using parquet location = '/mnt/somelocation'. The location needs to be a concatenation of a static and a code-generated string. Documentation suggests that location onl...
Latest Reply
Python f-strings can be used. Example: spark.sql(f"CREATE TABLE {table_name} (id INT, name STRING, value DOUBLE, state STRING)")
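A slightly fuller sketch of the pattern the question asks about, assuming a notebook where spark is predefined; the base path, date suffix, and table name are hypothetical:

from datetime import date

# Hypothetical values for illustration: a static mount point plus a
# code-generated suffix, concatenated into the table location.
base_path = "/mnt/somelocation"
run_suffix = date.today().strftime("%Y%m%d")
location = f"{base_path}/{run_suffix}"

spark.sql(f"""
    CREATE TABLE IF NOT EXISTS table1 (id INT, name STRING, value DOUBLE, state STRING)
    USING PARQUET
    LOCATION '{location}'
""")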
3 More Replies
- 3247 Views
- 1 reply
- 5 kudos
How to find out the users who accessed Databricks and from which location
Latest Reply
You can use Audit logs to fetch this data. Query:

%sql
SELECT DISTINCT userIdentity.email, sourceIPAddress
FROM audit_logs
WHERE serviceName = "accounts" AND actionName LIKE "%login%"

Please find below the docs to analyse the Audit logs: https://docs.databric...
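The same query can also be run from Python; a minimal sketch, assuming the audit logs have already been ingested into a table named audit_logs and that spark is predefined in the notebook:

# Assumes audit logs are already loaded into a table named audit_logs.
logins = spark.sql("""
    SELECT DISTINCT userIdentity.email, sourceIPAddress
    FROM audit_logs
    WHERE serviceName = 'accounts' AND actionName LIKE '%login%'
""")
logins.show(truncate=False)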
- 4386 Views
- 4 replies
- 3 kudos
Can anyone tell me how I can access the customer_t1 dataset that is referenced in the book "Delta Lake - The Definitive Guide"? I am trying to follow along with one of the examples.
Latest Reply
Some files are visualized here: https://github.com/vinijaiswal/delta_time_travel/blob/main/Delta%20Time%20Travel.ipynb, but it is quite strange that there is no source data in the repository. I think the only way is to write to Vini Jaiswal on GitHub.
3 More Replies
- 1885 Views
- 1 reply
- 0 kudos
If I save a dataframe without specifying a location, where will it end up?
Latest Reply
You can't save a DataFrame without specifying a location. If you are using the saveAsTable API, the table will be created in the Hive warehouse location. The default location is /user/hive/warehouse (controlled by spark.sql.warehouse.dir).
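A minimal sketch of verifying this, assuming a Databricks notebook where spark is predefined and saveAsTable creates a Delta table by default (so DESCRIBE DETAIL reports its location); the table name is hypothetical:

# Hypothetical table name for illustration.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
df.write.mode("overwrite").saveAsTable("my_example_table")

# DESCRIBE DETAIL reports where the table's files actually landed.
spark.sql("DESCRIBE DETAIL my_example_table").select("location").show(truncate=False)

# The configured default warehouse directory.
print(spark.conf.get("spark.sql.warehouse.dir"))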