- 1065 Views
- 2 replies
- 2 kudos
Hi, I am using the Lasv3 VM family, which incorporates an NVMe SSD. I would like to take advantage of this large amount of space, but I cannot find where this disk is mounted. Does someone know where this disk is mounted and whether it can be used as local dri...
Latest Reply
Hi @Alvaro Moure​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...
1 More Replies
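The mount point asked about above can be probed from a notebook. A minimal sketch, assuming the common Databricks convention that the VM's attached local SSD is exposed under `/local_disk0` (an assumption — verify on your own cluster):

```python
import os
import shutil

def local_disk_report(path="/local_disk0"):
    """Report free space at `path`; /local_disk0 is the assumed
    location of the local NVMe SSD on a Databricks node."""
    if os.path.exists(path):
        total, used, free = shutil.disk_usage(path)
        return f"{path}: {free / 1e9:.0f} GB free of {total / 1e9:.0f} GB"
    return f"{path} is not present on this machine"

print(local_disk_report())
```

Note that this local disk is ephemeral: anything written there is lost when the cluster terminates, so it suits scratch data rather than durable storage.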
by KVNARK • Honored Contributor II
- 639 Views
- 1 replies
- 4 kudos
There is a use case where we want to REVOKE access from users so that they can't run the VACUUM command on a Delta table. Can anyone please help here?
Latest Reply
Hello @KVNARK​. We cannot restrict the VACUUM operation alone. You need to remove "MODIFY" access on the table and restrict users to the "read" (SELECT) operation. Please note that if you restrict to only "read", it will also affect all the write, up...
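The restriction described in this reply can be expressed with table ACL statements. A sketch, using a hypothetical table and principal (substitute your own); in a Databricks notebook each statement would be executed via `spark.sql(...)`:

```python
# Hypothetical table and principal -- substitute your own.
table = "my_schema.my_table"
principal = "`user@example.com`"

statements = [
    # Removing write access also blocks VACUUM, since VACUUM
    # cannot be revoked on its own...
    f"REVOKE MODIFY ON TABLE {table} FROM {principal}",
    # ...while read-only access is kept.
    f"GRANT SELECT ON TABLE {table} TO {principal}",
]

for stmt in statements:
    print(stmt)  # in a Databricks notebook: spark.sql(stmt)
```

As the reply warns, this is coarser than a VACUUM-only restriction: revoking MODIFY blocks all writes, updates, and deletes as well.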
- 6517 Views
- 3 replies
- 6 kudos
Hi, I have been trying to deploy an Access Connector resource on Azure using Azure Pipelines (YAML) and a Bicep template, but I cannot find a solution to this error: ERROR: {"status":"Failed","error":{"code":"DeploymentFailed","message":"At least one resou...
Latest Reply
Hi, I fixed this issue by adding the service principal to the list of service principals in the Account Console. My guess is that after the access connector is created, an API call is made to the Databricks account, and the service principal making that...
2 More Replies
by Yatoom • New Contributor II
- 1327 Views
- 2 replies
- 2 kudos
We are building a platform where we automatically execute Databricks jobs using Python packages delivered by our end-users. We want to create a mount point so that we can deliver the cluster's driver logs to an external storage. However, we don't wan...
- 3527 Views
- 5 replies
- 18 kudos
Hi all, is there any way that we can access or push data in Delta Sharing using Microsoft Excel?
Latest Reply
Hey @Ajay Pandey​, yes, a new Excel feature was recently released that lets you enable Delta Sharing from Excel as well, so whatever changes you make to the Delta table will automatically be reflected in the Excel file too. Refer to this lin...
4 More Replies
- 1593 Views
- 1 replies
- 0 kudos
I have several functions accessing the same createOrReplaceTempView("viewname"). Does this cause any issues with multiple functions accessing the view in a distributed environment?
def get_data_sql(spark_session, data_frame, data_element):
    data_fram...
Latest Reply
There are two types of views. One is the global view: it is available to the whole cluster and every notebook, but it is removed after a cluster restart. The other is the temp view: it is available only at the notebook level, and other notebooks will not be able to ...
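The two scopes described in this reply can be wrapped in a small helper. A sketch, assuming a live PySpark DataFrame `df`; the `register_view` name is hypothetical:

```python
def register_view(df, name, shared=False):
    """Register `df` as a temp view.

    shared=False -> createOrReplaceTempView: visible only in this
                    notebook/session.
    shared=True  -> createOrReplaceGlobalTempView: visible to every
                    notebook on the cluster (until restart), and
                    queried as global_temp.<name>.
    """
    if shared:
        df.createOrReplaceGlobalTempView(name)
        return f"global_temp.{name}"
    df.createOrReplaceTempView(name)
    return name
```

Within a single session, several functions re-registering the same temp view name simply replace its definition, so the main risk is one function silently overwriting another's view rather than a distributed-execution problem.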
- 677 Views
- 0 replies
- 1 kudos
I'm able to make it to the Permissions page of the schema and table I'm trying to do access control on within the Data Explorer page. At first, you can only grant permissions but not revoke anything. Only after you have made new grants can you revoke w...
by 159312 • New Contributor III
- 1288 Views
- 4 replies
- 2 kudos
I have a notebook used for a DLT pipeline. The pipeline should perform an extra task if it is run as a full refresh. Right now, I have to set an extra configuration parameter when I run a full refresh. Is there a way to programmatically...
Latest Reply
Hi @Ben Bogart​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
3 More Replies
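The workaround the question describes (setting an extra configuration parameter when triggering a full refresh) can be read inside the pipeline notebook. A sketch, using a hypothetical configuration key `mypipeline.full_refresh`:

```python
def is_full_refresh(conf):
    """Return True when the (hypothetical) flag mypipeline.full_refresh
    is set to "true" in the pipeline configuration.

    `conf` can be any object with a .get(key, default) method;
    in a DLT notebook you would pass spark.conf.
    """
    value = conf.get("mypipeline.full_refresh", "false")
    return str(value).lower() == "true"
```

The pipeline would then guard its extra task with `if is_full_refresh(spark.conf): ...`; the remaining manual step is setting the flag whenever a full refresh is triggered.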
- 5440 Views
- 7 replies
- 6 kudos
Hi, I'm new to Databricks but am positively surprised by the product. We use Databricks Delta tables as the source to build a tabular model, which will serve as the data source for Power BI. To develop our tabular model, we use Visual Studio to import tables ...
Latest Reply
Hi @geert vanhove​ ​ , Just a friendly follow-up. Do you still need help, or have you resolved your problem with the above solutions? Please let us know.
6 More Replies
by amil • New Contributor
- 300 Views
- 1 replies
- 0 kudos
Hi Kaniz, I am unable to access Databricks Community Edition even after solving the puzzle. Mail: amilsivabalan@gmail.com. Kindly help. Regards, Sivabalan S
Latest Reply
Hi @sivabalan Selvaraj​, thank you for reaching out! Let us look into this for you, and we'll follow up with an update.
- 1840 Views
- 2 replies
- 3 kudos
I have created a job that contains a notebook that reads a file from Azure Storage. The file name contains the date when the file was transferred to the storage. A new file arrives every Monday, and the read job is scheduled to run every Monday. I...
Latest Reply
Hi @Karolin Albinsson​, just a friendly follow-up: do you still need help, or did @Hubert Dudek (Customer)​'s response help you find the solution? Please let us know.
1 More Replies
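One way to handle the dated file name in the question is to build the expected name from the run date inside the notebook. A sketch, assuming a hypothetical naming pattern `data_YYYY-MM-DD.csv` and that the job runs on the day the file arrives:

```python
from datetime import date

def weekly_file_name(run_date=None, pattern="data_{d:%Y-%m-%d}.csv"):
    """Return the expected file name for the given run date.
    The pattern and naming convention here are assumptions --
    adapt them to the actual files in your storage account."""
    d = run_date or date.today()
    return pattern.format(d=d)

print(weekly_file_name(date(2023, 5, 1)))  # data_2023-05-01.csv
```

The notebook can then read `weekly_file_name()` from the storage path on each scheduled run instead of hard-coding the date.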
- 1083 Views
- 3 replies
- 4 kudos
Last week, I suddenly could not log into https://community.cloud.databricks.com/login.html. I tried to reset the password but didn't receive the reset email. It says "Invalid email address or password. Note: Emails/usernames are case-sensitive". I e...
Latest Reply
Hi @Xueqing Liu​ , Please email your credentials to community@databricks.com with all the relevant screenshots and we shall help you resolve them.
2 More Replies
by Dusko • New Contributor III
- 2041 Views
- 8 replies
- 2 kudos
Hi, I'm trying to read a file from the S3 root bucket. I can ls all the files, but I can't read them because of an access-denied error. When I mount the same S3 root bucket under some other mountPoint, I can touch and read all the files. I also see that this new mount...
Latest Reply
Dusko • New Contributor III
Hi @Atanu Sarkar​, @Piper Wilson​, thanks for the replies. Well, I don't understand the point about ownership. I believe the root bucket is still under my ownership (I created it, and I can upload/delete any files through the browser without any problem...
7 More Replies
- 13352 Views
- 6 replies
- 6 kudos
Hi Team, when we try to mount or access blob storage where soft delete is enabled, it fails with the below error: org.apache.hadoop.fs.FileAlreadyExistsException: Operation failed: "This endpoint does not support BlobStorageEvents or So...
Latest Reply
Hi @Sailaja B​, as per the Managed identities for Azure resource authentication document: Note: If your blob account enables soft delete, system-assigned/user-assigned managed identity authentication is not supported in Data Flow. If you access the blo...
5 More Replies