- 1156 Views
- 7 replies
- 0 kudos
Hi Team, my requirement is to build a solution to move z/OS (DB2) CDC data to Delta tables in real time (or at least near real time). The data volume and number of tables are fairly large (100 tables). I have researched and I don't find any in-built options in...
Latest Reply
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
6 More Replies
- 581 Views
- 2 replies
- 0 kudos
In my project, if a job takes too long I want it to terminate and then retry, even when there is a timeout error. In Databricks, the launched status should show that the retry was triggered by the scheduler, and it should respect min_retry_interval_millis before starting the retry...
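A minimal sketch of job settings that could express this, using field names from the Databricks Jobs API 2.1 (the job name and notebook path below are hypothetical placeholders, and the exact values are assumptions to tune):

```python
# Sketch of Databricks Jobs API 2.1 settings for retry-on-timeout behavior.
# The job name and notebook path are hypothetical placeholders.
job_settings = {
    "name": "nightly-load",
    "timeout_seconds": 3600,  # terminate the run if it exceeds 1 hour
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/etl/main"},
            "max_retries": 3,                    # retry up to 3 times
            "min_retry_interval_millis": 60000,  # wait at least 60s between attempts
            "retry_on_timeout": True,            # retry even when the run timed out
        }
    ],
}
```

With retry_on_timeout set, a run killed by timeout_seconds is retried by the scheduler after the configured minimum interval.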
1 More Replies
- 434 Views
- 2 replies
- 0 kudos
I have a notebook that produces lots of Excel files that I want to download to my local machine. Currently I can only download them one by one, which takes a long time when there are many. Is there a way, without using the Azure CLI, to download all of t...
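One common workaround (a sketch only; the folder and file names below are hypothetical) is to zip everything the notebook produced into a single archive, then download just that one file:

```python
import os
import shutil
import tempfile

# Hypothetical demo: create a folder of "Excel" files, then zip it into one archive.
# In a notebook you would point src_dir at the folder your files are written to
# and download the single resulting zip instead of each file individually.
src_dir = tempfile.mkdtemp()
for i in range(3):
    with open(os.path.join(src_dir, f"report_{i}.xlsx"), "wb") as f:
        f.write(b"placeholder")

# shutil.make_archive returns the path of the created .zip file
archive_path = shutil.make_archive(
    os.path.join(tempfile.mkdtemp(), "reports"), "zip", src_dir
)
```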
1 More Replies
- 885 Views
- 5 replies
- 0 kudos
I tried to run this query and it is failing to load the data. What do I need to do to load from federated data sources using DLT, if this is not correct?
CREATE OR REPLACE LIVE TABLE bulkuploadhistory
COMMENT 'Table generated for bulkuploadhistory.'
TBLPROPERTI...
4 More Replies
- 1636 Views
- 3 replies
- 1 kudos
Hi, I would like to create/update a dashboard definition based on a JSON file. How can one do it? I tried the following: databricks api post /api/2.0/preview/sql/dashboards/$dashboard_id --json @file.json, but it does not update the widgets... How can...
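A hedged sketch of the equivalent raw HTTP call (the endpoint is the legacy preview SQL dashboards API mentioned above; the host and dashboard ID are placeholders, and whether a POST to this endpoint replaces widgets is an assumption to verify — widgets may need to be managed through their own endpoints):

```python
import json
import os
import tempfile

# Hypothetical helper that builds the request for the preview endpoint above.
# It does not send anything; pair it with your HTTP client and auth token.
def build_dashboard_request(host, dashboard_id, payload_file):
    url = f"{host}/api/2.0/preview/sql/dashboards/{dashboard_id}"
    with open(payload_file) as f:
        payload = json.load(f)  # the dashboard definition from file.json
    return url, payload

# Example usage with placeholder values:
tmp = os.path.join(tempfile.mkdtemp(), "file.json")
with open(tmp, "w") as f:
    json.dump({"name": "my-dashboard"}, f)
url, payload = build_dashboard_request(
    "https://example.cloud.databricks.com", "abc123", tmp
)
```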
2 More Replies
- 365 Views
- 2 replies
- 0 kudos
Are there any in-built reports available in the Databricks UI that list the users and roles with access to the workspace?
1 More Replies
- 636 Views
- 3 replies
- 1 kudos
Hi, currently all the data required resides in an Azure SQL database. We have a project in which we need to query this data on demand in Salesforce Data Cloud, to be further used for reporting in a CRMA dashboard. Do we need to move this data from Azure SQL to delta l...
2 More Replies
- 661 Views
- 2 replies
- 0 kudos
Hello, I am wondering if there is a way in Databricks to run a job continuously except for 1 or 2 hours every night, during which the cluster could restart. We are using interactive clusters for our jobs and development in Dev and UAT. In Prod we are still...
1 More Replies
- 3185 Views
- 2 replies
- 0 kudos
I understand that DLT uses its own job compute, but I would like to use an existing all-purpose cluster for the DLT pipeline. Is there a way I can achieve this?
1 More Replies
by feed • New Contributor III
- 3809 Views
- 7 replies
- 3 kudos
Getting "TesseractNotFoundError: tesseract is not installed or it's not in your PATH. See README file for more information." in Databricks.
Latest Reply
The command %sh apt-get install -y tesseract-ocr is not working in my new Databricks free trial account; earlier it worked fine in my old Databricks instance. I get the below error: E: Could not open lock file /var/lib/dpkg/lock-frontend - open (13: Per...
6 More Replies
- 1070 Views
- 2 replies
- 1 kudos
Hello All,Following command on running through databricks notebook is not working Command%sh# Bash code to print 'Hello, PowerShell!'echo 'Hello, PowerShell!'# powershell.exe -ExecutionPolicy Restricted -File /dbfs:/FileStore/Read_Vault_Inventory.ps1...
1 More Replies
by hv129 • New Contributor
- 857 Views
- 2 replies
- 1 kudos
I have around 25 GB of data in my Azure storage. I am performing data ingestion using Auto Loader in Databricks. Below are the steps I am performing:
- Setting enableChangeDataFeed to true.
- Reading the complete raw data using readStream.
- Writing as del...
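The steps above could be sketched as the following option and property sets (option names are from the Auto Loader and Delta Change Data Feed documentation; all paths and the source format are placeholder assumptions):

```python
# Sketch of the configuration the steps above imply; paths are hypothetical.
autoloader_options = {
    "cloudFiles.format": "json",                 # format of the raw source files
    "cloudFiles.schemaLocation": "/tmp/schema",  # where Auto Loader tracks schema
}

# Delta table property that turns on the Change Data Feed for the target table.
delta_table_properties = {
    "delta.enableChangeDataFeed": "true",
}

# In a notebook these would drive something like (pseudocode, not executed here):
#   spark.readStream.format("cloudFiles").options(**autoloader_options).load(raw_path)
#        .writeStream...
```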
1 More Replies
- 442 Views
- 3 replies
- 0 kudos
I'd like to create a storage credential for an Azure Storage Account in an AWS workspace. I then plan to use this storage credential to create an external volume. Is this possible, and if so, what are the steps? Thanks for any help!
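Assuming the Unity Catalog storage credentials API is the route (whether an AWS-hosted workspace accepts an Azure managed identity credential is exactly the open question in this thread), the request body might look like the sketch below; the connector resource ID is a placeholder:

```python
# Hypothetical payload for POST /api/2.1/unity-catalog/storage-credentials.
# Whether an AWS workspace supports an azure_managed_identity credential is
# an assumption to verify; the access connector ID below is a placeholder.
payload = {
    "name": "azure_cred",
    "azure_managed_identity": {
        "access_connector_id": (
            "/subscriptions/<sub-id>/resourceGroups/<rg>"
            "/providers/Microsoft.Databricks/accessConnectors/<connector>"
        ),
    },
    "comment": "Credential for an external volume on Azure storage",
}
endpoint = "/api/2.1/unity-catalog/storage-credentials"
```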
Latest Reply
Thanks for your help. I'm struggling to create the Storage Credential. I have created a managed identity via an Azure Databricks Access Connector and am making an API call based on what I'm reading in the API docs: Create a storage credential | Storag...
2 More Replies
- 439 Views
- 2 replies
- 0 kudos
We are trying to retrieve the XML file name using _metadata, but it is not working. We are also unable to use input_file_name(), since we are using a shared cluster. We are reading the XML files using the com.databricks.spark.xml library.
1 More Replies
by ElaPG • New Contributor III
- 1516 Views
- 7 replies
- 1 kudos
Is there any possibility to restrict usage of specified commands (like mount/unmount or SQL grant) based on group assignment? I do not want everybody to be able to execute these commands.
6 More Replies