- 1404 Views
- 0 replies
- 0 kudos
Hi, while running a notebook during a night run on Azure Databricks, it got stuck on "Initializing RocksDB". We are not using any streaming data, nor have we enabled RocksDB. Does anyone have a clue how to disable RocksDB or prevent this in the future? Thanks!
- 1432 Views
- 0 replies
- 0 kudos
Is there a way to build (virtual) multidimensional models in Databricks and query them via MDX? Or is there any third-party add-on?
- 1628 Views
- 1 replies
- 1 kudos
I am using the JDBC driver to load comments saved in Databricks, associated with tables and columns. Comments saved in Chinese are returned with broken encoding. I use DatabaseMetaData.getColumns().getComments()
Latest Reply
@dprutean Can you share an example with the expected results vs the actual results?
- 4768 Views
- 3 replies
- 5 kudos
Hi community, I suddenly found myself confused, and this might sound like an obvious answer for some, but not for me, at least at this moment. I am not getting why companies use Databricks SQL and Redshift at the same time. I mean, with the Databricks platfor...
Latest Reply
Hi @eimis_pacheco
We haven't heard from you since the last response from @-werners-, and I was checking back to see if their suggestions helped you.
Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others...
- 4539 Views
- 6 replies
- 0 kudos
Unable to start a SQL warehouse on AWS. The warehouse stays in the starting state for a very long time and then an error is thrown.
Latest Reply
Can you please check the quota for that particular SQL warehouse instance type with your cloud provider (Azure/AWS)? It may be that the quota for that instance type is exhausted on your account. We faced the same issue and asked AWS to increase the quota of that in...
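A minimal sketch of the quota check suggested above, assuming AWS and boto3. The quota code below is an assumption (it should correspond to the "Running On-Demand Standard instances" vCPU limit), so verify it in the AWS Service Quotas console before relying on it.

```python
# Hedged sketch: a stuck "Starting" warehouse can mean the EC2 vCPU quota
# for the underlying instance family is exhausted.

def quota_check_request(instance_family="Standard"):
    """Build Service Quotas lookup parameters for an instance family.
    The quota code is an assumption, not a confirmed value."""
    codes = {
        # Assumed: "Running On-Demand Standard (A, C, D, H, I, M, R, T, Z) instances"
        "Standard": "L-1216C47A",
    }
    return {"ServiceCode": "ec2", "QuotaCode": codes[instance_family]}

# With boto3 (not executed here):
# import boto3
# sq = boto3.client("service-quotas")
# quota = sq.get_service_quota(**quota_check_request())
# print(quota["Quota"]["Value"])  # compare against the vCPUs the warehouse needs
```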
by dsugs • New Contributor II
- 13217 Views
- 4 replies
- 2 kudos
So I've been trying to write a file to an S3 bucket, giving it a custom name; everything I try just ends up with the file being dumped into a folder with the specified name, so the output is like ".../file_name/part-001.parquet". Instead, I want the file t...
Latest Reply
This is a Spark feature: to avoid network I/O, Spark writes each shuffle partition as a 'part-...' file on disk, and each file, as you said, gets compression and efficient encoding by default. So yes, it is directly related to parallel processing!
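A minimal sketch of the usual workaround for the question above: coalesce to one partition, write to a temporary directory, then move the single part file to the desired name. Bucket and file names are illustrative.

```python
# Hedged sketch: Spark always writes a directory of part files, so a single
# named file is produced by renaming the lone part file afterwards.

def pick_part_file(listing, target_name):
    """From a directory listing, find the lone 'part-' file and pair it
    with the final name it should be renamed to."""
    parts = [name for name in listing if name.startswith("part-")]
    if len(parts) != 1:
        raise ValueError("expected exactly one part file; did you coalesce(1)?")
    return parts[0], target_name

# On Databricks this would look roughly like (not executed here):
# df.coalesce(1).write.mode("overwrite").parquet("s3://bucket/tmp_out/")
# names = [f.name for f in dbutils.fs.ls("s3://bucket/tmp_out/")]
# src, dst = pick_part_file(names, "file_name.parquet")
# dbutils.fs.mv(f"s3://bucket/tmp_out/{src}", f"s3://bucket/{dst}")
```

Note that coalesce(1) forces the whole dataset through one task, so this only makes sense for outputs small enough to fit on a single worker.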
- 8002 Views
- 5 replies
- 5 kudos
The client receives data from a third party as weekly "data dumps" of a MySQL database copied into an Azure Blob Storage account container (I suspect this is done manually; I also suspect the changes between the approx. 7 GB files are very small). I nee...
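A hedged sketch of one approach to the weekly-dump scenario above: land each dump as a staging table, then MERGE it into a Delta target so only changed rows are rewritten. The table, path, and key names are illustrative, not from the original post.

```python
# Hedged sketch: compose a Delta MERGE so near-identical weekly dumps only
# rewrite the rows that actually changed.

def build_merge_sql(target, staging, key):
    """Compose a Delta MERGE statement keyed on a single column."""
    return (
        f"MERGE INTO {target} AS t "
        f"USING {staging} AS s "
        f"ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

# On Databricks (not executed here; path and names are hypothetical):
# spark.read.parquet("abfss://dumps@account.dfs.core.windows.net/week_n/") \
#     .createOrReplaceTempView("staging_customers")
# spark.sql(build_merge_sql("customers", "staging_customers", "customer_id"))
```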
Latest Reply
Hi @Sylvia VB, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...
- 8255 Views
- 4 replies
- 1 kudos
I have a notebook where I want to use the name of the workflow and the task that it will be running under. How do I access this information?
Latest Reply
Please take a look at these docs, I think they are what you need: https://docs.databricks.com/workflows/jobs/task-parameter-variables.html
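A sketch based on those docs: dynamic value references such as {{job.id}} and {{task.name}} can be passed into the notebook as task parameters and read with widgets. The widget names here are our own choice, and whether a job-name reference is available should be checked against the current docs; otherwise the job's display name can be looked up via the Jobs API from the job id.

```python
# Hedged sketch: configure these parameters on the job task, then read them
# in the notebook through widgets.

def task_parameters():
    """Parameter name -> dynamic value reference to set on the task.
    Widget names are our own convention, not fixed by Databricks."""
    return {"job_id": "{{job.id}}", "task_name": "{{task.name}}"}

# Inside the notebook (not executed here):
# dbutils.widgets.text("job_id", "")
# dbutils.widgets.text("task_name", "")
# task_name = dbutils.widgets.get("task_name")
```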