by dsugs • New Contributor II
- 4811 Views
- 4 replies
- 2 kudos
So I've been trying to write a file to an S3 bucket, giving it a custom name, but everything I try just ends up with the file being dumped into a folder with the specified name, so the output is like ".../file_name/part-001.parquet". Instead, I want the file t...
Latest Reply
This is a Spark feature: to avoid network I/O, it writes each shuffle partition as a 'part-...' file on disk, and each file, as you said, has compression and efficient encoding by default. So yes, it is directly related to parallel processing!
3 More Replies
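Since Spark writes each partition as its own part file, a common workaround for getting one custom-named file (an approach sketched here as an assumption, not spelled out in the thread; the paths and helper names are hypothetical) is to coalesce to a single partition, write to a temporary directory, and then move the lone part file to the desired name:

```python
import os
import shutil

def find_part_file(filenames):
    """Pick the single 'part-*' data file from a Spark output directory listing."""
    parts = [f for f in filenames if f.startswith("part-")]
    if len(parts) != 1:
        raise ValueError(f"expected exactly one part file, got {parts}")
    return parts[0]

def write_single_parquet(df, tmp_dir, final_path):
    """Write df as one parquet file at final_path. Small outputs only:
    coalesce(1) funnels all data through a single task."""
    df.coalesce(1).write.mode("overwrite").parquet(tmp_dir)
    part = find_part_file(os.listdir(tmp_dir))
    shutil.move(os.path.join(tmp_dir, part), final_path)
    shutil.rmtree(tmp_dir)
```

For S3 paths specifically, os/shutil would be swapped for dbutils.fs or an S3 client such as boto3; the write-then-rename pattern stays the same.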
- 1695 Views
- 1 replies
- 1 kudos
From the DLT documentation it seems that a LIVE TABLE is conceptually the same as a MATERIALIZED VIEW. When should I use one over the other?
Latest Reply
Hi @igorstar, we haven't heard from you since the last response from @Mo, and I was checking back to see if those suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others. Also, ple...
- 2603 Views
- 5 replies
- 5 kudos
The client receives data from a third party as weekly "data dumps" of a MySQL database copied into an Azure Blob Storage account container (I suspect this is done manually; I also suspect the changes between the approx. 7 GB files are very small). I nee...
Latest Reply
Hi @Sylvia VB, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...
4 More Replies
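Building on the suspicion above that successive ~7 GB dumps barely change, one cheap first step (my assumption, not a suggestion from the replies) is to hash each new dump and skip all downstream processing whenever it is byte-identical to the previous week's file:

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """SHA-256 of a file, read in 1 MiB chunks so a 7 GB dump never sits in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Sketch (hypothetical path): compare against last week's recorded digest
# before ingesting, and skip ingestion on a match.
# if file_digest("/mnt/dumps/this_week.sql") == last_week_digest:
#     ...
```

This only catches fully identical files; detecting small row-level deltas would need a diff at the table level after loading.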
- 3124 Views
- 4 replies
- 1 kudos
I have a notebook where I want to use the workflow name and task name that it will be running under. How do I access this information?
Latest Reply
Please take a look at these docs, I think they are what you need: https://docs.databricks.com/workflows/jobs/task-parameter-variables.html
3 More Replies
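The linked docs describe task parameter variables; here is a hedged sketch of how they might be wired up. The variable syntax {{job.name}} / {{task.name}} and the widget names are assumptions to verify against those docs, and dbutils only exists inside Databricks, so the lookup is wrapped in a plain function:

```python
def get_job_context(widget_values):
    """Extract job and task names from already-resolved widget values."""
    return widget_values.get("job_name"), widget_values.get("task_name")

# Inside a Databricks notebook (not executed here), the job task would pass
# parameters such as job_name={{job.name}} and task_name={{task.name}}, then:
# job, task = get_job_context({
#     "job_name": dbutils.widgets.get("job_name"),
#     "task_name": dbutils.widgets.get("task_name"),
# })
```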
by apiury • New Contributor III
- 370 Views
- 0 replies
- 0 kudos
Hi! I'm developing a .NET app and I want to use the Databricks warehouse as a database. I have gold Delta tables that I want to query. In the documentation I can see an ODBC/JDBC driver; are those connectors fast? Is there another way to connect? What ...
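Besides the ODBC/JDBC drivers, Databricks also offers a REST Statement Execution API that any language (including .NET via HttpClient) can call. A minimal sketch of the request payload follows; the endpoint path and field names are best-effort assumptions to verify against the current API docs:

```python
import json

# Hedged sketch: payload for the Databricks SQL Statement Execution API
# (POST /api/2.0/sql/statements -- verify path and fields against current docs).
def build_statement_payload(warehouse_id, sql, wait_timeout="30s"):
    return {
        "warehouse_id": warehouse_id,
        "statement": sql,
        "wait_timeout": wait_timeout,
    }

# A real client would POST this with an HTTP library, e.g.:
# requests.post(f"{host}/api/2.0/sql/statements",
#               headers={"Authorization": f"Bearer {token}"},
#               data=json.dumps(build_statement_payload(wh_id, "SELECT 1")))
```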
- 272 Views
- 0 replies
- 0 kudos
WhereScape now supports Databricks as a target platform.
- 298 Views
- 0 replies
- 0 kudos
Super excited about learning how AI can help improve performance and throughput for our database customers.
- 357 Views
- 0 replies
- 0 kudos
We have been evaluating Databricks SQL and its capability to be used as a DW. We are using Unity Catalog in our implementation. There seems to be a functionality mismatch between the Azure and AWS versions: while table rename is supported on the Azure side, i...
by Evan • New Contributor
- 283 Views
- 0 replies
- 0 kudos
Can't wait for the open testing this early fall!
- 228 Views
- 0 replies
- 0 kudos
Whenever I work on a long notebook, over 100 cells, long latency makes me unproductive. Sometimes I have to keep working in the same notebook for the context. How can I keep latency low?
- 822 Views
- 2 replies
- 1 kudos
I'd like to create an SQL table in a notebook that's visible to a specific run (session?) of a notebook. Meaning, even if 2 different users run that notebook at the same time, there should be no conflict. And the table should go away once the noteboo...
Latest Reply
For now ‘CREATE TEMPORARY VIEW’ is the way to go. After the first read, subsequent reads are cached, so it won’t be recomputed every time.
1 More Replies
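The session scoping in the reply above is what prevents conflicts: a TEMPORARY VIEW lives only in the SparkSession that created it and disappears when that session ends. A small sketch follows; the view and table names are made up, and `spark` is assumed to exist inside a notebook:

```python
def temp_view_sql(name, query):
    """Build the DDL for a session-scoped view.
    CREATE OR REPLACE keeps re-running the same cell idempotent."""
    return f"CREATE OR REPLACE TEMPORARY VIEW {name} AS {query}"

# Inside a notebook (not executed here):
# spark.sql(temp_view_sql("my_scratch", "SELECT * FROM gold.orders WHERE ..."))
```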
- 383 Views
- 0 replies
- 0 kudos
Does anyone have any experience migrating a gold layer from Azure Analysis Services (AAS) tabular structure to one centered around the lakehouse architecture? My company currently uses AAS to serve data to all levels of users across the company, but ...
- 192 Views
- 0 replies
- 0 kudos
LakehouseIQ looks really cool! LakehouseIQ is an LLM agent that builds SQL queries from natural language. It ties into their other products, like business context, lineage, and previous queries.
- 284 Views
- 0 replies
- 0 kudos
Love the idea of creating and sharing apps in the Databricks Marketplace as shown during the keynote speech today, but how do I create them?
- 826 Views
- 1 replies
- 0 kudos
How do I time how long my code runs in DBX?
Latest Reply
Hi, you can use the standard Python %timeit magic in a notebook cell:
%timeit print("hello world")
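For timing a specific piece of code rather than a whole cell, plain Python works anywhere; a small sketch using time.perf_counter:

```python
import time

def time_block(fn, *args, **kwargs):
    """Run fn once and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Time a sample computation.
result, elapsed = time_block(sum, range(1_000_000))
```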