Looking forward to using Lakehouse Federation
With databases all over the place I'm keen to have everything accessible through Databricks.
Really excited to use LakehouseIQ. It's like having ChatGPT plugged into your workspace.
Started with the keynote session and instantly got two ideas: safe by design and data lineage by design. Excited about the changes in Spark 3.4 and LakehouseIQ.
Looking forward to the ease of data security and sharing
Can we get lineage for an on-prem Spark application running locally?
Yes, you can create private endpoint links and application access (depending on your cloud provider) to connect with your on-premises systems, and Unity Catalog can sync that well. Lineage is supported for all languages and is captured down to the column...
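For reference, once Unity Catalog lineage is being captured it can also be queried programmatically. A minimal sketch, assuming the lineage system tables (system.access.table_lineage and system.access.column_lineage) are enabled in your workspace and that my_catalog.my_schema.my_table is a placeholder table name:

# Sketch: inspect Unity Catalog lineage from the system tables.
# Assumes the system.access lineage tables are enabled; table name is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Table-level lineage feeding into one table
table_lineage = spark.sql("""
    SELECT source_table_full_name, target_table_full_name, entity_type, event_time
    FROM system.access.table_lineage
    WHERE target_table_full_name = 'my_catalog.my_schema.my_table'
""")

# Column-level lineage for the same table
column_lineage = spark.sql("""
    SELECT source_table_full_name, source_column_name,
           target_table_full_name, target_column_name
    FROM system.access.column_lineage
    WHERE target_table_full_name = 'my_catalog.my_schema.my_table'
""")

table_lineage.show(truncate=False)
column_lineage.show(truncate=False)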
How do I get a list of the files loaded by Auto Loader?
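One option is to query the Auto Loader checkpoint directly. A sketch, assuming a recent Databricks Runtime where the cloud_files_state table-valued function is available; the checkpoint path below is a placeholder for your stream's checkpoint location:

# Sketch: list the files Auto Loader has discovered for a given stream by
# querying its checkpoint with cloud_files_state().
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

loaded_files = spark.sql(
    "SELECT * FROM cloud_files_state('/mnt/raw/_checkpoints/my_stream')"
)
loaded_files.show(truncate=False)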
Eager to hear about Delta's new capabilities.
Having a great time at the summit and learning about the advances in AI. The ability to use English as a new programming language will be a real advantage for the companies that adapt to the future of Databricks...
Amazing! Excited to be part of my first DAIS. Incredible contributions. Looking forward to putting some of the tools and features into practice.
Having a great time at the data summit! It is an amazing experience and is organized very well! #databricks
I have 2 datasets getting loaded into a common silver table. These are event-driven, and notebooks are triggered when a file is dropped into the storage account. When the files come in at the same time, one dataset fails with a concurrent append excepti...
Databricks has provided ACID guarantees since the inception of the Delta format. To ensure the C (consistency) is addressed, it limits concurrent workflows from performing updates at the same time, like other ACID-compliant SQL engines. The key differenc...
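A common mitigation for this pattern, shown as a sketch rather than the poster's exact fix, is to scope each MERGE to the partition it actually touches and retry when a conflict still occurs. The table name, partition column, and key column below are placeholders:

# Sketch: retry a Delta MERGE that may hit ConcurrentAppendException when two
# event-driven notebooks write into the same silver table at once.
# Assumes the silver table is partitioned by `source_system` and that
# `updates_df` only contains rows for a single source_system value.
import time
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def merge_with_retry(updates_df, source_system, max_attempts=5):
    silver = DeltaTable.forName(spark, "silver.events")  # placeholder table name
    for attempt in range(1, max_attempts + 1):
        try:
            (silver.alias("t")
                   .merge(
                       updates_df.alias("s"),
                       # The partition predicate narrows the conflict domain so the
                       # two concurrent writers touch disjoint data.
                       f"t.source_system = '{source_system}' AND t.event_id = s.event_id")
                   .whenMatchedUpdateAll()
                   .whenNotMatchedInsertAll()
                   .execute())
            return
        except Exception as e:
            if "ConcurrentAppendException" in str(e) and attempt < max_attempts:
                time.sleep(2 ** attempt)  # simple exponential backoff before retrying
            else:
                raise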
No difference really; both are very similar. The only difference is that in notebooks you cannot take advantage of things like serverless SQL warehouses, better visibility of data objects, and more simplified exploration tooling. SQL in notebooks mostly run ...
What's the effort involved with converting a Databricks Standard workspace to Premium? Is it one click of a button, or are there other considerations you need to think about?
It’s pretty easy! You can use the Azure API or the CLI. To upgrade, use the Azure Databricks workspace creation API to recreate the workspace with exactly the same parameters as the Standard workspace, specifying the sku property as Premium. To use t...
I'm trying to download a PDF file and store it in FileStore using this code in a notebook:
with open('/dbfs/FileStore/file.pdf', 'wb') as f:
    f.write(requests.get('https://url.com/file.pdf').content)
But I'm getting this error: FileNotFoundError: [...
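One workaround, as a sketch: if the /dbfs FUSE path isn't writable on that cluster, download to the driver's local disk first and then copy into DBFS with dbutils. The URL and FileStore path are the placeholders from the post:

# Sketch: write to local /tmp, then copy into DBFS/FileStore.
# Assumes this runs in a Databricks notebook, where `dbutils` is predefined.
import requests

local_path = "/tmp/file.pdf"
with open(local_path, "wb") as f:
    f.write(requests.get("https://url.com/file.pdf", timeout=60).content)

# Copy from the driver's local filesystem into DBFS
dbutils.fs.cp(f"file:{local_path}", "dbfs:/FileStore/file.pdf")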
Hello, I'm working on a migration of an Azure Delta table (10 TB) from the Azure Standard performance tier to Azure Premium. The plan is to create a new storage account and copy the table into it. Then we will switch to the new table. The table contains r...
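For a copy like this, one approach is a Delta DEEP CLONE into the new storage account, sketched below. The abfss paths are placeholders, and this assumes the cluster can already access both storage accounts:

# Sketch: copy a Delta table to a new storage account with DEEP CLONE, which
# copies the data files and preserves the Delta transaction metadata.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE OR REPLACE TABLE delta.`abfss://data@premiumaccount.dfs.core.windows.net/tables/my_table`
    DEEP CLONE delta.`abfss://data@standardaccount.dfs.core.windows.net/tables/my_table`
""")

Re-running the same DEEP CLONE later copies only the changes since the previous run, which can help keep the final cutover window short.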