- 2369 Views
- 6 replies
- 8 kudos
From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications
In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...
- 8 kudos
This is great! I have worked with Databricks for almost three years and have decided to pursue the Databricks Data Engineer Professional certification. This will certainly help with setting up an effective plan.
- 3371 Views
- 6 replies
- 7 kudos
🚀 Databricks Custom Apps! 🚀
Whether you're a data scientist or a sales executive, Databricks is making it easier than ever to build, host, and share secure data applications. With our platform, you can now run any Python code on serverless compute, share it with non-technical c...
- 7 kudos
Can we somehow play with hosting, and expose this app outside?
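As a rough illustration of the kind of data app the post describes, here is a minimal sketch of a Python (Streamlit-style) app that queries a table through the Databricks SQL connector. The table name and environment variable names are assumptions made for illustration, not part of the announcement.

```python
# Minimal sketch of a Python data app of the kind Databricks Apps can host.
# The table name and environment variable names are illustrative placeholders.
import os

import pandas as pd
import streamlit as st
from databricks import sql  # databricks-sql-connector

st.title("Sales overview")

with sql.connect(
    server_hostname=os.getenv("DATABRICKS_SERVER_HOSTNAME"),
    http_path=os.getenv("DATABRICKS_HTTP_PATH"),
    access_token=os.getenv("DATABRICKS_TOKEN"),
) as conn:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT region, SUM(amount) AS total "
            "FROM main.sales.orders GROUP BY region"
        )
        rows = cur.fetchall()

df = pd.DataFrame(rows, columns=["region", "total"])
st.bar_chart(df.set_index("region"))
```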
- 1174 Views
- 5 replies
- 1 kudos
Writing append blob files to a Unity Catalog volume
The workspace is assigned to Unity Catalog, and all access to the ADLS Gen2 account is now handled via Unity Catalog only, meaning no SPN, no connection string, no access keys, etc. I have to create append blob files in a volume. Is this possible in a works...
- 1 kudos
Now I got your point. No, you can't create Append Blob files directly in Volumes, as this is native Azure functionality. A volume is basically just an abstraction over native storage. You will still need to use libraries like azure-storage-blob wi...
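To make the workaround in the reply concrete, here is a minimal sketch using the azure-storage-blob library directly; the storage account, container, and blob names are placeholders.

```python
# Sketch: creating and appending to an Append Blob with azure-storage-blob,
# since Unity Catalog volumes do not expose the Append Blob API.
# Account, container, and blob names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
blob = service.get_blob_client(container="logs", blob="events/app.log")

# Create the append blob once, then append blocks to it.
if not blob.exists():
    blob.create_append_blob()
blob.append_block(b"new log line\n")
```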
- 2546 Views
- 1 replies
- 0 kudos
Unity Catalog
Is disaster recovery possible in Unity Catalog now? Meaning, at the data level we have enabled geo-redundancy, but what about the objects, permissions, and other components in Unity Catalog? Can we restore the Unity Catalog metadata in another region?
- 0 kudos
The official product release is in development and will be available as a Private Preview (PrPr) in a few months.
- 1422 Views
- 2 replies
- 0 kudos
Resolved! Standardized Framework to update Databricks job definition using CI/CD
Hi Databricks support, I am looking for a standardized Databricks framework to update job definitions using DevOps from non-production until they get productionized. Our current process of updating the Databricks job definition is as follows: In our sourc...
- 0 kudos
Hi from the Git folders/Repos PM: DAB is the way to go, and we are working on an integration to author DABs directly in the workspace. Here's a DAIS talk where the DAB PM and I demo'ed some recommendations for source controlling jobs: https://www.da...
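Databricks Asset Bundles are the route the reply recommends; where a plain scripted step is still needed in a pipeline, a hedged sketch of updating a job definition with the databricks-sdk could look like this (job ID, notebook path, and cluster ID are illustrative placeholders).

```python
# Hedged sketch: replacing a job definition from a CI/CD pipeline with the
# databricks-sdk. All IDs, paths, and names below are illustrative placeholders;
# Databricks Asset Bundles remain the recommended approach.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # reads host/credentials from the environment or a CLI profile

new_settings = jobs.JobSettings(
    name="nightly-etl",
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/prod/etl/ingest"),
            existing_cluster_id="1234-567890-abcde123",
        )
    ],
)

# reset() replaces the entire job definition with the new settings.
w.jobs.reset(job_id=123456789, new_settings=new_settings)
```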
- 3196 Views
- 3 replies
- 5 kudos
Build & Refresh a Calendar Dates Table
Introduction: Maintaining accurate and up-to-date calendar date tables is crucial for reliable reporting, yet manual updates can be time-consuming and prone to error. This fundamental component serves as the backbone for date-based analysis, enabling a...
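As a sketch of what the post describes, a calendar dates table can be generated and refreshed with a few lines of PySpark; the catalog, schema, table name, and date range below are illustrative.

```python
# Hedged sketch: building/refreshing a calendar dates (date dimension) table.
# Catalog, schema, table name, and date range are illustrative placeholders.
from pyspark.sql import functions as F

dates = (
    spark.sql(
        "SELECT explode(sequence(to_date('2020-01-01'), to_date('2030-12-31'), "
        "interval 1 day)) AS date"
    )
    .withColumn("year", F.year("date"))
    .withColumn("month", F.month("date"))
    .withColumn("day", F.dayofmonth("date"))
    .withColumn("day_of_week", F.dayofweek("date"))
    .withColumn("is_weekend", F.dayofweek("date").isin(1, 7))
)

# Overwriting on each run keeps the table current.
dates.write.mode("overwrite").saveAsTable("main.reporting.dim_date")
```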
- 4348 Views
- 3 replies
- 1 kudos
Reading Excel files folder
Dears, one of the tasks needed by a data engineer is to ingest data from files, for example, Excel files. Thanks to OnerFusion-AI for the thread below that gives us the steps for reading from one file: https://community.databricks.com/t5/get-started-discussions/how-to...
- 1 kudos
Hi @AhmedAlnaqa, can we read from an ADLS location too by using abfss? Thanks
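Extending the thread from a single file to a folder, here is a hedged sketch that reads every Excel file in a Unity Catalog volume with pandas and converts the result to a Spark DataFrame; the volume path is a placeholder, and pandas needs openpyxl installed for .xlsx files. For the abfss question in the reply: with Unity Catalog in place, registering the ADLS location as an external location or volume and reading it through a /Volumes path as below is typically the simpler route.

```python
# Hedged sketch: reading every Excel file in a folder into one Spark DataFrame.
# The volume path is an illustrative placeholder; pandas needs openpyxl for .xlsx.
import os

import pandas as pd

folder = "/Volumes/main/raw/excel_files"  # Unity Catalog volume path
frames = [
    pd.read_excel(os.path.join(folder, name), sheet_name=0)
    for name in os.listdir(folder)
    if name.endswith(".xlsx")
]

df = spark.createDataFrame(pd.concat(frames, ignore_index=True))
display(df)
```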
- 647 Views
- 1 replies
- 3 kudos
Notification destination API
You can now create, delete, and update notification destinations through the Databricks API. The notification destinations API lets you programmatically manage a workspace's notification destinations. Notification destinations are used to send notific...
- 3 kudos
Hi, thank you for sharing this, Ajay. We appreciate you keeping the community informed! Thanks, Anushree
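As a rough sketch of the announcement, creating an email notification destination through the REST API could look like the call below; the endpoint path and payload shape follow my reading of the public docs and may need adjusting, and the host, token, and address are placeholders.

```python
# Hedged sketch: creating an email notification destination via the REST API.
# The endpoint and payload shape are assumptions based on the public docs;
# host, token, and email address are placeholders.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.post(
    f"{host}/api/2.0/notification-destinations",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "display_name": "oncall-email",
        "config": {"email": {"addresses": ["oncall@example.com"]}},
    },
)
resp.raise_for_status()
print(resp.json())
```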
- 1593 Views
- 2 replies
- 4 kudos
Onboarding your new Databricks AI/BI Genie
The integration of AI and BI into the modern data stack has been a game-changer for businesses seeking to leverage data-driven insights. Databricks, a leader in this innovative frontier, has introduced the AI/BI Genie, a tool designed to democratize ...
- 4 kudos
Hi, the AI/BI Genie is a fantastic innovation! By enabling natural language queries and learning from user interactions, it makes data analytics more accessible and insightful. It’s a powerful tool for businesses looking to enhance their data-driven d...
- 1027 Views
- 1 replies
- 2 kudos
The Importance of Databricks in Today's Data Market
Unlocking the Power of Data with Databricks: In the rapidly evolving landscape of data and analytics, Databricks has emerged as a transformative force, reshaping how organizations handle big data, data engineering, and machine learning. As we naviga...
- 2 kudos
Hi @Rishabh-Pandey, thank you for sharing this! Databricks is really making a difference with its unified platform and powerful features. Excited to see how it will continue to shape the future of data! Thanks,
- 6208 Views
- 1 replies
- 1 kudos
Databricks on Databricks: Kicking off the Journey to Governance with Unity Catalog
In the world of data engineering and analytics, governance is paramount. Databricks has taken a significant step forward in this arena with the introduction of Unity Catalog, a unified governance solution for all data and AI assets. This journey to e...
- 1 kudos
Hi Rishabh, Unity Catalog is a major advancement in data governance, enhancing how we manage and secure data and AI assets in Databricks. Exciting to see these improvements! Thanks, Anushree
- 737 Views
- 1 replies
- 2 kudos
How to detect the Risks in Claims Data Using Databricks and PySpark
As a data engineer with experience in Databricks and other data engineering tools, I know that processing claims data and detecting risks early can really help in insurance claims processing. In this article, I’ll show you how to use Databricks and P...
- 2 kudos
Hi, thanks for sharing this! Kudos for breaking it down so clearly. I’m sure it will help other community members. Thanks, Anushree
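To ground the article's idea, here is a hedged sketch of simple rule-based risk flags on claims data with PySpark; the table name, column names, and thresholds are illustrative placeholders rather than the article's own code.

```python
# Hedged sketch: simple rule-based risk flags on claims data with PySpark.
# Table name, column names, and thresholds are illustrative placeholders.
from pyspark.sql import functions as F

claims = spark.table("main.insurance.claims")

risky = (
    claims
    .withColumn("high_amount", F.col("claim_amount") > 50000)
    .withColumn("quick_claim", F.datediff("claim_date", "policy_start_date") < 30)
    .withColumn("risk_flag", F.col("high_amount") | F.col("quick_claim"))
)

risky.filter("risk_flag").select("claim_id", "claim_amount", "risk_flag").show()
```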
- 393 Views
- 0 replies
- 2 kudos
September 2024 - Databricks SQL fixes
- Added customization options for number formats in dashboard widgets, including currencies and percentages.
- The width of the Values column in pivot tables now auto-adjusts based on the width of the cell value names.
- The time displayed in dashboard widge...
- 440 Views
- 1 replies
- 0 kudos
SQL code for appending a notebook result into an existing database table
I am attempting to append the results of a notebook query into an existing Databricks database table. By chance, would someone share an example of the SQL code with me?
- 0 kudos
Hi @wheersink, so let's say you created the following table with some sample values: %sql CREATE TABLE dev.default.employee (id INT, name STRING, age INT, department STRING); INSERT INTO dev.default.employee VALUES (1, 'John Doe', 30, 'Financ...
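Following on from the reply's employee table, the append itself is a single INSERT INTO ... SELECT (or a DataFrame write in append mode); the source table and query below are illustrative placeholders.

```python
# Hedged sketch: appending a notebook query result into the existing table
# from the reply. The source table/query is an illustrative placeholder.
result_df = spark.sql("""
    SELECT id, name, age, department
    FROM dev.default.new_hires
    WHERE hire_date >= date_sub(current_date(), 7)
""")

# Option 1: append with the DataFrame writer.
result_df.write.mode("append").saveAsTable("dev.default.employee")

# Option 2: the same append purely in SQL.
spark.sql("""
    INSERT INTO dev.default.employee
    SELECT id, name, age, department
    FROM dev.default.new_hires
    WHERE hire_date >= date_sub(current_date(), 7)
""")
```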
- 1196 Views
- 1 replies
- 2 kudos
🚀 Boost Your Data Pipelines with Dynamic, Data-Driven Databricks Workflows (For Each Task)! 💡
Unlock the power of the For Each task in Databricks to seamlessly iterate over collections—whether it's a list of table names or any value—and dynamically run tasks with specific parameter values. This powerful feature lets you automate repetitive pr...
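To make the For Each description concrete, here is a hedged sketch of a Jobs API task payload that runs one notebook per table name; the field names follow the Jobs API 2.1 task spec as I understand it, and the notebook path and table list are illustrative.

```python
# Hedged sketch: a Jobs API 2.1 task payload using the For Each task to run one
# notebook per table name. Field names follow my reading of the task spec;
# the notebook path and table list are illustrative placeholders.
for_each_task = {
    "task_key": "refresh_tables",
    "for_each_task": {
        "inputs": '["orders", "customers", "payments"]',
        "concurrency": 2,
        "task": {
            "task_key": "refresh_one_table",
            "notebook_task": {
                "notebook_path": "/Repos/prod/etl/refresh_table",
                "base_parameters": {"table_name": "{{input}}"},
            },
        },
    },
}
```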
- 708 Views
- 0 replies
- 2 kudos
Databricks August Update
August 2024 | Databricks on AWS
Labels
- ADF Linked Service (1), ADF Pipeline (1), AI (1), API (1), Append blob (1), Automation (1), AWS (1), Azure databricks (1), Azure DevOps (1), Azure devops integration (1), ChangingSchema (1), CICD (1), CICDForDatabricksWorkflows (1), Clone (1), Cluster (1), Cluster Pools (1), compute policies (1), compute policy (1), Cost (1), Cost Optimization Effort (1), custom compute policy (1), CustomLibrary (1), Data Engineering (1), Data Mesh (1), Data Processing (1)
- Databricks Delta Table (1), Databricks Demo Center (1), Databricks jobs (1), Databricks Migration (1), Databricks Mlflow (1), Databricks Support (1), Databricks Unity Catalog (2), Databricks Workflows (1), DatabricksJobsAPI (1), DatabricksML (1), DatabricksWorkflowsCICD (1), Date (1), Delta Lake (3), Devops (1), DimensionTables (1), Dns (1), Dynamic (1), Governance (1), Hive metastore (1), Jobs & Workflows (1), LakeFlow (1), Library Installation (1), Medallion Architecture (1), Mlops (1), MSExcel (1)
- Networking (1), Partner (1), Private Link (1), Pyspark Code (1), Question (1), Scala Code (1), Schema (1), Schema Evaluation (1), Serverless (1), Serverless SQL Datawarehouse (1), Spark (4), SparkSQL (1), Support Ticket (1), Sync (1), ucx (1), Unity Catalog (3), Unity Catlog (1), Workflow Jobs (1), Workflows (1)