- 667 Views
- 1 reply
- 0 kudos
Data Ingestion into DLT from Azure Event hub batch processing
I am building my first DLT pipeline and I want to ingest data from Azure Event Hubs for batch processing. However, I can only find documentation for streaming via Kafka. Can we do batch processing with DLT and Azure Event Hubs?
- 0 kudos
Hey @pranay, there seems to be a new documented way to achieve this: https://learn.microsoft.com/en-us/azure/databricks/delta-live-tables/event-hubs Thanks, Gab
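For reference, a minimal sketch of that approach (reading Event Hubs through its Kafka-compatible endpoint inside a DLT Python notebook) might look like the following. The namespace, hub name, and secret scope are hypothetical, and running the pipeline in triggered rather than continuous mode gives the batch-style behaviour asked about.

```python
# Minimal sketch, assuming an Event Hubs namespace with its Kafka endpoint available
# and a secret scope "eventhubs" holding the connection string (hypothetical names).
import dlt
from pyspark.sql.functions import col

EH_NAMESPACE = "my-eh-namespace"     # hypothetical namespace
EH_TOPIC = "my-event-hub"            # Event Hub name, used as the Kafka topic
BOOTSTRAP = f"{EH_NAMESPACE}.servicebus.windows.net:9093"

conn_str = dbutils.secrets.get("eventhubs", "connection-string")
EH_SASL = (
    "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
    f'username="$ConnectionString" password="{conn_str}";'
)

@dlt.table(comment="Raw events ingested from Azure Event Hubs via its Kafka endpoint")
def raw_events():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", BOOTSTRAP)
        .option("subscribe", EH_TOPIC)
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.jaas.config", EH_SASL)
        .option("startingOffsets", "earliest")
        .load()
        .select(col("value").cast("string").alias("body"), col("timestamp"))
    )
```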
- 158 Views
- 0 replies
- 0 kudos
Save page views in Databricks?
Hi everyone, I'm considering an architecture that stores page views in Dynamo for later analysis and then moves them into Databricks. I wonder whether there's a case for saving those page views in Databricks directly with the same level of perform...
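For what it's worth, a minimal sketch of landing page views directly in a Delta table, assuming the events arrive as JSON files in a hypothetical cloud storage path, could be as simple as:

```python
# Minimal sketch; storage path and table name are hypothetical.
# (In a Databricks notebook, `spark` is predefined.)
events = spark.read.json(
    "abfss://events@mystorageaccount.dfs.core.windows.net/pageviews/"
)

(
    events.write.format("delta")
    .mode("append")
    .saveAsTable("analytics.web.page_views")
)
```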
- 804 Views
- 5 replies
- 0 kudos
Resolved! Issues with Content Writing on Databricks Community
Hi @Sujitha, @Rishabh_Tiwari, I wanted to bring to your attention that whenever I'm writing content on Databricks, I often encounter errors due to invalid HTML. Additionally, some terms seem to be prohibited by the Databricks community, which is puz...
- 0 kudos
@Rishabh-Pandey I understand. Please be assured that I am actively working on this and tracking these posts to update our filter on a regular basis. If you come across something similar again, feel free to tag me, and I'll take care of that.
- 344 Views
- 0 replies
- 1 kudos
Live, Virtual Workshop: How to Build a Golden Data Warehouse in Financial Services with Databricks
Reasons to join: Most Financial Services organizations have major on-prem investments. You can use that as your starting point to activate your organization on gold-level insights in the cloud. Providing a path to easier and quicker migration to the c...
- 351 Views
- 0 replies
- 0 kudos
OCRmyPDF in Databricks
Hello, do any of you have experience with using OCRmyPDF in Databricks? I have tried to install it in various ways with different versions, but my notebook keeps crashing with the error: The Python process exited with exit code 139 (SIGSEGV: Segmentation...
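Exit code 139 (SIGSEGV) in this context often points to missing or mismatched native dependencies rather than the Python package itself. A minimal sketch of one installation-and-usage pattern, assuming a cluster where you can install system packages (paths and names below are hypothetical):

```python
# Install the native dependencies OCRmyPDF relies on before the Python package,
# e.g. in separate notebook cells:
#   %sh sudo apt-get update && sudo apt-get install -y ghostscript tesseract-ocr
#   %pip install ocrmypdf
import ocrmypdf

# OCR a PDF stored in a Unity Catalog volume (hypothetical paths).
ocrmypdf.ocr(
    "/Volumes/main/default/docs/input.pdf",
    "/Volumes/main/default/docs/output_ocr.pdf",
    skip_text=True,  # skip pages that already have a text layer
)
```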
- 282 Views
- 1 reply
- 1 kudos
Databricks Certifications
Hello everyone, my name is Sourav Das. I am from Kolkata, currently working as an Azure Data Engineer at Cognizant. I have cleared multiple Databricks certifications (Databricks Data Engineer Associate, Databricks Data Engineer Professional, Databricks d...
- 1 kudos
Good luck. You can continue to improve your skills by helping other community members on this platform.
- 858 Views
- 5 replies
- 0 kudos
Filestore endpoint not visible in Databricks community edition
In the Community Edition of Databricks, after multiple attempts at enabling and refreshing, I am unable to navigate to the FileStore endpoint. It is not visible under Catalog.
- 0 kudos
Follow these alternative solutions: https://community.databricks.com/t5/data-engineering/databricks-community-edition-dbfs-alternative-solutions/td-p/94933
- 235 Views
- 1 reply
- 0 kudos
Migrating ML Model Experiments Using Python REST APIs
Hi everyone,I’m looking to migrate ML model experiments from a source Databricks workspace to a target workspace. Specifically, I want to use Python and the available REST APIs for this process.Can anyone help me on this!Thanks in advance!
- 0 kudos
You can use the https://github.com/mlflow/mlflow-export-import utility. The example below doesn't use Python but achieves the same result with the CLI and a CI/CD pipeline: https://medium.com/@gchandra/databricks-copy-ml-models-across-unity-catalog-metastores-188...
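If a pure-Python route is preferred over the CLI, a minimal sketch with the plain MLflow client is shown below; it assumes two hypothetical Databricks CLI profiles ("source" and "target") and only copies parameters, metrics, and tags (the mlflow-export-import utility handles artifacts, notebooks, and registered models far more completely):

```python
# Minimal sketch; profile names and experiment paths are hypothetical.
from mlflow.tracking import MlflowClient

source = MlflowClient(tracking_uri="databricks://source")
target = MlflowClient(tracking_uri="databricks://target")

src_exp = source.get_experiment_by_name("/Shared/my_experiment")
dst_exp_id = target.create_experiment("/Shared/my_experiment_migrated")

# Copy the first page of runs (pagination and artifacts omitted for brevity).
for run in source.search_runs([src_exp.experiment_id]):
    new_run = target.create_run(dst_exp_id, tags=run.data.tags)
    for k, v in run.data.params.items():
        target.log_param(new_run.info.run_id, k, v)
    for k, v in run.data.metrics.items():
        target.log_metric(new_run.info.run_id, k, v)
    target.set_terminated(new_run.info.run_id)
```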
- 1191 Views
- 2 replies
- 1 kudos
How to Pass Dynamic Parameters (e.g., Current Date) in Databricks Workflow UI?
I'm setting up a job in the Databricks Workflow UI and I want to pass a dynamic parameter, like the current date (run_date), each time the job runs. In Azure Data Factory, I can use expressions like @utcnow() to calculate this at runtime. However, I w...
- 1 kudos
As szymon mentioned, dynamic parameter values exist, but the functionality is still far from what Data Factory has to offer. I am pretty sure, though, that this will be extended. So for the moment I suggest you do the value derivation in Data Factory, an...
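As a rough sketch of what is available today: a task parameter can be given a dynamic value reference in the Workflow UI (for example run_date = {{job.start_time.iso_date}}; check the exact reference names against the current documentation), and the notebook then reads it at run time:

```python
# "run_date" is a hypothetical parameter name set in the job's task configuration.
dbutils.widgets.text("run_date", "")        # declare the parameter with a default
run_date = dbutils.widgets.get("run_date")  # resolved by the job at run time
print(f"Processing data for {run_date}")
```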
- 4843 Views
- 1 reply
- 0 kudos
Delta table definition - Identity column
Hello, would anyone know if it is possible to create a Delta table using Python that includes a column that is generated by default as identity (an identity column for which the inserted value can be manually overridden)? There seems to be a way to create ...
- 0 kudos
Hi @oleprince, as far as I know, it's not yet possible to create tables with identity columns using PySpark (the DeltaTable API). You can create generated columns, but identity columns are not allowed. The only way to achieve this is through Spark SQL.
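A minimal sketch of the SQL route, driven from Python via spark.sql (table and column names are hypothetical); GENERATED BY DEFAULT AS IDENTITY is the variant that still allows manually supplied values:

```python
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.default.customers (
        customer_id BIGINT GENERATED BY DEFAULT AS IDENTITY,
        customer_name STRING
    )
    USING DELTA
""")

# Rows without an id get a generated value; an explicit id overrides the default.
spark.sql("INSERT INTO main.default.customers (customer_name) VALUES ('Alice')")
spark.sql("INSERT INTO main.default.customers (customer_id, customer_name) VALUES (999, 'Bob')")
```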
- 399 Views
- 1 reply
- 0 kudos
Addressing Memory Constraints in Scaling XGBoost and LGBM: A Comprehensive Approach for High-Volume
Scaling XGBoost and LightGBM models to handle exceptionally large datasets—those comprising billions to tens of billions of rows—presents a formidable computational challenge, particularly when constrained by the limitations of in-memory processing o...
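One common way around single-node memory limits, sketched below under the assumption of an ML runtime with xgboost >= 1.7 available, is to train through the distributed xgboost.spark estimator so the data stays partitioned across the cluster (table and column names are hypothetical):

```python
# Minimal sketch of distributed XGBoost training on a Spark DataFrame.
from pyspark.ml.feature import VectorAssembler
from xgboost.spark import SparkXGBClassifier

df = spark.table("main.default.training_data")  # hypothetical, potentially billions of rows

assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
train = assembler.transform(df)

model = SparkXGBClassifier(
    features_col="features",
    label_col="label",
    num_workers=8,   # number of Spark tasks used for distributed training
    max_depth=6,
).fit(train)
```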
- 0 kudos
Well, that's a superb article! Thank you for this great information; you write very well, which I like very much. I am really impressed by your post.
- 1112 Views
- 1 reply
- 1 kudos
Feature Request: GUI: Additional Collapse options
When you're using a very large notebook, it sometimes gets frustrating scrolling through all the code blocks. It would be nice to have a few additional options to make this easier: 1) Add a collapse all code cells button to the top. 2) Add a collapse a...
- 581 Views
- 1 reply
- 0 kudos
Resolved! Does a queued Databricks job incur cost?
Does a queued Databricks job incur cost?
- 0 kudos
Hi @qwerty3, no, it does not. When a job is queued (waiting for an available cluster or resources), there is no compute usage, so no charges are incurred for Databricks units (DBUs) or cloud infrastructure (VMs). The queued state is essentially a wai...
- 1597 Views
- 11 replies
- 1 kudos
Resolved! Databricks Asset Bundle
I came across documentation on asset bundles a while back which states that when you type databricks bundle init, it gives you the option to choose a project type. But when I do that I see the below error. Is there a way I can take ...
- 711 Views
- 1 reply
- 2 kudos
column mask on <tinyint>Y columns gives error
My table breaks when I try to mask a column with a name like `<tinyint>Y`. -- Create a table with a masked column: CREATE FUNCTION mask_int_col(col_val INTEGER) RETURN CASE WHEN is_member('HumanResourceDept') THEN col_val ELSE CAST(NULL as INTEGER) EN...
- 2 kudos
Hi @DW, I have replicated your scenario and encountered the same error when applying a column mask to a column named 1Y in Databricks SQL. In short, it makes sense simply to follow the Databricks documentation and use the SQL naming conventions, so that c...
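For context, here is a minimal sketch of the same column-mask pattern with a conventionally named column (in Spark SQL a literal such as 1Y is parsed as a TINYINT, which is why such column names cause trouble); names are hypothetical and the SQL is issued via spark.sql:

```python
spark.sql("""
    CREATE FUNCTION IF NOT EXISTS mask_int_col(col_val INT)
    RETURN CASE WHEN is_member('HumanResourceDept') THEN col_val
                ELSE CAST(NULL AS INT) END
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS main.default.hr_data (
        salary_band INT MASK mask_int_col,
        employee_name STRING
    )
""")
```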