- 1993 Views
- 1 replies
- 0 kudos
Sqoop Migration best practices
Hi, could you please share with us the approach and best practices for migrating from Hadoop Sqoop to Databricks?
Hi, you can try checking these resources: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-steps-to-successfull...
- 1307 Views
- 1 replies
- 0 kudos
Oozie jobs migration to Databricks
Hi, could you please share with us the approach and best practices for migrating Hadoop Oozie jobs to Databricks?
Hi, you can try checking these resources on Hadoop migration: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-...
- 2667 Views
- 1 replies
- 0 kudos
HDFS to Databricks
Hi, could you please share with us the approach and best practices for migrating from Hadoop HDFS to Databricks?
Hi, you can try checking these resources: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-steps-to-successfull...
- 1699 Views
- 1 replies
- 0 kudos
Notebook runs with error when run as a job
I am using a notebook to copy over my database on a schedule (I had no success connecting through the Data Explorer UI). When I run the notebook on its own, it works. When I run it as a scheduled job, I get this error: org.apache.spark.SparkSQLExcept...
Hi, the error snippet is minimal; could you please post the whole error if possible? Also, please tag @Debayan in your next response, which will notify me. Thank you!
- 1365 Views
- 1 replies
- 0 kudos
Suspension of Data Engineer Professional exam
Hi Databricks Team, I had scheduled my exam for 6th Sep 2023. During the exam, the same pop-up came up, stating that I was looking in some other direction. I told them that my laptop mouse was not working properly, so I was looking at it, but they still suspended ...
Hi @priyakant1, have you received any response from the team? For example, did they reschedule your exam?
- 3105 Views
- 1 replies
- 0 kudos
My exam has been suspended, need help urgently (21/08/2023)
Hello Team, I had a pathetic experience while attempting my 1st Databricks certification. The proctor abruptly asked me to show my desk; after I showed it, he/she asked multiple times, wasting my time, and then suspended my exam. I want to file a complain...
Sub: My Databricks Data Engineer Associate exam got suspended; need immediate help please (10/09/2023). I had a pathetic experience while attempting my Databricks Data Engineer certification. The proctor abruptly asked me to show my desk; after showin...
- 3541 Views
- 1 replies
- 1 kudos
Resolved! Records are missing while filtering the dataframe in multithreading
Hi, I need to process nearly 30 files from different locations and insert records into RDS. I am using multi-threading to process these files in parallel, like below. Test data: I have a configuration like below based on column 4: If colum...
It looks like you are comparing to strings like "1", not values like 1, in your filter condition. It's hard to say for sure; some details are missing, such as the rest of the code, the DataFrame schema, and the output you are observing.
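A minimal, hypothetical sketch of the mismatch this reply points at; the column name col4 and its values are invented here, since the original code and schema were not posted. If the files are read without an explicit schema, a numeric-looking column can arrive as strings such as "01", and a plain string comparison then silently drops rows:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical data: col4 was read as a string column, so "01" is not equal to "1".
df = spark.createDataFrame([("01", "a"), ("2", "b")], ["col4", "value"])

df.filter(F.col("col4") == "1").count()            # 0 rows -- pure string comparison
df.filter(F.col("col4").cast("int") == 1).count()  # 1 row  -- explicit cast states the intent
```

Casting explicitly (or supplying a schema when reading the files) avoids depending on implicit type coercion inside the filter.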
- 10182 Views
- 1 replies
- 0 kudos
Resolved! Recover Account Owner
Need help recovering the account owner. Problem: the account owner cannot sign in with its password after SSO was configured. The account owner is a DL for team ownership, so it doesn't have an AWS account and can't be configured in an AD group since it has "+" in the em...
Resolved by temporarily disabling SSO with Active Directory, which wasn't allowing an email to be created with "+".
- 4299 Views
- 1 replies
- 0 kudos
Increase cores for Spark History Server
By default, SHS uses spark.history.fs.numReplayThreads = 25% of available cores (the number of threads the history server will use to process event logs). How can we increase the number of cores for the Spark History Server?
- 3358 Views
- 1 replies
- 0 kudos
Will the cells in the notebook keep running even if the browser is closed?
The Execute ML Model Pipeline cell has finished running; it took 2.27 days to complete. However, the code in the following cell, called Process JSON Output, will again take a very long time to run. Can I simply close the browser and shut dow...
Hi, I have just tested it internally: even if the browser is closed, the notebook keeps on running. You can start with a quick job to test it. Also, please tag @Debayan in your next response so that I will get notified.
- 6454 Views
- 0 replies
- 0 kudos
Is it good to process files in multithreading?
Hi, I need to process nearly 30 files from different locations and insert records into RDS. I am using multi-threading to process these files in parallel, like below: def process_files(file_path): <process files here> 1. Find bad records based on fie...
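Since only the outline of process_files was posted, here is a hedged sketch of the pattern under the assumption that the per-file work is mostly I/O-bound (reading files, JDBC inserts into RDS); process_file and the file paths below are placeholders, not the poster's code:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Placeholder paths standing in for the ~30 files from different locations.
file_paths = ["/mnt/location_a/file1.csv", "/mnt/location_b/file2.csv"]

def process_file(file_path):
    # 1. Read the file, 2. separate bad records, 3. insert good records into RDS.
    # Keep all state local to the function so the threads do not share mutable objects.
    return file_path, "ok"

# Driver-side threads submit Spark jobs concurrently but share one SparkSession;
# Spark's job scheduler (FIFO unless fair scheduling is enabled) decides how the
# concurrent jobs share the cluster's executors.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(process_file, p) for p in file_paths]
    for fut in as_completed(futures):
        path, status = fut.result()
        print(path, status)
```

Whether this helps depends on the workload: Spark actions are already distributed, so driver-side threading mainly pays off when each file's processing would otherwise leave the cluster idle (many small files, slow JDBC writes).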
- 2041 Views
- 1 replies
- 0 kudos
Data Insertion
Scenario: data from blob storage to a SQL DB once a week. I have 15 days of data (from the current date to the next 15 days) in blob storage, stored date-wise in Parquet format, and after seven days the next 15 days of data will be inserted. This means that until the 7th day t...
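Since the scenario is cut off, the following is only a hedged sketch of the weekly load it describes; the storage path, JDBC URL, table name, credentials, and the assumption that the date-wise folders appear as a "date" partition column are all placeholders:

```python
from datetime import date, timedelta
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical 15-day window starting today; adjust to match the actual load logic.
start = date.today()
end = start + timedelta(days=15)

# Assumes the date-wise Parquet folders are exposed as a "date" partition column.
df = (spark.read.parquet("wasbs://<container>@<account>.blob.core.windows.net/data/")
        .where(f"date >= '{start}' AND date < '{end}'"))

# Append into the SQL database over JDBC. Because the weekly 15-day windows overlap,
# a plain append will create duplicates; a MERGE/upsert or a delete-then-insert on
# the date range is usually needed instead.
(df.write.format("jdbc")
   .option("url", "jdbc:sqlserver://<server>.database.windows.net;database=<db>")
   .option("dbtable", "dbo.<target_table>")
   .option("user", "<user>")
   .option("password", "<password>")
   .mode("append")
   .save())
```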
- 3696 Views
- 1 replies
- 0 kudos
Server error: OK - Notebook
Hi, I am currently seeing some weird notebook behavior. Every time I write, I get the following error. My gut feeling is that it is caused by the auto-save feature. Cheers, Gil
- 1335 Views
- 1 replies
- 0 kudos
Databricks Data Engineer Associate exam got suspended
Hello Team, I had a pathetic experience while attempting my 1st Databricks certification. The proctor abruptly asked me to show my desk; after I showed it, he/she asked multiple times, wasting my time, and then suspended my exam. I want to file a compla...
Hi @Sivaji, sorry to hear you had a bad experience and that you got a slow response here in the community. I see that you have taken and passed the exam, congratulations! For the future, our support team handles cases from here first, so it tends to be...
- 2064 Views
- 1 replies
- 0 kudos
Is it a bug in DEEP CLONE?
Hi, I'm trying to modify a Delta table using the following approach: (1) shallow clone of the table (source_table), (2) modification of the clone (clonned_table), (3) deep clone of the modified table back to the source table. The source Delta table has 26,752 rows. Current Delt...
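For reference, a hedged reconstruction of the clone round-trip described above, using standard Delta Lake CLONE syntax; the table names are taken from the post but the statements are placeholders, not the poster's exact code. Comparing row counts on both sides of the deep clone is the quickest way to confirm whether rows really went missing:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# 1. Shallow clone: copies only metadata; data files are still shared with the source.
spark.sql("CREATE OR REPLACE TABLE clonned_table SHALLOW CLONE source_table")

# 2. ... modify the clone here (UPDATE / DELETE / MERGE INTO clonned_table) ...

# 3. Deep clone back: copies the clone's data files over the source table.
spark.sql("CREATE OR REPLACE TABLE source_table DEEP CLONE clonned_table")

# Sanity check the row counts on both sides of the copy.
spark.sql("SELECT COUNT(*) AS cnt FROM clonned_table").show()
spark.sql("SELECT COUNT(*) AS cnt FROM source_table").show()
```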