- 133 Views
- 3 replies
- 0 kudos
Supporting Material for the Self-Paced "Data Analysis with Databricks" Course
Hi All, newbie here. Any idea where I can find the supporting materials that the instructor uses in the online "Data Analysis with Databricks" course? They seem to include the scripts to create the schema, tables, etc. Thanks in advance
Hello @BSalla, here is our suggested learning path. I hope it helps you!
- 93 Views
- 2 replies
- 1 kudos
Finding Materials for Databricks Course
Hi All, Where can I find the supporting materials used by the instructor in the online "Data Analysis with Databricks" course? It appears to include scripts for creating schemas, tables, and other database structures. Thanks in advance.
Hello @RoseCliver1 and @BSalla! To access the course supporting materials, please raise a support ticket here.
- 842 Views
- 2 replies
- 0 kudos
Talend ETL code to Databricks
Hi Team, What is the best way to transfer Talend ETL code to Databricks, and what are the best methods/practices for migrating Talend ETLs to Databricks (notebooks, code conversion/migration strategy, workflows, etc.)? Regards, Janga
You can have a look at BladeBridge converters; they support the Talend-to-Databricks route. www.bladebridge.com
- 61 Views
- 1 reply
- 0 kudos
Integrating SAP HANA with Databricks: Exploring Data Transformation and Optimization
Hi everyone, I'm currently working on a project that involves integrating SAP HANA with Databricks to transform and analyze data. I'm using methods like JDBC/ODBC connectors to extract data from SAP HANA into Databricks, but I've run into a few hurdle...
Hey @edisionthomas, are you connecting directly to the underlying HANA database and reading the tables? That would not be very efficient, as by default with JDBC the source data is scanned sequentially. You would need to work on ways to parallelise...
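A parallelised JDBC read can be sketched like this; the host, table, partition column, and bounds below are hypothetical placeholders, and the `spark.read` call assumes you are running on a Databricks cluster:

```python
# Sketch of a parallelised JDBC read from SAP HANA.
# All names (host, table, partition column, bounds) are hypothetical.

def hana_read_options(url, table, partition_col, lower, upper, num_partitions):
    """Build Spark JDBC options that split one sequential scan into
    num_partitions parallel range queries on partition_col."""
    return {
        "url": url,
        "dbtable": table,
        "partitionColumn": partition_col,  # must be numeric, date, or timestamp
        "lowerBound": str(lower),
        "upperBound": str(upper),
        "numPartitions": str(num_partitions),
    }

# On a Databricks cluster you would then read with something like:
# df = (spark.read.format("jdbc")
#         .options(**hana_read_options(
#             "jdbc:sap://hana-host:30015", "SALES.ORDERS", "ORDER_ID",
#             1, 10_000_000, 16))
#         .option("driver", "com.sap.db.jdbc.Driver")
#         .option("user", dbutils.secrets.get("hana", "user"))
#         .option("password", dbutils.secrets.get("hana", "password"))
#         .load())
```

Each of the 16 partitions issues its own range query, so the scan runs in parallel instead of through a single connection.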
- 441 Views
- 4 replies
- 2 kudos
Resolved! Issue with Non-Bulk Inserts Using JDBC on Databricks Runtime 14.1
Hello team, I am experiencing an issue with insert operations on Databricks using the JDBC driver. In my SAS Viya system, the DatabricksJDBC42.jar driver version 2.6.40 is configured. I've noticed that, up to Databricks Runtime version 13.1, insert op...
Were you using parameterized queries? I think native support for them starts from version 14.1. See if setting EnableNativeParameterizedQuery=0 mitigates the issue.
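If it helps, the flag can be appended to the JDBC connection string. A hypothetical example; `<server-hostname>`, `<http-path>`, and the token are placeholders:

```python
# Hypothetical Databricks JDBC connection string with native parameterized
# queries disabled. All angle-bracket values are placeholders.
jdbc_url = (
    "jdbc:databricks://<server-hostname>:443;"
    "HttpPath=<http-path>;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>;"
    "EnableNativeParameterizedQuery=0"
)
```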
- 3527 Views
- 2 replies
- 0 kudos
Databricks Notebook says "Connecting.." for some users
For some users, after clicking on a notebook the screen says "connecting..." and the notebook does not open. The users are using the Chrome browser, and the same happens with Edge as well. What could be the reason?
Same issue here; I just tried to get set up with Community Edition but no luck.
- 113 Views
- 1 reply
- 1 kudos
Resolved! read file from local machine(my computer) and create a dataframe
I want to create a notebook and add a widget that will allow the user to select a file from their local machine (my computer), read the contents of the file, and create a dataframe. Is it possible, and how? In dbutils.widgets I don't have any options for ...
Hi @khaansaab, currently there is no out-of-the-box mechanism that will allow you to do that. As a workaround, you can create a UC volume and tell your users to upload files into that volume. Then you can create a notebook that will have a file_name parame...
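A minimal sketch of that workaround; the catalog, schema, and volume names below are assumptions:

```python
# Sketch: read a user-named file from a Unity Catalog volume.
# Catalog, schema, and volume names here are hypothetical.

def volume_path(catalog, schema, volume, file_name):
    """Build the /Volumes path Databricks exposes for Unity Catalog volumes."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{file_name}"

# In the notebook:
# dbutils.widgets.text("file_name", "")
# path = volume_path("main", "default", "uploads", dbutils.widgets.get("file_name"))
# df = spark.read.option("header", True).csv(path)
```

The user uploads the file to the volume via the Catalog Explorer UI, types its name into the widget, and the notebook picks it up from there.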
- 3254 Views
- 2 replies
- 0 kudos
Resolved! Configure Databricks in VSCode through WSL
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error on Step 5 of "Step 4: Set up Databricks authentication"....
What worked for me was NOT opening the browser via the pop-up (which generated the 3-legged OAuth flow error), but clicking on the link provided by the CLI (or copy-pasting the link into the browser).
- 5093 Views
- 1 reply
- 0 kudos
SAP SuccessFactors
Hi Team, We are working on onboarding a new Data Product to the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to S3 + Bronze layer and then do the initial setup of Lakehouse + Power B...
- 7292 Views
- 3 replies
- 0 kudos
Space in Column names when writing to Hive
All, I have the following code: df_Warehouse_Utilization = ( spark.table("hive_metastore.dev_ork.bin_item_detail") .join(df_DIM_Bins,col('bin_tag')==df_DIM_Bins.BinKey,'right') .groupby(col('BinKey')) .agg(count_distinct(when(col('serial_lo...
Hi, I have faced this issue a few times. When overwriting dataframes to the Hive catalog in Databricks, it doesn't naturally allow column names to contain spaces or special characters. However, you can add an option statement to bypass that ru...
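Two common workarounds, sketched below; the regex and table name are illustrative, and the column-mapping option is the Delta feature that permits such names:

```python
import re

# Sketch: handling spaces/special characters in column names before writing
# to a Delta table. Table and column names are illustrative.

def sanitize_columns(columns):
    """Replace characters Delta rejects in column names with underscores."""
    return [re.sub(r"[ ,;{}()\n\t=]", "_", c) for c in columns]

# Option 1: rename the columns before writing
# df = df.toDF(*sanitize_columns(df.columns))

# Option 2: keep the original names by enabling Delta column mapping
# (df.write.format("delta")
#    .option("delta.columnMapping.mode", "name")
#    .option("delta.minReaderVersion", "2")
#    .option("delta.minWriterVersion", "5")
#    .saveAsTable("hive_metastore.dev_ork.warehouse_utilization"))
```

Option 1 is simpler for downstream SQL; option 2 preserves the original headers at the cost of a table-protocol upgrade.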
- 4094 Views
- 18 replies
- 4 kudos
Mounting Data IOException
Hello, I am currently taking a course from Coursera for data science using SQL. For one of our assignments we need to mount some data by running a script that has been provided to us by the class. When I run the script I receive the following error. I...
I have resolved this issue. Please see the Databricks notebook below. Thanks to Tawfeeq for the helpful method. https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/2108650195107345/4043517972039275/874892524...
- 568 Views
- 1 reply
- 0 kudos
Connect to SQL Developer using Custom JDBC
Hello, I'm trying to connect Databricks SQL to SQL Developer using a custom JDBC driver. I'm getting an error. jdbc:databricks:<server>:443;HttpPath=<HttpPath>;UID=token;PWD=<password> Regards, Naga
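For reference, the Databricks JDBC URL is normally written with `//` after the scheme, which is missing from the URL above. A sketch with placeholder values:

```python
# Hypothetical Databricks JDBC URL for a SQL client such as SQL Developer.
# <server-hostname>, <http-path>, and the token are placeholders.
url = (
    "jdbc:databricks://<server-hostname>:443;"
    "HttpPath=<http-path>;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>"
)
```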
Hi, We are trying to test whether we can connect SQL Developer to Databricks. Did it work for you? Regards, Shalini
- 199 Views
- 1 reply
- 0 kudos
Is it possible to configure Databricks to leverage an AI PC’s NVMe storage for faster I/O?
Hello Databricks Community, I'm working on improving the performance of my Databricks workflows and exploring different storage options. I have an AI PC with NVMe storage, and I want to know if it's possible to set up Databricks to use this storage fo...
This is not how you should be using Databricks. Databricks is SaaS/PaaS, and its main operating model is to operate on data stored in cloud object storage (GCS, S3, ADLS Gen2). It is optimized for cloud storage; moreover, it is designed to be very fast...
- 466 Views
- 6 replies
- 0 kudos
Unity Catalog About Metastore
Registered in 2024/10 from the AWS Marketplace. We have created a customer-managed VPC and manually created the workspace. No specific metastore settings were made when the workspace was created. In the catalog screen of the account console, unity catalog...
Hello @tyorisoo, I hope you are doing well! The metastore manages metadata, i.e., catalog information, schema information, table information, function information, access control information, etc. In the current state, the metastore configuration is not done...
- 450 Views
- 3 replies
- 0 kudos
Databricks CI/CD
Hi Team, How can we implement CI/CD in Databricks? What are the best practices and tools to consider? Regards, Janga
Thank you so much for sharing the link.