- 448 Views
- 3 replies
- 0 kudos
Supporting Material for the Self-Paced "Data Analysis with Databricks" Course
Hi All, newbie here. Any idea where I can find the supporting materials that the instructor is using in the online "Data Analysis with Databricks" course? It seems to include the scripts to create schemas, tables, etc. Thanks in advance.
Hello @BSalla, here is our suggested learning path. I hope it helps you!
- 342 Views
- 2 replies
- 1 kudos
Finding Materials for Databricks Course
Hi All, where can I find the supporting materials used by the instructor in the online "Data Analysis with Databricks" course? It appears to include scripts for creating schemas, tables, and other database structures. Thanks in advance.
Hello @RoseCliver1 and @BSalla! To access the course supporting materials, please raise a support ticket here.
- 3766 Views
- 2 replies
- 1 kudos
Databricks Notebook says "Connecting..." for some users
For some users, after clicking on a notebook the screen says "connecting..." and the notebook does not open. The users are on the Chrome browser, and the same happens with Edge as well. What could be the reason?
Same issue here; I just tried to get set up with Community Edition, but no luck.
- 679 Views
- 1 reply
- 1 kudos
Resolved! Read a file from the local machine (my computer) and create a DataFrame
I want to create a notebook and add a widget that lets the user select a file from their local machine (my computer), read the contents of the file, and create a DataFrame. Is it possible, and how? In dbutils.widgets I don't have any options for ...
Hi @khaansaab, currently there is no out-of-the-box mechanism that will allow you to do that. As a workaround, you can create a UC volume and tell your users to upload files into that volume. Then you can create a notebook that takes a file_name parame...
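A minimal sketch of that workaround, assuming a hypothetical UC volume at /Volumes/main/default/uploads where users upload their files (the volume path, widget name, and CSV format are illustrative, not from the original reply):

```python
# Minimal sketch of the volume-based workaround. The volume path and widget
# name are assumptions for illustration.
dbutils.widgets.text("file_name", "", "File name inside the uploads volume")

file_name = dbutils.widgets.get("file_name")
volume_path = f"/Volumes/main/default/uploads/{file_name}"  # hypothetical UC volume

# Read the uploaded file into a DataFrame (CSV assumed; adjust for your format)
df = spark.read.format("csv").option("header", "true").load(volume_path)
display(df)
```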
- 3738 Views
- 2 replies
- 1 kudos
Resolved! Configure Databricks in VSCode through WSL
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error on Step 5 of "Step 4: Set up Databricks authentication"...
What worked for me was NOT opening the browser via the pop-up (which generated the 3-legged-OAuth-flow error), but clicking on the link provided by the CLI (or copy-pasting the link into the browser).
- 8539 Views
- 3 replies
- 0 kudos
Space in Column names when writing to Hive
All, I have the following code:
df_Warehouse_Utilization = (
    spark.table("hive_metastore.dev_ork.bin_item_detail")
    .join(df_DIM_Bins, col('bin_tag') == df_DIM_Bins.BinKey, 'right')
    .groupby(col('BinKey'))
    .agg(count_distinct(when(col('serial_lo...
Hi, I have faced this issue a few times. When we overwrite DataFrames to the Hive catalog in Databricks, it doesn't naturally allow column names to contain spaces or special characters. However, you can add an option statement to bypass that ru...
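The quoted reply is cut off, so here is a hedged sketch of the two usual fixes: renaming the offending columns before writing, or enabling Delta column mapping on the target table. The target table name is assumed for illustration; df_Warehouse_Utilization is the DataFrame from the question.

```python
from pyspark.sql.functions import col

# Fix 1: strip spaces from column names before writing (always safe)
clean = df_Warehouse_Utilization.select(
    [col(c).alias(c.replace(" ", "_")) for c in df_Warehouse_Utilization.columns]
)
clean.write.mode("overwrite").saveAsTable("hive_metastore.dev_ork.warehouse_utilization")

# Fix 2 (assumption: the "option" the reply refers to): enable Delta column
# mapping on the target table so names with spaces/special chars are accepted
spark.sql("""
    ALTER TABLE hive_metastore.dev_ork.warehouse_utilization SET TBLPROPERTIES (
        'delta.columnMapping.mode' = 'name',
        'delta.minReaderVersion' = '2',
        'delta.minWriterVersion' = '5'
    )
""")
```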
- 5102 Views
- 18 replies
- 4 kudos
Mounting Data IOException
Hello, I am currently taking a Coursera course on data science using SQL. For one of our assignments we need to mount some data by running a script that has been provided to us by the class. When I run the script I receive the following error. I...
I have resolved this issue. Please see the Databricks notebook below. Thanks to Tawfeeq for the helpful method. https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/2108650195107345/4043517972039275/874892524...
- 662 Views
- 1 replies
- 0 kudos
Connect to SQL Developer using Custom JDBC
Hello, I'm trying to connect Databricks SQL to SQL Developer using a custom JDBC driver. I'm getting an error with: jdbc:databricks:<server>:443;HttpPath=<HttpPath>;UID=token;PWD=<password> Regards, Naga
Hi, we are trying to test whether we can connect SQL Developer to Databricks. Did it work for you? Regards, Shalini
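While testing, one way to confirm that the host, HTTP path, and token themselves are valid is the databricks-sql-connector Python package, a different client from the JDBC driver (all parameter values below are placeholders, mirroring the ones in the JDBC URL):

```python
# pip install databricks-sql-connector
from databricks import sql

# Placeholders correspond to the <server>, <HttpPath>, and <password>
# fields of the JDBC URL in the question.
with sql.connect(
    server_hostname="<server>",
    http_path="<HttpPath>",
    access_token="<password>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchall())  # credentials work if this returns [(1,)]
```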
- 713 Views
- 6 replies
- 0 kudos
Unity Catalog About Metastore
Registered in 2024/10 from the AWS Marketplace. We have created a customer-managed VPC and manually created the workspace. No specific metastore settings were made when the workspace was created. In the catalog screen of the account console, Unity Catalog...
Hello @tyorisoo, I hope you are doing well! The metastore manages metadata: catalog information, schema information, table information, function information, access control information, etc. In the current state, the metastore configuration is not done...
- 1302 Views
- 3 replies
- 0 kudos
Databricks CI/CD
Hi Team, how can we implement CI/CD in Databricks? What are the best practices and tools to consider? Regards, Janga
Thank you so much for sharing the link.
- 1440 Views
- 2 replies
- 1 kudos
What is the quota limit for using create user token api?
Hi Community, I was going through this doc: https://docs.databricks.com/api/workspace/tokens/create and learned that there is a quota limit on how many tokens one can generate using the API POST /api/2.0/token/create; having breached the thre...
Hello @Surajv, Q1: What is the quota limit, and how do you find it? The quota limit for creating user tokens via the API (POST /api/2.0/token/create) is essential to manage token usage. Each user can have multiple personal access tokens in a Databricks wo...
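As a hedged illustration of that endpoint: the request shape follows the API doc linked in the question, while the environment variables and the error handling around a breached quota are assumptions.

```python
import os
import requests

# Placeholders: set these yourself before running
host = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.post(
    f"{host}/api/2.0/token/create",
    headers=headers,
    json={"lifetime_seconds": 3600, "comment": "short-lived automation token"},
)

if resp.ok:
    print("Created token:", resp.json()["token_info"]["token_id"])
else:
    # Assumption: a breached quota surfaces as a 4xx error; revoking unused
    # tokens via POST /api/2.0/token/delete frees up quota before retrying.
    print("Token creation failed:", resp.status_code, resp.text)
```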
- 439 Views
- 1 reply
- 0 kudos
Databricks Initial Costs AWS
I have a new Premium account. I set up a cost dashboard (see attached) after creating a new workspace using the AWS Quickstart, where I see some costs. Why do I have these if I am not using Databricks at all? How can I reduce the costs?
Are you seeing this data from the Usage tab in the Account console? Does it allow you to filter it by SKU?
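If the Usage tab is too coarse, here is a sketch of the same breakdown queried from the system billing table instead; it assumes system tables are enabled in your account, and the column names follow the documented schema of system.billing.usage.

```python
# Break the spend down by SKU and day to see which product is billing
df = spark.sql("""
    SELECT sku_name, usage_date, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    GROUP BY sku_name, usage_date
    ORDER BY usage_date DESC
""")
display(df)
```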
- 9426 Views
- 4 replies
- 2 kudos
Masking techniques for more PII columns
Hi Databricks Team, we would appreciate it if you could inform us about the situations in which Column Masking, Row-Level Filtering, and Attribute-Based Masking should be used, as well as the recommended technique for handling large data volumes cont...
Agree with @Meghla-C. The feature request is https://databricks.aha.io/ideas/ideas/DB-I-7941; if you check the status, it is in preview.
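As a rough illustration of the first two techniques, using Unity Catalog SQL (all catalog, table, function, and group names below are hypothetical): a column mask attaches a masking function to a single column, while a row filter attaches a boolean predicate to the whole table.

```python
# Column mask: users outside 'pii_readers' see a redacted SSN
spark.sql("""
    CREATE OR REPLACE FUNCTION main.default.mask_ssn(ssn STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('pii_readers') THEN ssn
                ELSE '***-**-****' END
""")
spark.sql("ALTER TABLE main.default.customers "
          "ALTER COLUMN ssn SET MASK main.default.mask_ssn")

# Row filter: members of 'us_auditors' only see US rows
spark.sql("""
    CREATE OR REPLACE FUNCTION main.default.us_rows_only(region STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('us_auditors') AND region = 'US'
""")
spark.sql("ALTER TABLE main.default.customers "
          "SET ROW FILTER main.default.us_rows_only ON (region)")
```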
- 347 Views
- 0 replies
- 0 kudos
Has Anyone Used Databricks for Financial Auditing and Compliance?
I’m exploring how Databricks can support financial auditing and compliance, especially with platforms like aauditing. Has anyone here used Databricks for data analysis or reporting in the context of auditing services? Any insights on workflows or bes...
- 3353 Views
- 4 replies
- 0 kudos
GDAL on Databricks Cluster Runtime 12.2 LTS
I need GDAL in my coursework. After reading this post, I used an init script as follows to install GDAL into Runtime 12.2 LTS:
dbutils.fs.put("/databricks/scripts/gdal_install.sh", """
#!/bin/bash
sudo add-apt-repository ppa:ubuntugis/ppa
sudo apt-get up...
Hi, in case anyone is still struggling here: I found I could not get the init-script approach to work, but if I just run a shell command to install GDAL at the start of my notebook, it works fine. Note, however, that this installs GDAL versi...
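A sketch of that notebook-level approach; the PPA mirrors the thread's init script and the package names are the stock Ubuntu ones, both of which are assumptions.

```python
# Install GDAL from the notebook instead of an init script. This re-runs on
# every cluster start and installs whatever version the PPA currently ships.
import subprocess

subprocess.check_call(["sudo", "add-apt-repository", "-y", "ppa:ubuntugis/ppa"])
subprocess.check_call(["sudo", "apt-get", "update"])
subprocess.check_call(["sudo", "apt-get", "install", "-y",
                       "gdal-bin", "libgdal-dev", "python3-gdal"])
```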