- 6561 Views
- 4 replies
- 0 kudos
Use Python notebook to read data from Databricks
I'm very new to Databricks. I hope this is the right place to ask this question. I want to use PySpark in a notebook to read data from a Databricks database with the code below. databricks_host = "adb-xxxx.azuredatabricks.net" http_path = "/sql/1.0/w...
I would try changing the query to something like the following; it should return the column names in the table, so you can see whether the JDBC call is actually returning the data correctly: SELECT * FROM wphub_poc.gold.v_d_building LIMIT 10
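For anyone hitting the same wall, here is a minimal sketch of reading a Databricks table over JDBC from PySpark. The host, HTTP path, and token are hypothetical placeholders; the URL shape follows the Databricks JDBC driver convention (AuthMech=3 means personal access token), and the actual Spark read is shown as a comment because it needs a live cluster:

```python
# Hedged sketch: building a Databricks JDBC URL for a PySpark read.
# Host, http_path, and token below are hypothetical placeholders.
databricks_host = "adb-xxxx.azuredatabricks.net"
http_path = "/sql/1.0/warehouses/abc123"
token = "dapiXXXX"

# AuthMech=3 selects personal-access-token auth (UID is the literal "token").
jdbc_url = (
    f"jdbc:databricks://{databricks_host}:443;"
    f"httpPath={http_path};AuthMech=3;UID=token;PWD={token}"
)

# On a cluster with the Databricks JDBC driver attached you would then run:
# df = (spark.read.format("jdbc")
#       .option("url", jdbc_url)
#       .option("query", "SELECT * FROM wphub_poc.gold.v_d_building LIMIT 10")
#       .load())
# df.show()
print(jdbc_url)
```

If the `LIMIT 10` query above comes back empty or errors out, the problem is usually the URL or credentials rather than the table itself.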
- 6918 Views
- 8 replies
- 22 kudos
Turn Off Auto-reveal of Navigation Sidebar
I work with the navigation sidebar closed and use the stacked hamburger symbol in the upper left to reveal it when I want. Now, if you mouse over the left edge of the browser window too slowly, it auto-reveals the navigation sidebar. I do not wan...
I've checked with the team, and there's no way to turn this off. However, they are making adjustments to improve the experience, and a fix to refine the sidebar functionality is on the way.
- 1092 Views
- 1 replies
- 0 kudos
New Group in Denver
Can we create a new Group in Denver?
- 4083 Views
- 3 replies
- 0 kudos
Virtual Learning Festival Enrollment
Hi everyone, I tried to enroll in the Virtual Learning Festival: 9 April - 30 April, but upon clicking the Customers & Prospects link for LEARNING PATHWAY 1: ASSOCIATE DATA ENGINEERING I got an error (refer to the attached image). Thank you in advance for the hel...
Hi @Advika, refer to the attached image. I thought it was attached to my question. I'm really new here.
- 814 Views
- 1 replies
- 0 kudos
Expectation in DLT using multiple columns
Is it possible to define an expectation in a DLT pipeline using multiple columns? For example, my source has two fields: Division and Material_Number. For division 20, the material number starts with 5; for division 30, the material number starts with 9. Can we have this ...
Hi @Master_DataBric, yes, it's possible. Here are the doc links: - https://docs.databricks.com/aws/en/dlt/expectations?language=Python - https://docs.databricks.com/aws/en/dlt/expectations?language=SQL
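To make the "yes" concrete: a DLT expectation takes a SQL boolean expression, and that expression is free to reference as many columns as you like. Below is a minimal sketch for the example in the question (the table and rule names are hypothetical, and the decorator usage is shown as comments since `dlt` only runs inside a pipeline); the plain-Python function just mirrors the SQL condition so the logic is visible:

```python
# Hedged sketch: a DLT expectation is a SQL boolean expression, so a single
# rule can span multiple columns. This encodes: division 20 -> material
# number starts with 5; division 30 -> starts with 9.
condition = (
    "(Division = 20 AND Material_Number LIKE '5%') OR "
    "(Division = 30 AND Material_Number LIKE '9%')"
)

# Inside a DLT pipeline you would attach it like (names hypothetical):
# import dlt
# @dlt.table
# @dlt.expect_or_drop("valid_division_material", condition)
# def silver_materials():
#     return spark.read.table("bronze_materials")

def passes(row):
    """Plain-Python equivalent of the SQL condition above, for illustration."""
    d, m = row["Division"], row["Material_Number"]
    return (d == 20 and m.startswith("5")) or (d == 30 and m.startswith("9"))

print(passes({"Division": 20, "Material_Number": "5001"}))  # True
print(passes({"Division": 30, "Material_Number": "8002"}))  # False
```

Use `@dlt.expect` to only record violations, `@dlt.expect_or_drop` to drop offending rows, or `@dlt.expect_or_fail` to stop the update.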
- 10919 Views
- 2 replies
- 1 kudos
POC Comparison: Databricks vs AWS EMR
Hello, I need some assistance with a comparison between Databricks and AWS EMR. We've been evaluating the Databricks Data Intelligence Platform for a client and found it to be significantly more expensive than AWS EMR. I understand the challenge in ma...
Databricks is highly optimized for Delta, which leverages columnar storage, indexing, and caching for better performance. Instead of directly processing CSV files, convert them to Delta first, then perform the aggregations and joins, and see if that helps.
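The suggested workflow might look like the sketch below. Paths and column names are hypothetical, and the Spark calls are shown as comments because they assume a live `spark` session; the point is the shape: a one-time CSV-to-Delta conversion, then all heavy work against the Delta copy:

```python
# Hedged sketch of the "convert CSV to Delta first" workflow.
# Paths and the "region" column are hypothetical placeholders.
csv_path = "/mnt/raw/events/"
delta_path = "/mnt/bronze/events_delta"

# 1) One-time conversion: CSV -> Delta
# (spark.read.option("header", True).csv(csv_path)
#       .write.format("delta").mode("overwrite").save(delta_path))

# 2) Run the joins/aggregations against the Delta copy, which benefits
#    from columnar storage, statistics-based data skipping, and caching:
# spark.read.format("delta").load(delta_path).groupBy("region").count().show()
print(delta_path)
```

In a like-for-like POC, benchmarking both platforms against Delta (or at least Parquet) rather than raw CSV usually changes the cost picture considerably.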
- 972 Views
- 1 replies
- 1 kudos
Is it possible to concatenate two notebooks?
I don't think it's possible, but I thought I would check. I need to combine notebooks. While developing I might have code in various notebooks that I read in with "%run". Then, when all looks good, I combine many cells into fewer notebooks. Is there any...
Hi @397973, combining multiple notebooks into a single notebook isn't an out-of-the-box feature, but you can try chaining %run commands and checking the output to see if it works, sort of like: %run "/path/to/notebook1" followed by %run "/path/to/notebook2"
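For the one-time "merge everything at the end" step, a hedged alternative to %run is to export the notebooks as .py source files (via the Workspace export API or the UI) and concatenate them. The paths below are temporary stand-ins for exported notebooks:

```python
# Hedged sketch: merge notebooks that have been exported as .py source files.
# In Databricks you would first export them (Workspace API or File > Export);
# the files created here are just stand-ins for those exports.
import os
import tempfile
from pathlib import Path

def concat_notebooks(paths, out_path):
    """Join exported notebook sources into one file, separated by blank lines."""
    merged = "\n\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(merged)
    return merged

tmp = tempfile.mkdtemp()
a, b = os.path.join(tmp, "nb1.py"), os.path.join(tmp, "nb2.py")
Path(a).write_text("x = 1")
Path(b).write_text("y = 2")
merged = concat_notebooks([a, b], os.path.join(tmp, "combined.py"))
print(merged)
```

The combined .py file can then be re-imported into the workspace as a single notebook.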
- 1634 Views
- 2 replies
- 1 kudos
Databricks Lakehouse Monitoring
Hi, I am trying to implement Lakehouse Monitoring using an Inference profile for my inference data. I see that when I create the monitor, two tables get generated, profile and drift, and I wanted to understand how these two tables are generated a...
When you create a Databricks Lakehouse Monitoring monitor with an Inference profile, the system automatically generates two metric tables: a profile metrics table and a drift metrics table. Here's how this process works. Background processing: when yo...
- 2529 Views
- 2 replies
- 0 kudos
Liquid Clustering Key Change Question
If I already have cluster key1 for an existing table and I change the cluster key to key2 using ALTER TABLE table CLUSTER BY (key2), then run OPTIMIZE table, then based on the Databricks documentation, existing files will not be rewritten (verified by my test as w...
@ShivangiB You're correct in your understanding. When you change a clustering key using ALTER TABLE followed by OPTIMIZE, it doesn't automatically recluster existing data. Let me explain why this happens and what options you have. In Delta Lake (which...
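The usual fix, per the Databricks docs on liquid clustering, is that a plain OPTIMIZE only clusters newly written data after a key change; to force existing files to be rewritten under the new key you run OPTIMIZE with the FULL keyword (available on recent runtimes). A minimal sketch, with a hypothetical table name and the execution call hedged as a comment:

```python
# Hedged sketch: re-clustering an existing liquid-clustered table after
# changing its clustering key. Table name "my_table" is hypothetical.
statements = [
    "ALTER TABLE my_table CLUSTER BY (key2)",
    # Plain "OPTIMIZE my_table" would only cluster newly written data;
    # FULL forces a rewrite of all existing records under the new key.
    "OPTIMIZE my_table FULL",
]
# On a cluster you would run: for s in statements: spark.sql(s)
print(statements[1])
```

Note that the FULL rewrite touches every file, so expect it to cost roughly as much as rewriting the whole table once.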
- 908 Views
- 1 replies
- 0 kudos
Unable to Access S3 from Serverless but Works on Cluster
Hi everyone, I am trying to access data from S3 using an access key and secret. When I run the code through Databricks clusters, it works fine. However, when I try to do the same from a serverless cluster, I am unable to access the data. I have alread...
Hello @HarryRichard08! It looks like this post duplicates the one you recently posted. A response has already been provided to the Original post. I recommend continuing the discussion in that thread to keep the conversation focused and organized.
- 4630 Views
- 4 replies
- 0 kudos
Resolved! Exam suspended due to sudden power cut
Hi @Cert-Team, I hope this message finds you well. I am writing to request a review of my recently suspended exam. I believe that my situation warrants reconsideration, and I would like to provide some context for your understanding. I applied for Da...
- 2834 Views
- 2 replies
- 0 kudos
Workspace Assignment Issue via REST API
I'm relying on workspace assignment via the REST API to have the account user created in the workspace. This is like the workspace assignment screen at the account level, or the add existing user screen at the workspace level. The reference URL is below. Workspace ...
It turns out the problem is the documentation. It says that the permission parameter that's supplied is an array of strings. It really just expects a string: either UNKNOWN, USER, or ADMIN. It would be great if the team could fix the documentat...
- 1295 Views
- 3 replies
- 0 kudos
Require Information on SQL Analytics DBU Cluster
Hello Team, we are seeking cost information, as we have noticed fluctuations in the daily costs for the "SQL Analytics DBU." We would like to understand the reasons behind the daily cost differences, even though the workload remains consistent. Trying to...
Hi @gauravmahajan, most of the cost / DBU usage can be retrieved from the system tables across the different workspaces in a Databricks account. Details related to job compute types and their associated cost can be fetched from the queries mentioned in the...
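As a starting point, a query along these lines against the documented system.billing.usage table can break DBU consumption down by day and SKU (the SQL-SKU filter is a rough illustration; exact SKU names and any dollar-cost join against pricing tables vary by account):

```python
# Hedged example: daily DBU usage per SKU from the system billing table.
# The ILIKE filter for SQL-warehouse SKUs is an approximation; adjust to the
# sku_name values that actually appear in your account.
query = """
SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE sku_name ILIKE '%SQL%'
GROUP BY usage_date, sku_name
ORDER BY usage_date
"""
# On a workspace with system tables enabled: spark.sql(query).show()
print(query.strip().splitlines()[0])
```

Comparing the per-day DBU totals against your query history usually explains the fluctuations (auto-stop behaviour, auto-scaling, or ad-hoc queries outside the steady workload).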
- 1236 Views
- 2 replies
- 0 kudos
Is there a way to install hail on cluster?
Hi all! I've been trying to install hail (https://hail.is/) on Databricks with no luck so far. Is there an easy way to make it work? So far I could not get further than (providing a sparkContext like `hl.init(sc=spark.sparkContext)` also did not help): import ...
You can run "%pip install hail" in a notebook cell.
- 5744 Views
- 10 replies
- 19 kudos
Resolved! Databricks Demos
I'm looking to build or select a demo in Databricks. Has anyone found any of the particular Databricks demos to deliver a "wow" factor? I am new to Databricks and I'm looking to use one of the staple demos if possible. All the best, BS
> Has anyone found any of the particular Databricks demos to deliver a "wow" factor. Yes, in fact over the last two sprints I did POCs starting with Databricks' AI demos. First, who is your audience: business users, or other technology people? They'll b...