- 1461 Views
- 1 replies
- 0 kudos
Function/system ID for Tableau Databricks connector
Currently, we are using a PAT token for authentication to generate Tableau reports from data in Databricks Delta tables using the Databricks Tableau connector. Would somebody know if a system/function ID can be used from Tableau to Databricks inste...
- 3211 Views
- 3 replies
- 1 kudos
Resolved! Can data be unified based on client profile (unified profile) in Databricks?
Hi All, my question is in regard to how data in Salesforce Data Cloud gets unified based on client profiles. Can a similar action be done on data in Databricks? I believe Unity Catalog just provides a unified layer for security and governance. Is there a ...
- 1 kudos
You want to identify actual persons based on one or more profiles (matched on e-mail address etc.). That is something that is not available out of the box in Databricks. The 'unified' in Databricks means you have a single platform for several data top...
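As the reply notes, profile unification is not built in, so it typically has to be written by hand. Below is a minimal, hypothetical PySpark sketch of matching profiles that share an e-mail address; the table and column names (`main.crm.profiles`, `email`, `profile_id`, and so on) are placeholders, not anything from the original thread.

```python
# A minimal, hypothetical sketch of unifying customer profiles that share an e-mail
# address. The catalog/schema/table and column names are placeholders for illustration.
from pyspark.sql import functions as F

profiles = spark.table("main.crm.profiles")  # assumed source table of raw profiles

unified = (
    profiles
    .withColumn("email_key", F.lower(F.trim(F.col("email"))))     # normalise the match key
    .groupBy("email_key")                                          # one output row per person
    .agg(
        F.collect_set("profile_id").alias("source_profile_ids"),  # keep lineage to source profiles
        F.first("first_name", ignorenulls=True).alias("first_name"),
        F.first("last_name", ignorenulls=True).alias("last_name"),
        F.max("last_updated").alias("last_updated"),
    )
)

unified.write.mode("overwrite").saveAsTable("main.crm.unified_profiles")
```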
- 2436 Views
- 0 replies
- 2 kudos
pandas .style is ugly in Databricks
Why does something like df.style.hide_index() turn out so ugly in Databricks? That command should render the DataFrame nicely as usual, simply with the index column concealed. Instead, here's an image of what happens (displayi...
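One workaround people use (a sketch, not necessarily what the poster needs) is to render the Styler's HTML explicitly with Databricks' `displayHTML` instead of relying on the default cell output. Note that `hide_index()` is deprecated in newer pandas in favour of `hide(axis="index")`; the toy DataFrame below is just for illustration.

```python
# A sketch of rendering a pandas Styler explicitly in a Databricks notebook.
# The DataFrame below is toy data; displayHTML is the Databricks notebook helper.
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

styler = df.style.hide(axis="index")   # newer pandas; older versions use df.style.hide_index()
displayHTML(styler.to_html())          # render the Styler's own HTML instead of the default output
```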
- 2425 Views
- 0 replies
- 1 kudos
Not able to move a file to a volume
I am trying to move a file from a repo's local directory to Volumes, but I am getting a directory-not-found issue. Can someone guide me? I tried using dbfs (dbfs/:Volumes/folder/, /dbfs/Volumes/folder/) and without dbfs (/Volumes/folder/). None worked. @Reti...
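A hedged sketch of one common approach: copy the file from the repo's path on the driver's local filesystem into a Unity Catalog volume with `dbutils.fs.cp`. All catalog, schema, volume, and file names below are hypothetical placeholders, and volume paths are used as plain `/Volumes/...` paths with no `dbfs:` or `/dbfs` prefix.

```python
# A hedged sketch: copy a file from a repo (on the driver's local filesystem) into a
# Unity Catalog volume. Catalog, schema, volume, and file names are hypothetical.
src = "file:/Workspace/Repos/my_user/my_repo/data/sample.csv"  # repo files need the file:/ prefix
dst = "/Volumes/my_catalog/my_schema/my_volume/sample.csv"     # volumes take plain /Volumes/... paths

dbutils.fs.cp(src, dst)                                        # dbutils is available in notebooks
display(dbutils.fs.ls("/Volumes/my_catalog/my_schema/my_volume/"))
```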
- 3332 Views
- 0 replies
- 0 kudos
Can't copy notebook cells using keyboard shortcut
I have become really frustrated because I can't copy and paste cells in a Databricks notebook. The keyboard shortcuts Command + C and Command + V don't seem to work, and I couldn't find a way to change the keyboard shortcut either. As a data scientist ...
- 5551 Views
- 5 replies
- 5 kudos
Problem sharing a streaming table created in Delta Live Table via Delta Sharing
Hi all, I hope you can help me figure out what I am missing. I'm trying to do a simple thing: read the data from the data ingestion zone (CSV files saved to an Azure Storage Account) using a Delta Live Tables pipeline and share the resulting tab...
- 5 kudos
I'm curious if Databricks plans to address this. We use Delta Live streaming tables extensively and also planned on using Delta Sharing to get our data from our production Unity Catalog (different region). Duplicating the data as a workaround is no...
- 4472 Views
- 1 replies
- 0 kudos
How to create Delta Live Tables in the Silver layer
Hi DB Experts, I have some basic questions: I am working on a Medallion Architecture (Bronze, Silver, Gold layers). In Bronze I am getting Delta files (Parquet format) with log folders, one folder per table; multiple files are ge...
- 0 kudos
Dear Kaniz, thank you for addressing the question. I am getting the following error if I follow the above: pyspark.errors.exceptions.captured.IllegalArgumentException: Reading from a Delta table is not supported with this syntax. If you would like to consume data...
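That IllegalArgumentException is typically raised when a file-format reader (for example Auto Loader / cloud_files) is pointed at a Delta table's directory; a common way to build a Silver DLT table from an already-registered Bronze Delta table is to read the table by name instead. Below is a hedged sketch; the table names, columns, and the `@dlt.table` name are placeholders, not the poster's pipeline.

```python
# A hedged sketch of a Silver-layer DLT table that reads an already-registered Bronze
# Delta table by name, rather than pointing a file reader at the Delta directory.
# Table and column names are placeholders, not the poster's pipeline.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="silver_orders", comment="Cleaned orders from the Bronze Delta table")
def silver_orders():
    return (
        spark.readStream.table("my_catalog.bronze.orders")   # stream from the Bronze table by name
        .where(F.col("order_id").isNotNull())                # basic Silver-layer cleansing
        .withColumn("ingested_at", F.current_timestamp())
    )
```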
- 10199 Views
- 7 replies
- 24 kudos
Big news: Our Community is now 100,000 members strong with over 50,000 posts🚀
Thanks to every one of you, the Databricks Community has reached an incredible milestone: 100,000 members and over 50,000 posts! Your dedication, expertise and passion have made this possible. Whether you're a seasoned data professional, a coding en...
- 6498 Views
- 1 replies
- 0 kudos
Error when Spark reads CSV from a DBFS mount: incompatible format detected
I am trying to follow along with a training course, but I am consistently running into an error loading a CSV with Spark from DBFS. Specifically, I keep getting an "Invalid format detected" error. Has anyone else encountered this and found a soluti...
- 0 kudos
Well, your error message is telling you that Spark is encountering a Delta table conflict while trying to read a CSV file. The file path dbfs:/mnt/dbacademy... points to a CSV file. This is where the fun begins: Spark detects a Delta transaction log d...
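A hedged sketch of the straightforward fix when the intent is simply to read the CSV: name the format explicitly so Spark does not try to resolve the location as a Delta table. The path is a placeholder modelled on the truncated `dbfs:/mnt/dbacademy...` path above.

```python
# A hedged sketch, assuming the goal is simply to read a CSV file from a DBFS mount.
csv_path = "dbfs:/mnt/dbacademy/some-folder/data.csv"   # hypothetical placeholder path

df = (
    spark.read.format("csv")       # name the format explicitly so Spark treats the file as CSV
    .option("header", "true")
    .option("inferSchema", "true")
    .load(csv_path)
)
display(df)
```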
- 12931 Views
- 13 replies
- 1 kudos
Resolved! Want to split JSON data into multiple rows
Hi, this is my sample JSON data, which is generated from an API response, and it all comes in a single row. I want to split this into multiple rows and store it in a DataFrame. [{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,"transac...
- 1 kudos
Yes indeed, it was a datatype issue. After changing it to LongType in the schema definition, it is working now. Thanks once again for all your inputs and time. Much appreciated!
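For readers landing on this thread, a hedged sketch of the usual pattern for this task: parse the JSON array string with `from_json` and fan it out with `explode`. The schema is trimmed to the fields visible in the post; the `amount` field and its `LongType` are assumptions based on the accepted answer about the datatype fix.

```python
# A hedged sketch of splitting a JSON array string into one row per element.
from pyspark.sql import functions as F
from pyspark.sql.types import ArrayType, LongType, StringType, StructField, StructType

raw = spark.createDataFrame(
    [('[{"transaction_id":"F6001EC5-528196D1","corrects_transaction_id":null,"amount":1200}]',)],
    ["json_payload"],
)

schema = ArrayType(StructType([
    StructField("transaction_id", StringType()),
    StructField("corrects_transaction_id", StringType()),
    StructField("amount", LongType()),               # LongType avoids the datatype issue noted above
]))

exploded = (
    raw.withColumn("records", F.from_json("json_payload", schema))
       .select(F.explode("records").alias("record"))  # one row per array element
       .select("record.*")
)
exploded.show(truncate=False)
```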
- 2185 Views
- 2 replies
- 0 kudos
Bug: SHOW TBLPROPERTIES returns "redacted" in the result set when "userid" appears in a value
When I use SHOW TBLPROPERTIES on a view/table to see the metadata, it will redact any value which has "userid" anywhere in it. And it is not just through the visual interface; when I query it through Python directly, it contains the redacted va...
- 0 kudos
I understand that yours is a view. In my case it's a table, so I could use `desc detail <schema_name>.<table_name>` to get the table properties info, which is not redacted in the `properties` column of the `desc detail` output.
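A small hedged sketch of that workaround from a notebook, for tables only; the schema and table names are placeholders.

```python
# A hedged sketch of the DESC DETAIL workaround above for a table (not a view).
detail = spark.sql("DESC DETAIL my_schema.my_table")
display(detail.select("properties"))   # per the reply, this column is not redacted
```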
- 5925 Views
- 4 replies
- 0 kudos
My exam got suspended; need help immediately (10/09/2023)
Hello Team, I had a pathetic experience while attempting my Databricks Data Engineer certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam. I want to file ...
- 0 kudos
Hello @sirishavemula20, it's general practice for a proctor to ask the test taker to pan the room (as part of security measures), and it's the responsibility of the test taker to make sure the surroundings are clear of any other objects whilst attempt...
- 6896 Views
- 8 replies
- 6 kudos
Error at model serving for quantised models using bitsandbytes library
Hello, I've been trying to serve registered MLflow models on a GPU Model Serving endpoint, which works except for models using the bitsandbytes library. The library is used to quantise LLM models to 4-bit/8-bit (e.g. Mistral-7B); however, it runs...
- 6 kudos
@phi_alpaca We solved it by providing a conda_env.yaml when we log the model; all we needed was to add cudatoolkit=11.8 to the dependencies.
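A hedged sketch of what that fix might look like when logging the model: pass a conda environment that pins `cudatoolkit=11.8`. The wrapper class, package list, versions, and artifact path are placeholders, not the poster's actual code.

```python
# A hedged sketch: pin cudatoolkit=11.8 in the conda environment passed to MLflow
# when logging the model. Names, packages, and versions below are placeholders.
import mlflow


class QuantisedLLMWrapper(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input):
        return model_input  # placeholder predict; the real model would run the quantised LLM


conda_env = {
    "name": "gpu-serving-env",
    "channels": ["conda-forge"],
    "dependencies": [
        "python=3.10",
        "cudatoolkit=11.8",  # the dependency the answer says was missing
        {"pip": ["mlflow", "torch", "transformers", "bitsandbytes", "accelerate"]},
    ],
}

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="quantised_llm",
        python_model=QuantisedLLMWrapper(),
        conda_env=conda_env,
    )
```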
- 2934 Views
- 2 replies
- 0 kudos
Databricks job cost (AWS)
Hi Databricks Community, I am looking for a formula/way to calculate the estimated cost of a job run, for which I have a few questions: 1. Is there any formula to calculate the cost of a job, like [(EC2 per-hour cost) * (total time the job ran)]? And when...
- 0 kudos
This looks a little bit confusing to me; I'm looking for a more straightforward answer, more like a simple formula. Thanks though for your reply.
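A hedged back-of-the-envelope sketch of the kind of formula being asked for: the total is roughly the EC2 charge (paid to AWS) plus the DBU charge (paid to Databricks), both scaled by node count and runtime. Every rate below is a placeholder; actual EC2 prices and DBU rates depend on instance type, region, and the Jobs Compute tier.

```python
# A hedged back-of-the-envelope sketch of job cost = EC2 cost + DBU cost.
# All rates are hypothetical placeholders.
runtime_hours = 2.5            # wall-clock time of the job run
worker_count = 4               # workers in the job cluster (plus one driver)
ec2_price_per_hour = 0.384     # hypothetical on-demand price per instance, paid to AWS
dbu_per_node_hour = 0.75       # hypothetical DBU consumption per node per hour
dbu_price = 0.15               # hypothetical $/DBU for Jobs Compute

nodes = worker_count + 1
ec2_cost = nodes * ec2_price_per_hour * runtime_hours
dbu_cost = nodes * dbu_per_node_hour * runtime_hours * dbu_price

print(f"EC2 cost: ${ec2_cost:.2f}, DBU cost: ${dbu_cost:.2f}, total: ${ec2_cost + dbu_cost:.2f}")
```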
- 1641 Views
- 0 replies
- 0 kudos
sparklyr::spark_read_csv forbidden 403 error
Hi, I am trying to read a CSV file into a Spark DataFrame using sparklyr::spark_read_csv. I am receiving a 403 access denied error. I have stored my AWS credentials as environment variables, and can successfully read the file as an R dataframe using ar...