- 1806 Views
- 1 replies
- 1 kudos
Resubscribing / Deleting and recreating an account
Hello, at some point I tested Databricks for a potential customer and, after the test, I cancelled the subscription. I read that it is not possible to resubscribe with the same e-mail address. Therefore, my idea would be to delete the account I created...
- 1 kudos
I have a similar issue. I subscribed to Databricks using my AWS account email and cancelled it later. Now I want to start using Databricks on AWS again with the same email ID on a pay-as-you-go plan, but there is no way to re-subscribe. If this can...
- 1501 Views
- 0 replies
- 0 kudos
Benchmark TPC-DS from external Parquet Hive structure in S3
Hi, I am just getting started with Databricks and would appreciate some help here. I have 10 TB of TPC-DS data in S3 in a Hive partition structure. My goal is to benchmark a Databricks cluster on this data. After setting all IAM credentials according to this https://doc...
- 1308 Views
- 0 replies
- 0 kudos
Transform a dataframe column into a concatenated string
Hello, I have a single-column dataframe and I want to transform its content into a string, e.g. a dataframe with the rows abc, def, xyz should become the string "abc, def, xyz". Thanks
- 1517 Views
- 1 replies
- 0 kudos
Azure DevOps load sequence
Hi experts, how can we set up multiple notebooks to run in sequence within a flow? For example, one pipeline containing Notebook1 (sequence 1) and Notebook2 (sequence 2), all in a single pipeline.
- 0 kudos
Not sure how to approach your challenge, but one thing you can do is use the Databricks Job Scheduler; or, if you want an external solution in Azure, you can call several notebooks from Data Factory.
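With the Job Scheduler approach, ordering is expressed through task dependencies. A sketch of a multi-task job definition in the Jobs API 2.1 shape, where notebook2 runs only after notebook1 succeeds; the notebook paths are hypothetical, and a real job would also need a cluster assignment per task (`existing_cluster_id` or `job_cluster_key`), omitted here:

```python
# Job spec in Jobs API 2.1 format; can be built in the Workflows UI or
# POSTed to /api/2.1/jobs/create.
job_spec = {
    "name": "sequential-notebooks",
    "tasks": [
        {
            "task_key": "notebook1",
            "notebook_task": {"notebook_path": "/Repos/project/Notebook1"},
        },
        {
            # depends_on is the edge that enforces the sequence.
            "task_key": "notebook2",
            "depends_on": [{"task_key": "notebook1"}],
            "notebook_task": {"notebook_path": "/Repos/project/Notebook2"},
        },
    ],
}

run_order = [t["task_key"] for t in job_spec["tasks"]]
```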
- 1094 Views
- 0 replies
- 0 kudos
DBR 14.1 PySpark join on df1["col1"] == df2["col1"] syntax fails
Hello, after upgrading my cluster from DBR 12 to 14.1 I got a MISSING_ATTRIBUTES.RESOLVED_ATTRIBUTE_APPEAR_IN_OPERATION error on some of my joins: df1.join(df2, [df1["name"] == df2["name"], df1["age"] == df2["age"]], 'left_outer'). I resolved it by...
- 853 Views
- 0 replies
- 0 kudos
Seeking Advice on Optimizing Spark Scripts for Efficient Data Processing
Hello community! I'm currently working on Spark scripts for data processing and facing some performance challenges. Any tips or suggestions on optimizing code for better efficiency? Your expertise is highly appreciated! Thanks.
- 925 Views
- 0 replies
- 0 kudos
Pushing .ipynb from Azure Databricks to GitHub converts the file to .py
I was trying to push an .ipynb file from Azure Databricks to GitHub, and it appears that the original file is converted to source code as .py. Why does Databricks do this, and how can I control which files are converted and which are not? I need to keep some files as .ipynb. Thanks...
- 1550 Views
- 0 replies
- 0 kudos
Getting the Databricks login screen as response to API call from Azure Data Factory
Hi, I'm trying to call the DLT API to kick off my Delta Live Tables flow with a Web API call block from Azure Data Factory. I have two environments: one DEV and one PROD. The DEV environment works fine, and the response gives me the update_id, but the PR...
- 1855 Views
- 0 replies
- 0 kudos
Upload file from REST API Response directly to ADLS
I have a use case where I call a REST API that returns a response file encoded as base64. Is it possible to save the response directly to ADLS without converting it to a file first?
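It should be possible to keep everything in memory: decode the base64 payload from the response and hand the raw bytes straight to the writer, with no intermediate local file. A sketch with a mocked response body; the JSON field name `file` and the ADLS write call mentioned in the comments are assumptions about your setup:

```python
import base64
import json

# Stand-in for the real REST response body (normally requests.get(...).text).
response_body = json.dumps({"file": base64.b64encode(b"hello world").decode()})

# Decode the base64 field directly into bytes -- no temp file needed.
payload = base64.b64decode(json.loads(response_body)["file"])

# On Databricks, the bytes can then go straight to ADLS, e.g. with the
# azure-storage-file-datalake SDK's DataLakeFileClient.upload_data(payload),
# which accepts in-memory content (path and client setup omitted here).
```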
- 17464 Views
- 2 replies
- 2 kudos
Resolved! Chrome browser is very slow
Hi. Recently, I've found that Databricks is really slow when editing notebooks, such as adding cells, copying and pasting text or cells, etc. It's just the past few weeks actually. I'm using Chrome version 118.0.5993.118. Everything else with Chrome ...
- 2 kudos
It seems related to the notebook length (number of cells). The notebook that was really slow had about 40-50 cells, which I've done before without issue. Anyway, after starting a new notebook in Chrome, it seems usable again. So without a specific...
- 1984 Views
- 0 replies
- 1 kudos
Not able to use the graphframes library in Databricks
I have installed the graphframes library from the Maven repository on the cluster (13.3 LTS, which includes Apache Spark 3.4.1 and Scala 2.12, Standard_DS4_v2). The library I installed is graphframes:graphframes:0.8.3-spark3.5-s_2.13. I can import the graph...
- 1317 Views
- 0 replies
- 0 kudos
Unable to run an INSERT INTO query with a 'NaN' value in the SQL Editor; getting an error
Unable to run an INSERT INTO query with a 'NaN' value in the SQL Editor. Query: Insert into ABC with values('xyz',123,Nan); Error: org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation cannot be cast to org.apache.spark.sql.execution.datasources...
- 1554 Views
- 0 replies
- 1 kudos
How to visualize result from DESCRIBE DETAIL statement
Hi all, in a Lakeview Dashboard I would like to visualize some Delta table info returned by the SQL statement 'DESCRIBE DETAIL'. In the 'Data' tab, the dataset containing that statement returned all the detail info of my Delta table, but the visualiz...
- 1306 Views
- 0 replies
- 0 kudos
Can't see system catalog from Tableau
Hi team, I'm trying to connect to the system catalog (the system.billing.usage table; Unity Catalog is enabled) in my workspace from Tableau. I'm using Tableau version 2023.1 and ODBC driver version 2.7.5.1012-osx. I was able to create a connection, but when I'm conn...
- 2188 Views
- 2 replies
- 0 kudos
Databricks Certification exam got suspended for no reason, PLEASE HELP!
Hi @Cert-Team, my test got suspended without any reason. The support person had secured the area and I had also shown my entire desk, but they still suspended my test. This is a huge loss; please help me and reschedule my exam. I have also tried to raise a t...
- 0 kudos
Hi @Cert-Team, I'm still not able to raise a ticket; after submitting the info for the ticket I'm not receiving any confirmation mail. Please help me reschedule the exam, as this suspension was done without any reason and I haven't done anything whic...