- 63717 Views
- 5 replies
- 4 kudos
How to create a temporary table in Databricks
Hi Team, I have a requirement where I need to create a temporary table, not a temporary view. Can you tell me how to create a temporary table in Databricks?
- 4 kudos
I see, thanks for sharing. @abueno, could you mark the solution that worked for you as Accepted?
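For anyone landing here later, a minimal sketch of the usual workaround, since Databricks SQL has no CREATE TEMPORARY TABLE: use a session-scoped temp view, or a regular table in a scratch schema that you drop yourself (the source table and scratch schema names below are only examples).

```python
# Minimal sketch, run in a Databricks notebook where `spark` already exists.
# Databricks has no CREATE TEMPORARY TABLE; a session-scoped temp view is the
# usual substitute.
df = spark.read.table("samples.nyctaxi.trips")   # example source table

# Session-scoped: visible only to this Spark session, gone when it ends.
df.createOrReplaceTempView("trips_tmp")
spark.sql("SELECT COUNT(*) AS cnt FROM trips_tmp").show()

# If a physical table is really required, create it in a scratch schema
# (hypothetical name) and drop it explicitly when finished.
spark.sql("CREATE TABLE IF NOT EXISTS scratch.trips_tmp AS SELECT * FROM trips_tmp")
spark.sql("DROP TABLE IF EXISTS scratch.trips_tmp")
```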
- 815 Views
- 1 replies
- 0 kudos
ConcurrentAppendException
Hi Team, we're dealing with a concurrency issue when we attempt to run multiple jobs at the same time, and we're still having the same problem even after using the partitioning and liquid clustering features. Now we're making sure to have the right where con...
- 0 kudos
Option 3 is a slightly involved process but works well, assuming you will run Update, Insert, and Delete operations on the same table at the same time from multiple jobs. 1. Create a dummy table with the target table's schema plus an additional column called O...
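To make the intent concrete, here is a hedged sketch of the pattern the Delta docs recommend for ConcurrentAppendException: scope every MERGE to the partition (or clustering key) that the job owns, so concurrent jobs never touch the same files. Table, column, and partition names below are hypothetical.

```python
# Hedged sketch: each concurrent job handles one `region` partition and says so
# explicitly in the MERGE condition, keeping the jobs' write sets disjoint.
from delta.tables import DeltaTable

region = "emea"  # the partition this particular job owns (hypothetical)

target = DeltaTable.forName(spark, "main.sales.orders")            # hypothetical target table
updates = (spark.read.table("main.sales.orders_staging")           # hypothetical staging/dummy table
                .where(f"region = '{region}'"))

(target.alias("t")
    .merge(updates.alias("s"),
           f"t.region = '{region}' AND t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```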
- 614 Views
- 0 replies
- 0 kudos
How to test dashboards for free?
Does the Community Edition include the ability to create dashboards? If not, is there a free option to test dashboards?
- 946 Views
- 0 replies
- 1 kudos
FAQ for Databricks Learning Festival (Virtual): 10 July - 24 July 2024
IObit Smart Defrag Pro 9.3.0.341 Serial Key Smart defragmentation works quickly, mechanically, and silently in history and is suitable for large hard drives. It facilitates its hard work with greater success than any other product on the market, free...
- 1860 Views
- 1 replies
- 1 kudos
How to reduce scale-to-zero time in MLflow Serving
Hi, I am deploying MLflow models using Databricks serverless serving, but it seems the servers scale down to zero only after 30 minutes of inactivity. Is there any way to reduce this time? Also, is it possible to deploy multiple models under a single endpoint? I want...
- 1 kudos
Regarding your first question about reducing the scale-down time of Databricks serverless serving: currently the system is designed to scale down to zero after 30 minutes of inactivity. This is to ensure that instances are kept warm to handle any su...
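On the second question (several models behind one endpoint), a hedged sketch using the serving-endpoints REST API is below; the endpoint name, model names, and token are hypothetical, and the field names should be checked against the current API reference before relying on them. Note that `scale_to_zero_enabled` only turns the 30-minute scale-down on or off; the window itself is not configurable.

```python
# Hedged sketch: two versions of one model served behind a single endpoint
# with an 80/20 traffic split.
import requests

host = "https://<workspace-url>"           # hypothetical workspace URL
token = "<personal-access-token>"          # hypothetical token

payload = {
    "name": "churn-endpoint",              # hypothetical endpoint name
    "config": {
        "served_models": [
            {"model_name": "churn_model", "model_version": "1",
             "workload_size": "Small", "scale_to_zero_enabled": True},
            {"model_name": "churn_model", "model_version": "2",
             "workload_size": "Small", "scale_to_zero_enabled": True},
        ],
        "traffic_config": {
            "routes": [
                {"served_model_name": "churn_model-1", "traffic_percentage": 80},
                {"served_model_name": "churn_model-2", "traffic_percentage": 20},
            ]
        },
    },
}

resp = requests.post(f"{host}/api/2.0/serving-endpoints",
                     headers={"Authorization": f"Bearer {token}"},
                     json=payload)
resp.raise_for_status()
```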
- 1837 Views
- 3 replies
- 0 kudos
Resolved! Help trying to use Python in Databricks
I am watching an Introduction to Databricks on running Python scripts, and I don't see where to create a notebook in my Databricks instance to even select Python as the default language. Is it possible my level of Databricks isn't allowing me to run Pyt...
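Once a notebook has been created (via New > Notebook) with Python selected as the default language, a first cell like this minimal sketch confirms the attached cluster can run PySpark; `spark` and `display` are provided by the Databricks notebook environment.

```python
# Minimal first cell for a Python notebook attached to any running cluster.
df = spark.range(5).withColumnRenamed("id", "n")   # tiny demo DataFrame
display(df)                                        # Databricks-rendered table
```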
- 3792 Views
- 3 replies
- 0 kudos
Tableau Desktop connection error from Mac M1
Hi, I'm getting the below error while connecting to a SQL Warehouse from Tableau Desktop. I installed the latest ODBC drivers (2.7.5), but I can confirm that the driver name is different. From the error message I see libsparkodbc_sbu.dylib, but in my lap...
- 0 kudos
Have you referred to this document? https://help.tableau.com/current/pro/desktop/en-us/examples_databricks.html
- 1522 Views
- 2 replies
- 0 kudos
DQ Expectations Best Practice
Hi there, I hope this is a fairly simple and straightforward question. I'm wondering if there's a "general" consensus on where along the DLT data ingestion + transformation process data quality expectations should be applied. For example, two very si...
- 0 kudos
I'll drop my two cents here: having validations at multiple layers reduces the effort needed to find the root cause of a data incident, but it has a drawback: they are harder to maintain. Every layer has a set of rules to be enforced, and there will be asse...
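A minimal sketch of what that layering looks like in DLT, with a cheap structural check at bronze and stricter business rules at silver (the source path, table names, and column names are hypothetical):

```python
# Minimal DLT sketch: lenient expectations at bronze, strict ones at silver.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: ingest as-is, only reject rows that are unusable.")
@dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")
def events_bronze():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/main/raw/events"))          # hypothetical landing path

@dlt.table(comment="Silver: enforce business rules closer to consumers.")
@dlt.expect("recent_event", "event_ts >= '2024-01-01'")  # track violations only
@dlt.expect_or_fail("positive_amount", "amount > 0")     # stop the update on violation
def events_silver():
    return (dlt.read_stream("events_bronze")
            .withColumn("amount", F.col("amount").cast("double")))
```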
- 3037 Views
- 8 replies
- 9 kudos
Resolved! Facing StorageContext Error while trying to access DBFS
This issue has hindered my practice for the whole day. I scoured the web and couldn't find anybody who has faced this particular error. The error I am getting is: DBFS file browser StorageContext com.databricks.backend.storage.StorageContextType$DbfsR...
- 9 kudos
Yeah, I'm unable to save any file with rdd.saveAsTextFile or to upload any file using the workspace.
- 523 Views
- 0 replies
- 0 kudos
DBCU plans are costlier vs. Job Compute Premium at $0.30 per DBU. Please justify.
Please help me understand the % of savings and how Databricks calculates DBCUs. They are telling me that if I take the 12,500 DBCU plan, the discounted price will be $12,000, a 4% discount. That means if I consume 12,500 DBUs, I am paying $12,000 for this, and 4% sa...
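As a quick sanity check of the numbers in the post (the $1-per-DBCU list equivalence below is the post's own framing, not a quoted rate):

```python
# Worked check of the post's numbers: a 12,500 DBCU pre-purchase for $12,000
# versus paying list price for the same usage.
list_price_usd = 12_500   # 12,500 DBCUs at the post's implied $1/DBCU list rate
plan_price_usd = 12_000   # discounted pre-purchase plan price from the post

savings = (list_price_usd - plan_price_usd) / list_price_usd
print(f"Savings: {savings:.1%}")   # -> Savings: 4.0%
```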
- 4417 Views
- 11 replies
- 13 kudos
Resolved! Uploading local file
For the last two days I have been getting an error, "ERROR OCCURRED WHEN PROCESSING FILE: [OBJECT OBJECT]", while uploading any CSV or JSON file from my local system. It still shows and runs my previously uploaded files, but gives this error after uploading a new file.
- 2273 Views
- 3 replies
- 0 kudos
Resolved! DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?
Hi there, this is a follow-up from a discussion I started last month: Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661. Based on what was discussed, I understand that it's not possible to use "All Purpose Clust...
- 0 kudos
@ChristianRRL, regarding why DLT doesn't allow you to use all-purpose clusters: 1. The DLT runtime is derived from the shared compute DBR; it's not the same runtime and has different features than the common all-purpose runtime. A DLT pipeline is n...
- 678 Views
- 1 replies
- 0 kudos
Auto statistics cost
Hi, are there any cost implications for automatic statistics collection? Or is Databricks providing it as a feature that doesn't add cost on my cluster?
- 9380 Views
- 8 replies
- 3 kudos
What are the different ways to pull the log data from Splunk to Databricks?
Hi, I have recently started Splunk integration with Databricks. Basically, I am trying to ingest data from Splunk into Databricks. I have gone through the documentation regarding the Splunk integration. There is some basic information about the integrat...
- 3 kudos
Hi @Arch_dbxlearner, did you complete the integration with Splunk? If yes, can you please help?
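For anyone else pulling data in this direction, one option is to query Splunk's REST search export endpoint from a notebook and land the results in a Delta table. A hedged sketch follows; the host, credentials, search query, and target table are hypothetical, and the Databricks Add-on for Splunk remains the supported route for pushing queries the other way.

```python
# Hedged sketch: stream the results of a Splunk search into a Delta table.
import json
import requests
from pyspark.sql import Row

splunk_host = "https://splunk.example.com:8089"     # hypothetical management endpoint
auth = ("svc_databricks", "<password>")             # hypothetical service account

resp = requests.post(
    f"{splunk_host}/services/search/jobs/export",
    auth=auth,
    data={"search": "search index=app_logs earliest=-1h", "output_mode": "json"},
    stream=True,
)
resp.raise_for_status()

# The export endpoint streams one JSON object per line; keep only the lines
# that actually carry a result row.
rows = []
for line in resp.iter_lines():
    if not line:
        continue
    payload = json.loads(line)
    if "result" in payload:
        rows.append(Row(**payload["result"]))

df = spark.createDataFrame(rows)
df.write.mode("append").saveAsTable("main.ops.splunk_app_logs")  # hypothetical table
```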
- 539 Views
- 1 replies
- 1 kudos
"No API found for 'POST /workspace-files" error while trying to upload a JAR
Hi, I'm using CE and trying to upload a JAR library of about 45 MB into my workspace so I can use it from PySpark, but I'm getting the error "No API found for 'POST /workspace-files'". Any thoughts?