- 1717 Views
- 1 replies
- 0 kudos
Can the query history API /api/2.0/sql/history/queries return data older than 30 days?
I am using this API, but it only returns data for the last 30 days. Can this API return data that is older than 30 days?
Hi @RahulChaubey, the query history system table was announced during the Q1 roadmap webinar (see the recording at 32:25). There is a chance that it will provide data with a horizon beyond 30 days. Meanwhile, you can enable system tables - I hope some ...
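As a rough sketch of that system-table route: the table name system.query.history and its columns are assumptions based on the preview material, so verify them in your workspace before relying on this.

```python
# Hypothetical sketch: read query history from system tables instead of the
# 30-day REST API. Assumes a Databricks notebook (global `spark`) and that the
# "system" catalog with system.query.history is enabled in your workspace.
older_queries = spark.sql("""
    SELECT statement_id, executed_by, start_time, total_duration_ms
    FROM system.query.history
    WHERE start_time < current_timestamp() - INTERVAL 30 DAYS
    ORDER BY start_time DESC
""")
display(older_queries)
```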
- 2769 Views
- 2 replies
- 0 kudos
Can a Delta table be the source of streaming/Auto Loader?
Hi, since Auto Loader only accepts "append-only" data as the source, I am wondering if a Delta table can also be the source. Is VACUUM (deleting stale files) or _delta_log (which creates nested files in a different format than Parquet) going to break A...
Hi @QPeiran, Auto Loader is a feature that lets you ingest files into the data platform. Once your data is stored in a Delta table, you can rely on spark.readStream.table("<my_table_name>") to continuously read from the table. Take a look at ...
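For example, a minimal streaming read of an existing Delta table could look like the sketch below. The three-level table names and the checkpoint path are placeholders, and ignoreDeletes is only needed if the source is not strictly append-only.

```python
# Minimal sketch: use a Delta table as a streaming source and copy it to another table.
stream_df = (
    spark.readStream
    .option("ignoreDeletes", "true")   # tolerate partition deletes in the source, if any
    .table("my_catalog.my_schema.my_table_name")
)

(
    stream_df.writeStream
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/my_table_stream")
    .trigger(availableNow=True)        # process the available data, then stop
    .toTable("my_catalog.my_schema.my_table_copy")
)
```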
- 1579 Views
- 1 replies
- 0 kudos
Handling large volumes of streamed transactional data using DLT
We have a data stream from Event Hub with approximately 10 million rows per day (into one table) - these records are insert only (no updates). We are trying to find a solution to aggregate/group the data by multiple data points, and our requ...
Hi, please find below a set of resources I believe are relevant for you. Success stories: you can find success stories of companies leveraging streaming on Databricks here. Videos: Introduction to Data Streaming on the Lakehouse: Structured Stream...
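As an illustration only (the table and column names bronze_events, event_time, and region are placeholders, not from the original thread), an insert-only stream like this is typically aggregated in DLT with a watermark plus a time window, along these lines:

```python
# Hypothetical DLT sketch: windowed aggregation over an insert-only event stream.
# Runs inside a Delta Live Tables pipeline, where the `dlt` module is available.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="events_hourly_agg", comment="Hourly event counts per region")
def events_hourly_agg():
    return (
        dlt.read_stream("bronze_events")                 # upstream streaming table
        .withWatermark("event_time", "2 hours")          # bound state for late-arriving data
        .groupBy(F.window("event_time", "1 hour"), "region")
        .agg(F.count("*").alias("event_count"))
    )
```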
- 4537 Views
- 2 replies
- 0 kudos
Resolved! Rearrange tasks in Databricks workflow
Hello, is there any way to rearrange tasks in a Databricks workflow? I would like the line that joins the two marked tasks not to pass behind the other tasks. Is it possible to route this line along one side? Thanks.
Hi @chemajar, take a look at Databricks Asset Bundles. They allow you to streamline the development of complex workflows using a YAML definition. If you need to change the task dependencies, you can rearrange the flow as you need; just change the ...
- 2821 Views
- 2 replies
- 0 kudos
Do we pay just for query run duration while using Databricks serverless SQL?
When using Databricks serverless SQL to run queries, do we only pay for the compute resources during the run duration of the query?
- 1741 Views
- 1 replies
- 0 kudos
Hi, can you clarify what your aim is? Maybe there is no need to use the Databricks SDK at all?
- 3240 Views
- 3 replies
- 0 kudos
Unity Catalog view access in Azure Storage account
Hi, I have my Unity Catalog in an Azure Storage account and I am able to access table objects, but I couldn't find the views that were created on top of those tables. 1. I can access Delta tables & related views via Databricks SQL and also find the tab...
Hi, a couple of options are possible: use Databricks to do the complex SQL queries (joins, unions, etc.) and write to a staging Delta table, then use DataFlow to read from that staged table. Orchestrate all of this using ADF or even Databricks Workflo...
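A minimal sketch of the first option could be a CTAS that materializes the complex query into a staging Delta table for the downstream tool to read; all object names below are placeholders.

```python
# Rough sketch: materialize a complex join into a staging Delta table
# that ADF Data Flow (or another tool) can then read.
spark.sql("""
    CREATE OR REPLACE TABLE my_catalog.staging.orders_enriched AS
    SELECT o.order_id, o.amount, c.customer_name
    FROM my_catalog.main.orders o
    JOIN my_catalog.main.customers c ON o.customer_id = c.customer_id
""")
```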
- 2989 Views
- 2 replies
- 1 kudos
PowerBI Tips
Does anyone have any tips for using PowerBI on top of Databricks? Any best practices you know of, or roadblocks you have run into that should be avoided? Thanks.
Hey, use Partner Connect to establish a connection to PBI. Consider using Databricks SQL Serverless warehouses for the best user experience and performance (see Intelligent Workload Management, aka auto-scaling and query queuing, remote result cache, ...
- 1751 Views
- 1 replies
- 0 kudos
Connecting to Databricks SQL warehouse from .NET
Hi, how can I connect to a Databricks SQL warehouse from a .NET application? Kr
Hey, please take a look at the Statement Execution API. Best,
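For reference, a rough sketch of calling that API is shown below in Python; from .NET, the same POST to /api/2.0/sql/statements can be issued with HttpClient. The host, token, and warehouse ID are placeholders.

```python
# Rough sketch: run a SQL statement against a SQL warehouse via the
# Statement Execution API and print the result if it finishes in time.
import requests

host = "https://<workspace-host>"
headers = {"Authorization": "Bearer <personal-access-token>"}

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers=headers,
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "SELECT 1 AS demo",
        "wait_timeout": "30s",          # wait synchronously up to 30 seconds
    },
)
resp.raise_for_status()
print(resp.json().get("result"))        # present when the statement completed in time
```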
- 1961 Views
- 1 replies
- 1 kudos
Resolved! Can we get SQL Serverless warehouse monitoring data using APIs or logs?
I am looking for a possible way to get the autoscaling history data for SQL Serverless warehouses using an API or logs. I want something like what we see in the monitoring UI.
Hi Rahul, you need to perform two actions: enable the system tables schema named "compute" (see the how-to page; it's quite possible you'll find other schemas useful too), then explore the system.compute.warehouse_events table. Hope this helps. B...
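For example, a sketch along these lines should surface the scaling history once the compute schema is enabled; the column and event names used here follow the documentation at the time and may differ, so treat them as assumptions.

```python
# Hypothetical sketch: pull autoscaling events for one serverless warehouse
# from the compute system schema in a Databricks notebook.
events = spark.sql("""
    SELECT event_time, event_type, cluster_count
    FROM system.compute.warehouse_events
    WHERE warehouse_id = '<your-warehouse-id>'
      AND event_type IN ('SCALED_UP', 'SCALED_DOWN')
    ORDER BY event_time DESC
""")
display(events)
```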
- 11476 Views
- 4 replies
- 1 kudos
column "id" is of type uuid but expression is of type character varying.
Hello, I'm trying to write to an Azure PostgreSQL flexible server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...
Yes, this Stack Overflow post was my reference too, and adding the option below made the load succeed with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified") https://stackoverflow.com/questions/409739...
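Put together, the write from the original question with that workaround applied would look roughly like this; the connection options are placeholders, and only the stringtype option comes from the answer above.

```python
# Sketch: write a DataFrame with a UUID string column to Azure PostgreSQL,
# letting Postgres cast the varchar into uuid via stringtype=unspecified.
# `df` is the DataFrame from the original question.
(
    df.write.format("postgresql")
    .option("host", "<server>.postgres.database.azure.com")  # placeholder
    .option("database", "<database>")                        # placeholder
    .option("dbtable", "<schema>.<table>")                   # placeholder
    .option("user", "<user>")                                # placeholder
    .option("password", "<password>")                        # placeholder
    .option("stringtype", "unspecified")                     # workaround for the uuid column
    .mode("append")
    .save()
)
```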
- 2358 Views
- 1 replies
- 3 kudos
Regional Group Request for Istanbul
Hello, I kindly request the formation of a regional group for Istanbul/Turkey. I would appreciate your assistance in this matter. Thank you, Can
@kankotan Happy to help set it up for you. I have dropped you an email for more information!
- 1310 Views
- 0 replies
- 0 kudos
Usage of SparkMetric_CL, SparkListenerEvent_CL and SparkLoggingEvent_CL
I am wondering if I can retrieve any information from Azure Log Analytics custom tables (already set up) for Azure Databricks. I would like to retrieve information about query and data performance for a SQL Warehouse cluster. I am not sure if I can get it fro...
- 1925 Views
- 2 replies
- 0 kudos
Method public void org.apache.spark.sql.internal.CatalogImpl.clearCache() is not whitelisted on clas
I'm executing a notebook and it failed with this error. Sometimes, when I execute certain functions in Spark, they also fail with the error 'this class is not whitelisted'. Could anyone help me check on this? Thanks for your help!
Thanks for your feedback. Actually, my cluster was a shared cluster; after I changed to a single-user cluster, I was able to run that method.
- 3331 Views
- 0 replies
- 0 kudos
Dynamically change filter values in a Lakeview dashboard
Greetings everyone, we are trying to implement a series of visualizations. All these visualizations have queries assigned to them in the form of "Select * from test table where timestamp between :Days start and :Days end". There is also a filter applie...