- 2865 Views
- 3 replies
- 0 kudos
Databricks Pub-Sub Data Recon
I am trying to set up a recon activity between GCP Pub-Sub and Databricks. Is there any way to fetch the last 24 hrs record count from Pub-Sub? I tried but did not find any direct solution for it. It would be great if anyone could suggest me the way t... #pubsub, ...
- 0 kudos
Hi @Ajay-Pandey Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!
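For the Databricks side of a recon like this, a minimal sketch (assuming the Pub-Sub stream lands in a Delta table with an ingestion timestamp; the table name bronze.pubsub_events and column ingest_ts below are hypothetical) would count what arrived in the last 24 hours and compare it against the count taken on the Pub-Sub side, for example from Cloud Monitoring metrics:

```python
# Minimal sketch: count records landed from the Pub/Sub stream in the last 24 hours.
# Assumes the stream writes to a Delta table `bronze.pubsub_events` with an
# ingestion-timestamp column `ingest_ts` -- both names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

landed_last_24h = (
    spark.table("bronze.pubsub_events")
    .where(F.col("ingest_ts") >= F.expr("current_timestamp() - INTERVAL 24 HOURS"))
    .count()
)
print(f"Records landed in the last 24 hours: {landed_last_24h}")
```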
- 1412 Views
- 1 replies
- 3 kudos
Resolved! No "create catalog" option in workspace with metastore linked
I created a workspace and a metastore (following the long, tedious instructions). I assigned the workspace to the metastore. Under workspace -> Data, I can see the metastore link on the top left of the page. Through it I configured the permissions (giving me...
- 3 kudos
Answering my own question - all that is needed is to refresh the Data web page!
- 1043 Views
- 1 replies
- 0 kudos
Problem sharing a streaming table created in Delta Live Table via Delta Sharing
Hi all, I hope you can help me figure out what I am missing. I'm trying to do a simple thing: read the data from the data ingestion zone (csv files saved to an Azure Storage Account) using the Delta Live Tables pipeline and share the resulting tab...
- 0 kudos
Sorry, I think I've created the post in the wrong thread. Created the same post in the Community Cove.
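For reference, the ingestion half of a pipeline like the one described, a Delta Live Tables streaming table reading CSV files from an Azure landing zone with Auto Loader, might look roughly like the sketch below; the storage path and table name are placeholders, and whether the resulting table can be added to a Delta Sharing share depends on the workspace's Delta Sharing setup.

```python
# Minimal DLT sketch: a streaming table that ingests CSV files from the landing
# zone with Auto Loader. The abfss:// path and the table name are placeholders.
import dlt

@dlt.table(name="ingested_csv", comment="Raw CSV files from the ingestion zone")
def ingested_csv():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load("abfss://landing@<storage-account>.dfs.core.windows.net/incoming/")
    )
```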
- 2821 Views
- 2 replies
- 0 kudos
Databricks Exam got suspended
Hi, I attended the Databricks Certified Associate Developer for Apache Spark 3.0 - Scala exam on 09 July 2023 (today). At 7:35 pm I suddenly got a notice stating that, due to eye movement away from the exam, my exam got suspended. I had completed the exam and was at the...
- 0 kudos
Hi @nikhil018 Thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.
- 2945 Views
- 3 replies
- 2 kudos
Can't use Partner Connect to FiveTran
Brand new Databricks account, Workspace (Premium tier). New SQL Warehouse (Small, Pro). Brand new FiveTran trial. I can't seem to use Partner Connect to connect to FiveTran in my trial. It appeared to work at first, and after attempting to sync with Stripe...
- 2 kudos
Hi @Anonymous, so far I haven't been able to get it to work. However, I really appreciate @Prabakar's answer and your follow-up. I'm still interpreting @Prabakar's wise advice; however, I think it mostly doesn't apply to this situation, as I'm using...
- 8872 Views
- 4 replies
- 1 kudos
Service Principal to run Jobs that contain notebooks in Repos (GitHub)
Hi, I would appreciate your help in understanding how to set up Git credentials for a Service Principal running jobs that contain notebooks in Repos (GitHub), so that it will have access to these notebooks. These credentials should not have any depende...
- 1 kudos
Hi @giladba Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your...
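One possible approach, sketched below with placeholder values only, is to call the workspace Git Credentials API (POST /api/2.0/git-credentials) while authenticated as the service principal itself, for example with a token issued on its behalf, so that a GitHub machine user's PAT is attached to the service principal rather than to any individual:

```python
# Sketch: register Git credentials for the service principal via the Git
# Credentials API. Assumes a token obtained for the service principal itself;
# the host, token, username and PAT values are placeholders.
import requests

DATABRICKS_HOST = "https://<workspace-url>"
SP_TOKEN = "<token-issued-for-the-service-principal>"

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {SP_TOKEN}"},
    json={
        "git_provider": "gitHub",
        "git_username": "<github-machine-user>",
        "personal_access_token": "<github-pat>",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```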
- 6003 Views
- 3 replies
- 0 kudos
Snowflake connection to databricks error
I am trying to connect to Snowflake using Databricks but am getting the below error: net.snowflake.client.jdbc.SnowflakeSQLException: JDBC driver encountered a communication error. Message: Exception encountered for HTTP request: Connect to xxx.region.sn...
- 0 kudos
Hi @shivank25 Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...
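Since the message points at the HTTP connection to the Snowflake endpoint, it usually helps to separate connector options from network reachability. A minimal read using the Snowflake connector, with every option value a placeholder, looks roughly like this; if it still fails with a communication error, the cluster most likely cannot reach the Snowflake host (firewall or private connectivity).

```python
# Sketch: minimal Snowflake read from Databricks; all option values are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sf_options = {
    "sfUrl": "<account>.<region>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",   # prefer dbutils.secrets.get(...) in practice
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

df = (
    spark.read.format("snowflake")   # Databricks ships the Snowflake connector
    .options(**sf_options)
    .option("query", "SELECT CURRENT_VERSION()")
    .load()
)
df.show()
```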
- 9134 Views
- 7 replies
- 4 kudos
How to schedule job in workflow for every 30 days
Dear All - I want to schedule a job in Workflows to run every 30 days. When I try the below cron expression, it gives "invalid cron expression". Has anyone already implemented this? 0 0 */30 * *
- 4 kudos
@pcbzmani Could you try "0 0 0 1/30 * ? *"? Databricks uses Quartz cron syntax, and we have to provide 1/30 for the day-of-month field to achieve a 30-day schedule.
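For context, Quartz expressions include a seconds field and use "?" for day-of-week, which is why the 5-field "0 0 */30 * *" is rejected. A sketch of applying the suggested expression through the Jobs API 2.1 follows (host, token and job_id are placeholders); note that "1/30" restarts at the beginning of every month, so it fires on the 1st (and the 31st where one exists) rather than on a strict 30-day cadence.

```python
# Sketch: set a Quartz cron schedule on an existing job via the Jobs API 2.1.
# DATABRICKS_HOST, TOKEN and job_id are placeholders.
import requests

DATABRICKS_HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123,  # placeholder job id
        "new_settings": {
            "schedule": {
                "quartz_cron_expression": "0 0 0 1/30 * ?",
                "timezone_id": "UTC",
                "pause_status": "UNPAUSED",
            }
        },
    },
    timeout=30,
)
resp.raise_for_status()
```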
- 2350 Views
- 2 replies
- 0 kudos
ADF & Databricks Pipeline Fails "Library installation failed for library due to user error"
I'm working on a small POC to create a data pipeline which gets triggered from ADF while passing some parameters from ADF, but my pipeline fails with the attached error: Operation on target Compute Daily Product Revenue failed: Databricks execution fai...
- 0 kudos
Hi @qasimhassan Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...
- 686 Views
- 0 replies
- 0 kudos
Where to see who accessed a table in a TAC enabled cluster and SQL warehouse
Where can I see who accessed a table in a Table access control enabled cluster and SQL warehouse? In Azure Diagnostic settings all the Category groups are selected, but I am unable to find any information in the logs regarding the actual SQL command run against a t...
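As a possible starting point, and assuming Unity Catalog system tables are enabled in the account (coverage for legacy table-ACL clusters may differ), the audit system table can be searched for events whose request parameters reference the table; the fully qualified table name below is a placeholder.

```python
# Hypothetical sketch: search the Unity Catalog audit system table for events
# whose request parameters mention a given table. Assumes system tables are
# enabled; the fully qualified table name is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = spark.sql("""
    SELECT event_time, user_identity.email AS user_email,
           service_name, action_name, request_params
    FROM system.access.audit
    WHERE event_time >= current_timestamp() - INTERVAL 7 DAYS
      AND array_contains(map_values(request_params), 'my_catalog.my_schema.my_table')
    ORDER BY event_time DESC
""")
events.show(truncate=False)
```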
- 2451 Views
- 4 replies
- 3 kudos
Not able to reset the password
I forgot my current password. So I was trying to change the password, but the Databricks platform is not sending the password change link to my mail. I am now confused about what the next steps should be. It's a partner account which I am unable to res...
- 3 kudos
Hi @Ck29 Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your qu...
- 4293 Views
- 6 replies
- 7 kudos
Not able to run spark context by running python file in a cell using %sh
Hi everyone, I am trying to get a Spark context by running the SparkSession builder command in a .py file. When I run it directly it works, but if I go to a notebook and in a cell use the %sh command to run python "myfilename", it gives an error th...
- 7 kudos
@deficiant_codge There is a minor difference; can you try it?
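The underlying issue is that %sh starts a separate OS process on the driver, so a script launched that way is not attached to the notebook's Spark session. One common workaround, sketched below with a hypothetical file name, is to run the file inside the notebook's own Python process instead:

```python
# myfilename.py -- hypothetical module stored next to the notebook (Repos or workspace files)
from pyspark.sql import SparkSession

def main():
    # Running inside the notebook's Python process attaches to the existing
    # Spark session instead of trying to start a new one from a %sh subprocess.
    spark = SparkSession.builder.getOrCreate()
    spark.range(5).show()

if __name__ == "__main__":
    main()

# In a notebook cell, instead of `%sh python myfilename.py`:
#   import myfilename
#   myfilename.main()
```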
- 3353 Views
- 3 replies
- 8 kudos
📣 Getting started with Databricks is easier than ever 🚀
We are thrilled to introduce a dedicated "Get Started with Databricks" space on Databricks Community for new users and curious data practitioners not yet on Databricks! In this space, we are bringing together the best of resources to ensure new and c...
- 629 Views
- 0 replies
- 0 kudos
Send email Personalized from DATABRICKS
Hi, I would like to know if anyone knows how to send personalized emails from Databricks, or if there is an example that can be replicated; it would be very helpful. Greetings to everyone.
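One straightforward pattern, sketched below with placeholder SMTP settings and hypothetical recipient data, is to use Python's standard smtplib and email modules from a notebook, keeping the real credentials in a secret scope:

```python
# Sketch: send personalized emails from a notebook with Python's standard library.
# SMTP host, credentials and recipient data are placeholders; keep real
# credentials in a secret scope (dbutils.secrets.get) rather than in code.
import smtplib
from email.message import EmailMessage

recipients = [{"name": "Ana", "email": "ana@example.com"}]  # hypothetical data

with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("<smtp-user>", "<smtp-password>")
    for person in recipients:
        msg = EmailMessage()
        msg["Subject"] = f"Hello {person['name']}"
        msg["From"] = "noreply@example.com"
        msg["To"] = person["email"]
        msg.set_content(f"Hi {person['name']}, this is a personalized message from Databricks.")
        server.send_message(msg)
```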
- 670 Views
- 1 replies
- 0 kudos
Databricks Repos w/o provider
Is there any way to get the functionality of Databricks Repos without using a provider, i.e., Databricks itself would be the provider? I would like to get the benefits of Git operations in Databricks without linking my repository to an external source. I...
- 0 kudos
I'm afraid this isn't a supported feature today. Databricks Repos acts as a conduit or client between the workspace and a configured Git provider. No hosting functionality is available.