Does the Delta file format support continuous-trigger streaming as a sink, i.e. .trigger(Trigger.Continuous("1 second"))? I can't find a document around it. In the Spark documentation, I can see that the following sinks are supported: Kafka sink: All opt...
No, this is not supported with the Delta file format.
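A minimal sketch of the supported alternative: keep the Delta sink but use a micro-batch trigger instead of the continuous trigger. The paths below are placeholders, and the "rate" source is only used to generate test rows.

```scala
import org.apache.spark.sql.streaming.Trigger

// Micro-batch trigger; Trigger.Continuous is not supported for Delta sinks.
val query = spark.readStream
  .format("rate")                                        // test source generating rows
  .load()
  .writeStream
  .format("delta")
  .option("checkpointLocation", "/tmp/chk/rate_to_delta") // placeholder checkpoint path
  .trigger(Trigger.ProcessingTime("1 second"))
  .outputMode("append")
  .start("/tmp/delta/rate_sink")                          // placeholder table path
```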
Hi Team, I have a customer (IHAC) who is running multiple streams in a jar-based job. One of the streams got terminated, but the other streams keep processing without termination. Is this known behaviour for a jar-based streaming application? Any insight, please?
Failure in any of the active streaming queries causes the active run to fail and terminate all the other streaming queries. You do not need to use streamingQuery.awaitTermination() or spark.streams.awaitAnyTermination() at the end of your notebook. ...
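The guidance above is about notebook jobs; for a jar-based application a common pattern is to block the main method on spark.streams.awaitAnyTermination() so that a failure in any query surfaces in the driver and fails the run rather than leaving the other streams running silently. A minimal sketch, with placeholder source and checkpoint paths:

```scala
import org.apache.spark.sql.SparkSession

object MultiStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("multi-stream-job").getOrCreate()

    // Two independent streaming queries; all paths are placeholders.
    val q1 = spark.readStream.format("delta").load("/data/source_a")
      .writeStream.format("delta")
      .option("checkpointLocation", "/chk/stream_a")
      .start("/data/sink_a")

    val q2 = spark.readStream.format("delta").load("/data/source_b")
      .writeStream.format("delta")
      .option("checkpointLocation", "/chk/stream_b")
      .start("/data/sink_b")

    // Blocks until any query stops; an exception from a failed query propagates
    // here and fails the whole run instead of leaving the other query orphaned.
    spark.streams.awaitAnyTermination()
  }
}
```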
Hello Team, is there any way to add new input sources to a streaming job while retaining the old checkpoint? I have seen that "Changes in the number or type (i.e. different source) of input sources" is not allowed. Want to find out if there is any...
No, you cannot do this, but often you can start a new checkpoint that picks up from the same offsets the old one left off at: https://docs.databricks.com/delta/delta-streaming.html#specify-initial-position
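A minimal sketch of that approach for a Delta source: start the restructured query with a fresh checkpoint location and use the startingVersion (or startingTimestamp) option to begin from a known point in the table history. The version number and paths below are placeholders.

```scala
// Read the Delta source starting from a chosen table version (placeholder value).
val df = spark.readStream
  .format("delta")
  .option("startingVersion", "1234")          // or .option("startingTimestamp", "2021-01-01")
  .load("/data/events")

// Write with a brand-new checkpoint location for the reconfigured query.
df.writeStream
  .format("delta")
  .option("checkpointLocation", "/chk/events_v2")
  .start("/data/events_out")
```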
Hi Team, for Auto Loader, do we pause fetching messages from SQS? Apple restarted their streaming workload and found that no new messages are fetched or deleted. Taking the difference of the latest sequence processed and the latest sequence available ...
Hi streaming team, do we have any good integration for Azure Service Bus?
Hi Team, I am trying to do a security audit, and it has become tough to manage the many credentials and IAM roles we have across different Databricks clusters. Is it possible to simplify this, so that a user who has a given type of access to an S3 bucket gets the same type of...
This is a great question, and Databricks is continuously working on security management to make the user experience better and simpler. The use case you are trying to solve can be addressed using a high-concurrency cluster and checkin...
Access control: a rich suite of access controls all the way down to the storage layer. Databricks can take advantage of its cloud backbone by utilizing state-of-the-art AWS security services right in the platform. Federate your existing AWS data access ...
Hi all, I am reading data, caching it, and then performing a count action to bring the data into memory, but in the DAG I can still see that it reads the data from the source every time.
It looks like the Spark memory is not sufficient to cache all the data, so it always reads from the source.
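One way to avoid the re-reads in that situation is to persist with a storage level that spills to disk instead of dropping partitions that don't fit in memory. A minimal sketch, with a placeholder source path:

```scala
import org.apache.spark.storage.StorageLevel

val df = spark.read.parquet("/data/big_table")        // placeholder source path

// MEMORY_AND_DISK spills partitions that do not fit in executor memory to local
// disk, so later actions hit the cache instead of re-reading the source.
val cached = df.persist(StorageLevel.MEMORY_AND_DISK)
cached.count()                                        // action that materializes the cache
```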
I am thinking of migrating from Spark 2.4 to 3.0. What changes should I know about and take care of while migrating?
There are many changes to take care of if you have code written against Spark 2.4. The changes are in the Dataset API, SQL statements, and built-in UDFs and functions. You can find more in the Spark documentation: https://spark.apache.org/docs/latest/sql-migrat...
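As one concrete example of the kind of behaviour change the migration guide covers: Spark 3.0 switched to new datetime parsing/formatting, and a legacy flag can restore the 2.4 behaviour while you update the affected code. This is only an illustrative sketch of one such setting, not a complete migration checklist.

```scala
// Restore Spark 2.4-style datetime parsing while migrating (runtime SQL conf).
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

// Under the Spark 3.0 default policy, some 2.4-era datetime patterns may fail
// or parse differently; this query is just a simple probe of the behaviour.
spark.sql("SELECT to_timestamp('2020-01-27', 'yyyy-MM-dd')").show()
```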
What type of AWS instance, and how many, are used for an L-sized Databricks SQL (SQLA) cluster with Photon enabled?
After I vacuum the tables, do I need to update the manifest and the external Parquet table to refresh my external tables for integrations to work?
Manifest files need to be re-created when partitions are added or altered. Since VACUUM only deletes historical versions of files, you shouldn't need to create an updated manifest unless you are also running an OPTIMIZE.
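A minimal sketch of regenerating the manifest after an operation that changes the table's file layout (the table path is a placeholder). Delta Lake can also keep the manifest updated automatically via a table property:

```scala
// Regenerate the symlink manifest for external readers (e.g. Presto/Athena).
spark.sql("GENERATE symlink_format_manifest FOR TABLE delta.`/data/events`")

// Optionally keep the manifest updated automatically on every write to the table.
spark.sql("""
  ALTER TABLE delta.`/data/events`
  SET TBLPROPERTIES (delta.compatibility.symlinkFormatManifest.enabled = true)
""")
```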
G1GC can solve problems in some cases where garbage collection is a bottleneck. Check out https://databricks.com/blog/2015/05/28/tuning-java-garbage-collection-for-spark-applications.html
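A sketch of how G1GC is typically enabled: the JVM options have to be in place when the driver and executor JVMs start (for example in the cluster's Spark config), not changed on a live session.

```scala
import org.apache.spark.SparkConf

// Set at application/cluster startup; has no effect if changed after the JVMs are running.
val conf = new SparkConf()
  .set("spark.executor.extraJavaOptions", "-XX:+UseG1GC")
  .set("spark.driver.extraJavaOptions", "-XX:+UseG1GC")
```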
As of this comment, SQL analytics still requires a few additional enablement steps. You will need to ask your Databricks account team to help turn this on in your workspace.
You could potentially do this through a Global Init Script - https://docs.databricks.com/clusters/init-scripts.html