by MarsSu • New Contributor II
- 9122 Views
- 3 replies
- 0 kudos
Hi, everyone. Currently I am trying to implement Spark Structured Streaming with PySpark, and I would like to merge multiple rows into a single row containing an array, then sink the result to a downstream message queue for another service to use. A related example follows: * Befor...
Latest Reply
Is there any solution to this? @MarsSu, were you able to solve it? Kindly shed some light on this if you did.
2 More Replies
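The merge the poster describes (collapsing multiple rows that share a key into one row holding an array) is what Spark's groupBy with collect_list does. As a minimal sketch, the same aggregation in plain Python looks like this; the column names (order_id, item) are hypothetical, not taken from the thread:

```python
from collections import defaultdict

# Group rows by a key and collect the remaining values into an array.
# In PySpark this would be df.groupBy("order_id")
#     .agg(F.collect_list("item")) before sinking to the message queue.
rows = [
    {"order_id": 1, "item": "apple"},
    {"order_id": 1, "item": "pear"},
    {"order_id": 2, "item": "milk"},
]

merged = defaultdict(list)
for row in rows:
    merged[row["order_id"]].append(row["item"])

# One output row per key, with the items collapsed into an array.
result = [{"order_id": k, "items": v} for k, v in merged.items()]
```

Note that in a streaming job this aggregation needs an output mode and, typically, a watermark so Spark knows when a group is complete.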
- 5625 Views
- 4 replies
- 2 kudos
object OurMainObject extends LazyLogging with IOApp {
  def run(args: List[String]): IO[ExitCode] = {
    logger.info("Started the application")
    val conf = defaultOverrides.withFallback(defaultApplication).withFallback(defaultReference)
    val...
Latest Reply
My workaround for now is to write the code as below, so the Databricks job is marked as a failure:

case Left(ex) =>
  // Chain the logging effect into the raised error with *>; as two separate
  // statements, the first IO would be discarded and the log never written.
  IO(logger.error("Glue failure", ex)) *> IO.raiseError(ex)
3 More Replies
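The idea behind the workaround above is general: log the failure, then let the error propagate so the process exits non-zero and the scheduler marks the run as failed. A minimal sketch of the same pattern in plain Python, where `task` and `flaky` are hypothetical stand-ins for the real Glue call:

```python
import logging

logger = logging.getLogger("job")

def run(task):
    """Run `task`; on failure, log and re-raise.

    Re-raising makes the interpreter exit with a non-zero status, which is
    what a job scheduler such as Databricks Jobs uses to flag the run as
    failed. Swallowing the exception after logging would report success.
    """
    try:
        return task()
    except Exception as ex:
        logger.error("Glue failure", exc_info=ex)
        raise

def flaky():
    # Hypothetical failing task used to demonstrate the behaviour.
    raise RuntimeError("boom")
```

The key design point, in either language, is that the logging effect and the raised error must be sequenced together; logging alone does not change the exit status.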
- 3348 Views
- 3 replies
- 0 kudos
import com.databricks.client.jdbc.DataSource;
import java.sql.*;

public class testDatabricks {
  public static void main(String[] args) throws SQLException {
    String dbUrl = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;";
    // Cop...
Latest Reply
Atanu • Databricks Employee
This looks like it was due to maintenance in the US regions. Are you still facing the issue, @Sriramkumar Thamizharasan? Is your workspace in eastus or eastus2?
2 More Replies
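The Java snippet in this thread follows the standard JDBC shape: build a connection from a URL, create a statement, iterate the result set, and close everything. Python's DB-API has the same flow; the sketch below uses sqlite3 purely so it is self-contained and runnable — for Databricks itself you would use the databricks-sql-connector package with your own hostname and HTTP path instead:

```python
import sqlite3

# Same connect -> execute -> fetch -> close flow as the JDBC code above.
# sqlite3 stands in for the Databricks driver only to keep this runnable.
conn = sqlite3.connect(":memory:")
try:
    cur = conn.cursor()
    cur.execute("SELECT 1")  # stand-in for a real warehouse query
    row = cur.fetchone()
finally:
    conn.close()  # always release the connection, as the JDBC code should
```

Closing the connection in a finally block (or Java's try-with-resources) matters in both languages; leaked connections are a common cause of exhausted warehouse sessions.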