Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Spark conf for serverless jobs

seefoods
Valued Contributor

Hello guys,

I use serverless compute on Databricks on Azure, and I have built a decorator that instantiates a SparkSession. My job uses Auto Loader / Kafka with the availableNow trigger mode. Does anyone know which Spark conf is required, because I would like to add it?

Thanks

import logging

from databricks.sdk import WorkspaceClient

from pyspark.sql import SparkSession
from pyspark.sql import DataFrame
from pyspark.sql.functions import col, udf

from pydantic import ValidationError

from typing import Callable
from functools import wraps




def get_spark_session():
    """
    Return the active SparkSession, creating one if needed.

    :return: SparkSession instance
    """
    return SparkSession.builder.getOrCreate()


def provide_spark_session(function):
    """
    Decorator to inject a SparkSession into the decorated function.

    :param function: Function to decorate
    :return: Decorated function
    """

    @wraps(function)
    def wrapper(*args, **kwargs):
        return function(*args, spark=get_spark_session(), **kwargs)

    return wrapper


def get_workspace_session():
    """
    :return: WorkspaceClient instance
    """
    return WorkspaceClient()


def provide_workspace_session(function):
    """
    Decorator to provide a Databricks WorkspaceClient session
    :param function: Function to decorate
    :return: Decorated function
    """

    @wraps(function)
    def wrapper(*args, **kwargs):
        return function(*args, workspace_session=get_workspace_session(), **kwargs)

    return wrapper
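
For context, a minimal sketch of how the decorator could be used in an Auto Loader job running with the availableNow trigger (a Kafka source would be wired up similarly). The function name ingest_bronze, the paths, and the table name below are hypothetical placeholders, not from the original post:

@provide_spark_session
def ingest_bronze(spark: SparkSession):
    # Hypothetical Auto Loader source; paths and table names are placeholders.
    df = (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/events")
    )
    # availableNow processes all pending input and then stops.
    query = (
        df.writeStream
        .option("checkpointLocation", "/Volumes/main/landing/_checkpoints/events_bronze")
        .trigger(availableNow=True)
        .toTable("main.bronze.events")
    )
    query.awaitTermination()


ingest_bronze()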
0 REPLIES