- 2134 Views
- 2 replies
- 2 kudos
We are currently working on a migration from Teradata to Databricks, and I want to know how I can replace the Teradata connector in DataStage to send the data to Azure Data Lake using MFT. Thank you for your help.
Latest Reply
@OthmaneH, the Travinto Technologies tool may help you. We have used their tool to migrate more than 100 sources to Databricks, covering ETL, database, and SQL workloads.
1 More Reply
- 11738 Views
- 7 replies
- 8 kudos
Hey, I want to know whether we can connect Databricks to a Teradata database, and if so, what the procedure would be. Help would be appreciated.
Latest Reply
Use the JDBC driver; see https://docs.databricks.com/integrations/jdbc-odbc-bi.html
6 More Replies
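As a sketch of that procedure: a minimal JDBC read from Teradata into a Spark DataFrame, assuming the Teradata JDBC driver jar is attached to the cluster. The host, database, and credential names are placeholders, not details from this thread.

```python
# Sketch of reading a Teradata table over JDBC from Databricks.
# Assumes the Teradata JDBC driver jar is installed on the cluster;
# host, database, table, and credentials are placeholders.

def teradata_jdbc_url(host, database):
    # Teradata JDBC URLs take the form jdbc:teradata://<host>/DATABASE=<db>
    return f"jdbc:teradata://{host}/DATABASE={database}"

def read_teradata_table(spark, host, database, table, user, password):
    # `spark` is the active SparkSession (predefined in Databricks notebooks).
    return (
        spark.read.format("jdbc")
        .option("driver", "com.teradata.jdbc.TeraDriver")
        .option("url", teradata_jdbc_url(host, database))
        .option("dbtable", table)
        .option("user", user)
        .option("password", password)
        .load()
    )
```

Calling `read_teradata_table(spark, "td-host", "sales", "orders", user, password)` would return a DataFrame backed by the Teradata table.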
- 1656 Views
- 0 replies
- 0 kudos
```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def load_data(driver, jdbc_url, sql, user, password):
    return spark.read \
        .format('jdbc') \
        .option('driver', driver) \
        .option('url', jdbc_url) \
        .option('dbt...
```
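The snippet above is cut off by the forum preview. A hedged completion: the option names after `url` (`dbtable`, `user`, `password`) are the standard Spark JDBC options and are assumed here, not recovered from the original post.

```python
# Hedged completion of the truncated load_data() snippet above.
# The remaining option names are the standard Spark JDBC options (assumed).

def query_as_table(sql):
    # Spark's 'dbtable' option accepts a parenthesized subquery with an
    # alias, which lets load_data() run an arbitrary SQL statement.
    return f"({sql}) AS src"

def load_data(driver, jdbc_url, sql, user, password):
    # On Databricks, `spark` is predefined; the import is only needed
    # when running standalone.
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()
    return (
        spark.read.format("jdbc")
        .option("driver", driver)
        .option("url", jdbc_url)
        .option("dbtable", query_as_table(sql))
        .option("user", user)
        .option("password", password)
        .load()
    )
```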