Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Rishabh-Pandey
by Esteemed Contributor
  • 12899 Views
  • 8 replies
  • 8 kudos

Resolved! Connect Databricks to Teradata

Hey, I want to know whether we can connect Databricks to the Teradata database, and if yes, what the procedure would be. Any help would be appreciated.

Latest Reply
BroData
New Contributor II
  • 8 kudos

There are two main ways to connect to Teradata from Databricks using Python. Way 1: Using Python libraries (e.g., sqlalchemy, pyjdbc, pyodbc, jaydebeapi, and so on). Pros: Provides a comprehensive solution, allowing us to: Query data, Trigger stored pro... (a minimal connection sketch follows this post)

7 More Replies
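
As an illustration of "Way 1" above, here is a minimal sketch using jaydebeapi, one of the libraries the reply lists. The driver class com.teradata.jdbc.TeraDriver and the jdbc:teradata:// URL form follow Teradata's standard JDBC conventions; the host, database, credentials, query, and jar path below are placeholders rather than values from the thread, and terajdbc4.jar is assumed to have been uploaded to DBFS.

import jaydebeapi

# Open a JDBC connection to Teradata from a Databricks notebook.
# Placeholders: host, database, user, password, query, and the jar path.
conn = jaydebeapi.connect(
    'com.teradata.jdbc.TeraDriver',                      # Teradata JDBC driver class
    'jdbc:teradata://<teradata-host>/DATABASE=demo_db',  # placeholder host and database
    ['<user>', '<password>'],                            # placeholder credentials
    '/dbfs/FileStore/jars/terajdbc4.jar',                # assumed location of the uploaded driver jar
)
try:
    cur = conn.cursor()
    cur.execute('SELECT TOP 10 * FROM demo_db.sales')    # placeholder query
    rows = cur.fetchall()
    cur.close()
finally:
    conn.close()

sqlalchemy and pyodbc follow the same query-then-fetch pattern but need their own Teradata dialect or ODBC driver installed; jaydebeapi is shown here only because it needs nothing beyond the terajdbc4.jar JDBC driver.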
OthmaneH
by New Contributor II
  • 2418 Views
  • 2 replies
  • 2 kudos

Migration from Teradata to Databricks

We are currently working on a migration from Teradata to Databricks, and I want to know how I can replace the Teradata connector in DataStage to send the data to Azure Data Lake using MFT. Thank you for your help.

Latest Reply
thelogicplus
Contributor
  • 2 kudos

@OthmaneH The Travinto Technologies tool may help you. We have used their tool to migrate more than 100 sources to Databricks, covering ETL, database, and SQL workloads.

1 More Replies
explore
by New Contributor
  • 1848 Views
  • 0 replies
  • 0 kudos

Hi, can we connect to Teradata Vantage installed in a VM via a Community Edition notebook? I am working on a POC to fetch data from Teradata Vantage (just Teradata, since it uses JDBC) and process it in a Community Edition notebook. I have downloaded the terajdbc4.jar.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def load_data(driver, jdbc_url, sql, user, password):
    # Read from Teradata through Spark's JDBC source.
    # NOTE: the options after .option('url', ...) are an assumed completion
    # (the forum preview is truncated); they wrap the SQL as a subquery and
    # pass the credentials.
    return spark.read \
        .format('jdbc') \
        .option('driver', driver) \
        .option('url', jdbc_url) \
        .option('dbtable', f'({sql}) AS src') \
        .option('user', user) \
        .option('password', password) \
        .load()

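For context, a hedged example of how load_data above might be invoked against a Teradata Vantage VM: the driver class and the jdbc:teradata:// URL form are Teradata's standard JDBC conventions, while the host, database, credentials, and query are placeholders, and terajdbc4.jar is assumed to be attached to the cluster (for example as a cluster library).

# Hypothetical invocation; replace the placeholders with the VM's host,
# database, and credentials. terajdbc4.jar must be available on the cluster.
df = load_data(
    driver='com.teradata.jdbc.TeraDriver',
    jdbc_url='jdbc:teradata://<vm-host>/DATABASE=demo_db',
    sql='SELECT * FROM demo_db.sales',
    user='<user>',
    password='<password>',
)
df.show(5)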