Posted on 30 Oct, 2025
Remote
Job Description | Responsibilities
- Develop reusable, metadata-driven ETL/ELT pipelines in Databricks using PySpark and SQL.
- Automate and optimize data platform workflows and integrations with diverse data sources and consumers.
- Maintain technical documentation and contribute to continuous architectural improvement.
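The first responsibility above names a specific pattern: metadata-driven pipelines, where each transformation step is declared as configuration rather than hard-coded. A minimal sketch of the idea in plain Python (the step names, `run_pipeline`, and the registry are illustrative, not part of any Databricks API; in practice the transforms would be PySpark DataFrame operations):

```python
# Toy metadata-driven pipeline: each step is declared as data,
# and a small runner dispatches to the matching transform.
# All names here are illustrative assumptions for this sketch.

def rename_columns(rows, mapping):
    """Rename keys in each row according to `mapping`."""
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

def filter_rows(rows, column, equals):
    """Keep only rows where `column` equals the given value."""
    return [row for row in rows if row.get(column) == equals]

# Registry mapping metadata step names to transform functions.
TRANSFORMS = {
    "rename": rename_columns,
    "filter": filter_rows,
}

def run_pipeline(rows, steps):
    """Apply each declared step in order; `steps` is pure metadata."""
    for step in steps:
        fn = TRANSFORMS[step["op"]]
        rows = fn(rows, **step["args"])
    return rows

# The same runner handles any source, driven only by configuration.
pipeline = [
    {"op": "rename", "args": {"mapping": {"cust_id": "customer_id"}}},
    {"op": "filter", "args": {"column": "region", "equals": "EU"}},
]
data = [
    {"cust_id": 1, "region": "EU"},
    {"cust_id": 2, "region": "US"},
]
result = run_pipeline(data, pipeline)
print(result)  # [{'customer_id': 1, 'region': 'EU'}]
```

Because the steps live in metadata, the same runner can be reused across sources, which is what makes such pipelines "reusable" in the sense the posting describes.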
Overview
- Industry - ICTE, Information, Communication, Telecom Equipment & Smart City Components Manufacturing
- Functional Area - IT Software Programming / Analysis / Quality / Testing / Training
- Job Role - Data Quality Engineer
- Employment type - Full Time - Permanent
- Work Mode - Remote
Qualifications
- Any Graduate - Any Specialization
- Any Post Graduate - Any Specialization
- Any Doctorate - Any Specialization
Job Related Keywords
- Databricks
- PySpark
- Terraform
- Python
- SQL
- Kafka
- Azure