Posted on 30 Oct, 2025
Remote
Job Description | Responsibilities
- Build reusable, metadata-driven ETL pipelines using Python/PySpark (a minimal sketch follows this list).
- Integrate diverse data sources and automate ingestion and transformation processes.
- Optimize data performance in Databricks and Azure environments.
- Develop CI/CD pipelines for data solutions.
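The first responsibility above refers to metadata-driven ETL in PySpark, where ingestion is driven by a configuration list rather than per-source code. The sketch below is only illustrative: the `sources` entries, paths, and bronze-layer table names are assumptions, not details from this posting, and a Databricks/Delta environment is assumed for the write step.

```python
# Minimal sketch of a metadata-driven ingestion loop in PySpark.
# All source names, paths, and table names below are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The pipeline's behaviour comes from configuration, not hard-coded per-source logic.
sources = [
    {"name": "orders",    "format": "csv",  "path": "/mnt/raw/orders",    "options": {"header": "true"}},
    {"name": "customers", "format": "json", "path": "/mnt/raw/customers", "options": {}},
]

for src in sources:
    # Read each source according to its declared format and options.
    df = (
        spark.read.format(src["format"])
        .options(**src["options"])
        .load(src["path"])
    )
    # Land the data as a bronze-layer table; "delta" assumes Databricks/Delta Lake
    # (swap for "parquet" on plain open-source Spark).
    (
        df.write.format("delta")
        .mode("overwrite")
        .saveAsTable(f"bronze.{src['name']}")
    )
```

Adding a new source then only requires a new configuration entry, which is the reusability the posting asks for.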
Overview
- Industry - ICTE, Information, Communication, Telecom Equipment & Smart City Components Manufacturing
- Functional Area - IT Software Programming / Analysis / Quality / Testing / Training
- Job Role - Data Quality Engineer
- Employment type - Full Time - Permanent
- Work Mode - Remote
Qualifications
- Any Graduate - Any Specialization
- Any Post Graduate - Any Specialization
- Any Doctorate - Any Specialization
Job Related Keywords
Python
PySpark
Azure
Databricks
Terraform
Agile
Kafka
Cloud Data Engineering