Views: 163
Applicants: 0
Posted on 24 Jul, 2025
In Office
Job Description | Responsibilities
- Design, develop, and maintain scalable data processing applications using Apache Spark (an illustrative sketch follows this list)
- Write efficient, reusable, and well-documented Spark/Scala/Python code
- Evaluate the feasibility of data solutions from business and technical standpoints
- Tune and optimize existing Spark applications for performance
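The responsibilities above centre on building and tuning Spark batch jobs. As a purely illustrative sketch (the application name, input path, column names, and output location are assumptions, not part of this posting), a Spark/Scala job of this kind might look like:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{sum, desc}

object RevenueByCustomer {
  def main(args: Array[String]): Unit = {
    // Create or reuse the SparkSession that drives the batch job
    val spark = SparkSession.builder()
      .appName("RevenueByCustomer")
      .getOrCreate()

    // Load the raw orders data; the path and schema options are placeholders
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/raw/orders.csv")

    // Aggregate total revenue per customer and rank customers by spend
    val revenueByCustomer = orders
      .groupBy("customer_id")
      .agg(sum("amount").alias("total_revenue"))
      .orderBy(desc("total_revenue"))

    // Persist the result in a columnar format for downstream consumers
    revenueByCustomer.write
      .mode("overwrite")
      .parquet("/data/curated/revenue_by_customer")

    spark.stop()
  }
}
```

A job like this would typically be packaged as a jar and launched with spark-submit on the cluster.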
Overview
- Industry - Market Research Firms
- Functional Area - IT Software Programming / Analysis / Quality / Testing / Training
- Job Role - Database, Data Warehousing
- Employment type - Full Time - Permanent
- Work Mode - In Office
Qualifications
- Any Graduate - Any Specialization
- Any Post Graduate - Any Specialization
- Any Doctorate - Any Specialization
Job-Related Keywords
- Apache Spark
- Scala
- Python
- SQL
- Data modeling techniques
- Data warehousing
- Core Java
- Linux