DATAECONOMY
PySpark Developer
Job Description
Join DataEconomy and be part of a dynamic team driving data-driven solutions. We are seeking highly skilled PySpark developers with 4-6 years of experience for our teams in Hyderabad and Pune.
Responsibilities:
- Design and implement robust metadata-driven data ingestion pipelines using PySpark (an illustrative sketch follows this list).
- Collaborate with technical teams to develop innovative data solutions.
- Work closely with business stakeholders to understand and translate requirements into technical specifications.
- Conduct unit testing, system testing, and support during UAT.
- Demonstrate strong analytical and problem-solving skills, as well as a commitment to excellence in software development.
Preferred qualifications:
- Experience in the financial or banking domain is a plus.
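For illustration only, the sketch below shows one minimal way a metadata-driven ingestion pipeline of the kind mentioned under Responsibilities might look in PySpark. The metadata keys, storage paths, and formats here are hypothetical placeholders, not DataEconomy's actual framework.

```python
# Minimal sketch of a metadata-driven ingestion pipeline in PySpark.
# The metadata entries (source_path, file_format, options, target_path)
# are hypothetical placeholders used only for illustration.
from pyspark.sql import SparkSession

# Each entry describes one source to ingest; in practice this metadata
# would typically live in a config table, JSON file, or catalog.
PIPELINE_METADATA = [
    {
        "source_path": "s3://raw-zone/customers/",
        "file_format": "csv",
        "options": {"header": "true", "inferSchema": "true"},
        "target_path": "s3://curated-zone/customers/",
    },
    {
        "source_path": "s3://raw-zone/transactions/",
        "file_format": "json",
        "options": {},
        "target_path": "s3://curated-zone/transactions/",
    },
]


def run_ingestion(spark, metadata):
    """Read each source described in the metadata and write it out as Parquet."""
    for entry in metadata:
        df = (
            spark.read.format(entry["file_format"])
            .options(**entry["options"])
            .load(entry["source_path"])
        )
        # Overwrite keeps the example idempotent; a real pipeline would
        # choose the write mode (append, merge, etc.) per source.
        df.write.mode("overwrite").parquet(entry["target_path"])


if __name__ == "__main__":
    spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()
    run_ingestion(spark, PIPELINE_METADATA)
    spark.stop()
```

Driving reads and writes from a metadata list in this way means onboarding a new source is a configuration change rather than a code change, which is the usual motivation for the metadata-driven approach.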