DATAECONOMY

PySpark Developer

11 October 2024
£28,000 - £67,000 / year

Job Description

Join DataEconomy and be part of a dynamic team driving data-driven solutions. We’re seeking highly skilled PySpark developers with 4-6 years of experience to join our team in Hyderabad or Pune.

Responsibilities:

  • Design and implement robust metadata-driven data ingestion pipelines using PySpark (an illustrative sketch follows this list).
  • Collaborate with technical teams to develop innovative data solutions.
  • Work closely with business stakeholders to understand and translate requirements into technical specifications.
  • Conduct unit and system testing, and provide support during UAT.
  • Demonstrate strong analytical and problem-solving skills and a commitment to excellence in software development.
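
For candidates unfamiliar with the term, the sketch below shows one common shape of a metadata-driven ingestion pipeline in PySpark, where a small metadata list drives which sources are read and where they land. All paths, formats, and table names are illustrative assumptions, not details of this role or any DataEconomy project.

    # Minimal, illustrative sketch of metadata-driven ingestion in PySpark.
    # Paths, formats, and table names are assumptions for illustration only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()

    # Hypothetical metadata: each entry describes one source and its target.
    ingestion_metadata = [
        {"path": "s3://raw/customers/", "format": "csv", "target": "staging.customers"},
        {"path": "s3://raw/transactions/", "format": "parquet", "target": "staging.transactions"},
    ]

    for entry in ingestion_metadata:
        # Read each source using the options recorded in the metadata.
        df = (
            spark.read.format(entry["format"])
            .option("header", "true")   # relevant for CSV sources; ignored by Parquet
            .load(entry["path"])
        )
        # Write to the configured target table; onboarding a new source means
        # adding a metadata entry rather than writing new pipeline code.
        df.write.mode("overwrite").saveAsTable(entry["target"])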

Requirements

  • 4-6 years of experience in IT, with a minimum of 3 years of hands-on experience in Python and PySpark.
  • Solid understanding of data warehousing concepts and ETL processes.
  • Proficiency in Linux and Java is a plus.
  • Experience in the financial or banking domain is a plus.
  • Experience with code versioning tools such as Git and AWS CodeCommit, and with CI/CD pipelines (e.g., AWS CodePipeline).
  • Proven ability to build metadata-driven frameworks for data ingestion.
  • Familiarity with common design and architectural patterns.

Benefits

  • Opportunities for professional growth and development.
  • Be part of a dynamic and collaborative team.