Fidelity International
Senior Data Engineer
Job Description
Application Deadline: 31 July 2025
Strategic Impact
As a Senior Data Engineer, you will directly contribute to our key organizational objectives:
● Accelerated Innovation
○ Enable rapid development and deployment of data-driven products through scalable, cloud-native architectures
○ Empower analytics and data science teams with self-service, real-time, and high-quality data access
○ Shorten time-to-insight by automating data ingestion, transformation, and delivery pipelines
● Cost Optimization
○ Reduce infrastructure costs by leveraging serverless, pay-as-you-go, and managed cloud services (e.g., AWS Glue, Databricks, Snowflake)
○ Minimize manual intervention through orchestration, monitoring, and automated recovery of data workflows
○ Optimize storage and compute usage with efficient data partitioning, compression, and lifecycle management
● Risk Mitigation
○ Improve data governance, lineage, and compliance through metadata management and automated policy enforcement
○ Increase data quality and reliability with robust validation, monitoring, and alerting frameworks
○ Enhance system resilience and scalability by adopting distributed, fault-tolerant architectures
● Business Enablement
○ Foster cross-functional collaboration by building and maintaining well-documented, discoverable data assets (e.g., data lakes, data warehouses, APIs)
○ Support advanced analytics, machine learning, and AI initiatives by ensuring timely, trusted, and accessible data
○ Drive business agility by enabling rapid experimentation and iteration on new data products and features
Key Responsibilities
• Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics
• Be accountable for technical delivery and take ownership of solutions
• Lead a team of senior and junior developers, providing mentorship and guidance
• Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress
• Drive technical innovation within the department to increase code reusability, code quality and developer productivity
• Challenge the status quo by bringing the latest data engineering practices and techniques to the team
About you
Core Technical Skills
• Expert in leveraging cloud-based data platforms (Snowflake, Databricks) to build an enterprise lakehouse.
• Advanced expertise with the AWS ecosystem, including experience with core AWS data services such as Lambda, EMR, MSK, Glue, and S3.
• Experience designing event-based or streaming data architectures using Kafka.
• Advanced expertise in Python and SQL. Java/Scala expertise is welcome, but enterprise-level experience with Python is required.
• Expert in designing, building and using CI/CD pipelines with test automation to deploy both infrastructure (Terraform) and data pipelines.
• Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements.
• Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings.
• Experience implementing change data capture (CDC) ingestion
• Experience using orchestration tools (Airflow, Control-M, etc.)
• Significant experience with software engineering practices, including GitHub, code verification and validation, and the use of AI coding assistants (copilots)
Bonus Technical Skills:
• Strong experience in containerization and deploying applications to Kubernetes
• Strong experience in API development using Python based frameworks like FastAPI
Key Soft Skills:
• Problem-Solving: Leadership experience in problem-solving and technical decision-making.
• Communication: Strong in strategic communication and stakeholder engagement.
• Project Management: Experienced in overseeing project lifecycles and working with Project Managers to manage resources.