NorthBay Solutions

AWS Data Engineer – L1/L2 Support Role

20 February 2025
Apply Now
Deadline date:
£24,000 / year

Job Description

Job Title: AWS Data Engineer – L1/L2 Support Role

Job Overview  

We are looking for a Level 1/2 AWS Data Engineer to join our team. In this role, you will be responsible for monitoring, troubleshooting, and optimizing AWS-based data pipelines. You will work with AWS Glue, Lambda, Step Functions, and other AWS data services to investigate job failures, re-run ETL jobs, and ensure data integrity. You will be part of a fast-paced environment that requires analytical thinking, problem-solving skills, and an understanding of AWS data engineering best practices.  

Key Responsibilities  

– Monitor, troubleshoot, and support AWS data pipelines (Glue, Lambda, Step Functions, S3, Athena, Redshift, Airflow).  

– Investigate job failures, analyze CloudWatch logs, and take corrective actions to ensure data pipeline continuity.  

– Re-run failed jobs and retry data ingestion processes while ensuring data consistency.  

– Assist in debugging Glue ETL scripts, Athena queries, and Lambda functions used for data transformations.  

– Work closely with L2/L3 engineers and data architects to escalate and resolve complex issues.  

– Optimize data processing jobs by applying best practices in partitioning, bucketing, and indexing.  

– Maintain documentation of troubleshooting steps, common issues, and resolutions.  

– Ensure compliance with AWS security best practices (IAM roles, policies, and access controls).  

– Participate in incident response and data recovery procedures as required.  

– Provide technical support for production data pipelines while ensuring SLAs are met.  
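The day-to-day loop described above (spot a failed Glue run in CloudWatch, then re-run it) can be sketched with a small Python helper that decides which job runs to retry. This is an illustrative sketch only: the run dictionaries mimic the shape of entries returned by Boto3's Glue `get_job_runs` call (keys such as `Id`, `JobRunState`, `Attempt`), and the `max_attempts` retry budget is a hypothetical parameter — in practice you would fetch runs with `boto3.client("glue").get_job_runs(JobName=...)` and trigger retries with `start_job_run`.

```python
# Hedged sketch: decide which AWS Glue job runs to re-run.
# The dicts below imitate entries from Glue's get_job_runs response;
# real code would obtain them via boto3.client("glue") instead.

# Glue job-run states that typically warrant a retry.
RETRYABLE_STATES = {"FAILED", "TIMEOUT", "ERROR"}

def runs_to_retry(job_runs, max_attempts=2):
    """Return the Ids of runs that ended in a retryable state
    and have not yet exhausted their (hypothetical) retry budget."""
    retry_ids = []
    for run in job_runs:
        state = run.get("JobRunState")
        attempt = run.get("Attempt", 0)
        if state in RETRYABLE_STATES and attempt < max_attempts:
            retry_ids.append(run["Id"])
    return retry_ids

# Example: one success, one fresh failure, one failure out of retries.
runs = [
    {"Id": "jr_1", "JobRunState": "SUCCEEDED", "Attempt": 0},
    {"Id": "jr_2", "JobRunState": "FAILED", "Attempt": 0},
    {"Id": "jr_3", "JobRunState": "FAILED", "Attempt": 2},
]
print(runs_to_retry(runs))  # → ['jr_2']
```

Keeping the retry decision in a pure function like this makes it easy to unit-test the triage logic separately from any AWS calls.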

Qualifications  

– Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field.  

– Up to 1 year of experience in AWS-based data engineering.  

– Hands-on experience with AWS Glue, S3, Athena, Lambda, Step Functions, and CloudWatch.  

– Strong Python (Pandas, Boto3) and SQL skills for data querying and processing.  

– Familiarity with PySpark and ETL development is a plus.  

– Basic knowledge of AWS IAM, security policies, and access control.  

– Experience in monitoring and debugging AWS Glue jobs, Step Functions, and Lambda logs.

– Hands-on experience with Data Lake, Delta Lake, and Lakehouse architectures.  

– Strong analytical and problem-solving skills to diagnose and resolve data processing failures.  

– Ability to work independently while collaborating effectively with cross-functional teams.  

– AWS certification such as AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate (preferred but not mandatory).