Barclays
Data Engineer
Job Description
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.
Accountabilities
- Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Development of processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaboration with data scientists to build and deploy machine learning models.
Analyst Expectations
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team’s operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation’s products, services, and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation’s sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex / sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Join us as a “Data Engineer” at Barclays, where you’ll spearhead the evolution of our digital landscape, driving innovation and excellence. You’ll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
You may be assessed on the key critical skills relevant for success in the role, such as experience in data engineering, as well as job-specific skillsets.
To be successful as a “Data Engineer”, you should have experience with:
Basic/Essential Qualifications:
- Providing technical leadership in cloud environments (such as AWS and Azure) across our product stack, and contributing to open-source Big Data technologies.
- Writing data pipelines, workflows, and data structures where applicable.
- Adapting quickly to changes in requirements and being willing to work with different technologies if required.
- Owning deliverables for the Data and Analytics Platform team from a delivery perspective.
- Leading deployment and testing activities, with hands-on engineering and automation experience, including CI/CD and a DataOps mindset.
- Reviewing technical designs and providing feedback.
- Designing and building end-to-end data solutions for analytics in AWS/Azure.
- Strengthening data quality and reliability through code.
- Improving data lineage and governance by tagging through code.
- Progressing standards and best practices for the platform and operational excellence.
- Applying best practices and ways of working across our global Data Platform engineering team.
Desirable skillsets/good to have:
- Hands-on experience with scripting languages, Java, Python, and RDBMS (Oracle/MySQL/Postgres).
- Understanding of cloud concepts and hands-on experience with public cloud providers such as AWS, Azure, and GCP.
- Overall experience of 4-8 years, the majority of it in the data platform space.
- Minimum 1 year of experience managing a GCP, AWS, or Azure environment is a must.
- Minimum 1 year of experience architecting, designing, developing, and implementing cloud solutions on GCP, AWS, or Azure.
- Good technical knowledge of cloud infrastructure, including cloud security, CloudFormation templates, and cost optimisation.
- Knowledge of IT infrastructure domains such as compute and storage server platforms, server components, network devices, technologies and architectures, and IT service delivery principles and best practices.
- Familiarity with compliance and security standards across the enterprise IT landscape.
- Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube.
- Hands-on experience with the following cloud services is mandatory: AWS S3, Glue, Lake Formation, EC2, EKS, and ECS.
This role will be based out of Pune.