Expleo

Senior Data Engineer

7 March 2024
£128,000 - £190,000

Job Description

Overview

Expleo is a global engineering, technology, and consulting service provider that partners with leading organizations to guide them through their business transformation, helping them achieve operational excellence and future-proof their businesses.


Expleo benefits from more than 50 years of experience developing complex products in automotive and aerospace, optimizing manufacturing processes, and ensuring the quality of information systems. Leveraging its deep sector knowledge and wide-ranging expertise in fields including AI engineering, digitalization, automation, cybersecurity and data science, the group’s mission is to fast-track innovation through each step of the value chain.


With 17,000 employees and a worldwide presence in 30 countries, our global footprint includes excellence centers around the world. We have been in Romania since 1994 and currently count 1,400 colleagues.

Responsibilities

  • Work closely with the Data Engineering Manager to devise strategy, goals, and process improvements. 
  • Drive the implementation of cloud data warehousing best practices across the team. 
  • Develop and maintain data pipelines using Azure Data Factory. 
  • Write complex SQL stored procedures and ensure their stability, reliability, and performance in Snowflake through robust testing. 
  • Troubleshoot data-related problems and lead the team in finding and implementing solutions. 
  • Work with our Data Governance team to ensure compliance with our data governance policies and procedures.
  • Maintain data pipelines, measuring performance and ensuring good data quality by implementing adequate monitoring. 
  • Maintain required documentation to support maintenance and knowledge building.
  • Work alongside multiple stakeholders with competing priorities and ensure robust documentation management.

Qualifications

  • Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management. 
  • Strong experience with popular data warehousing and data integration technologies such as Snowflake, SQL Server, T-SQL, SSIS, or Azure Data Factory.
  • 4+ years of experience as a Data Engineer with Snowflake, Power BI, and Python or similar scripting languages.
  • Strong experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures, and integrated datasets using traditional data integration technologies.
  • Solid project management skills coordinating multiple initiatives in parallel; experience with Agile Scrum is a distinct advantage.
  • Experience in ELT/ETL, data replication/CDC, message-oriented data movement, API design and access, and emerging data ingestion and integration technologies.
  • Strong experience working with and optimizing existing ELT/ETL processes, data integration and data preparation flows, and helping move them into production.
  • Adept in Agile methodologies and capable of applying DevOps principles to data pipelines to improve the communication, integration, reuse, and automation of data flows between data managers and consumers across an organization.


Benefits

  • Meal Vouchers
  • Private medical insurance
  • Performance bonus
  • Easter and Christmas bonus
  • Employee referral bonus
  • Bookster subscription
  • Work-from-home options, depending on the project
  • Various discounts (7Card, Lensa, World Class & more)