MP Solutions Ltd.

Data Engineer, ETL/Hadoop Developer

20 May 2024
Salary: £90,000 - £155,000

Job Description

Primary responsibilities 

  • Design and development of ETL and Hadoop/Snowflake applications.
  • Undertaking end-to-end project delivery (from inception to post-implementation support), including reviewing and finalizing business requirements, creating functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
  • Providing deployment support, including late-hour and weekend work.
  • Development of new transformation processes to load data from source to target, or performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
  • Analysis of existing designs and interfaces and applying design modifications or enhancements. 
  • Coding and documenting data processing scripts and stored procedures. 
  • Providing business insights and analysis findings for ad-hoc data requests.
  • Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation.
  • Providing reporting-line transparency through periodic updates on project or task status.

Requirements

  • Bachelor's or Master's degree in Engineering, preferably Computer Science/Engineering.
  • 3+ years of experience in the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
  • Strong SQL programming and stored procedure development skills. 
  • 2+ years of experience developing in Informatica or another ETL tool.
  • 2+ years of relational database experience.
  • Strong UNIX Shell scripting experience to support data warehousing solutions. 
  • Process-oriented, with a focus on standardization, streamlining, and implementing a best-practice delivery approach.
  • Excellent problem-solving and analytical skills.
  • Excellent verbal and written communication skills. 
  • Experience in optimizing large data loads. 

 

Advantages 

 

  • Understanding/experience in Hive/Impala/Spark/Snowflake. 
  • Experience with Teradata is a big plus. 
  • Ability to architect an ETL solution and data conversion strategy.  
  • Exposure to an Agile Development environment. 
  • Knowledge of the TWS Scheduler.
  • Strong understanding of the data warehousing domain.
  • Good understanding of dimensional modelling. 
  • Good team player.

Benefits

  • The opportunity to gain experience in exciting, long-term, innovative projects.
  • Flexible working arrangements (core hours and the opportunity to work from home).
  • Work in a multinational team/environment.
  • A team of great engineers.
  • Cafeteria.