Publicis Groupe

Senior Data Engineer

25 March 2024
£122000 - £194000 / year

Company Description

When you’re one of us, you get to run with the best. For decades, we’ve been helping marketers from the world’s top brands personalize experiences for millions of people with our cutting-edge technology, solutions and services. Epsilon’s best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents on proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page.

Job Description

About the BU

Digital wizards and experience creators, our DX team crafts compelling customer journeys across the web. They bring Epsilon teams and technologies together to create immersive experiences that help brands stand out. By leveraging the power of our platforms and cutting-edge digital and marketing cloud tools, the team drives greater engagement for our global clients. Fueled by provocative, new thinking, this talented group of individuals reimagines digital experiences, one customer at a time.

Why we are looking for you

This position sits in the Engineering team within the Digital Integration Services organization. We drive the first mile of the customer experience through personalization of offers and content. We are currently on the lookout for a smart, highly driven software engineer.

What you will enjoy in this role

You will be part of a team focused on building solutions and pipelines using the latest software engineering design principles and tech stacks. You will also be expected to identify, design, and implement improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating continuous integration and deployment pipelines.

The incumbent is also expected to partner with various stakeholders and bring scientific rigor to designing and developing high-quality software.

He/she must also have excellent verbal and written communication skills and be comfortable working in an entrepreneurial, ‘startup’ environment within a larger company.

What you will do

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (see the sketch after this list).
  • Develop end-to-end (Data/Dev) pipelines based on an in-depth understanding of cloud platforms and business problems, to ensure solutions are delivered efficiently and sustainably.
  • Collaborate with other members of the team to ensure high-quality deliverables.
  • Learn and implement the latest design patterns in software engineering.
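
For illustration only, here is a minimal PySpark ETL sketch of the kind of pipeline described above. The S3 paths and column names (event_ts, user_id, event_date) are hypothetical placeholders, not details of the role.

```python
# Minimal PySpark ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("offer-events-etl").getOrCreate()

# Extract: read raw JSON events from an S3 landing zone
raw = spark.read.json("s3://example-landing-zone/offer-events/2024/03/")

# Transform: parse timestamps, drop rows missing key fields, derive a partition column
events = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropna(subset=["event_ts", "user_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet to a curated zone for downstream consumers
(events.write
       .mode("overwrite")
       .partitionBy("event_date")
       .parquet("s3://example-curated-zone/offer-events/"))
```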

 

Qualifications

Data Management 

  • Experience with both structured and unstructured data
  • Experience building data and CI/CD pipelines
  • Experience with AdTech or MarTech technologies is an added advantage
  • Experience with relational and non-relational databases and SQL (NoSQL is a plus)
  • Experience with cloud technologies (AWS or Azure)
  • Hands-on experience building ETL workflows/pipelines on large volumes of data
  • Good understanding of Data Modeling, Data Warehouse, and Data Catalog concepts and tools
  • Experience with Data Lake architectures and with combining structured and unstructured data into unified representations
  • Able to identify, join, explore, and examine data from multiple disparate sources and formats
  • Ability to reduce large quantities of unstructured or formless data into a form in which it can be analyzed
  • Ability to deal with data imperfections such as missing values, outliers, and inconsistent formatting (see the sketch after this list)
  • Ability to manipulate large datasets (millions of rows, thousands of variables)
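
As a rough illustration of the data-imperfection handling mentioned above, here is a minimal PySpark cleaning sketch. The dataset path and the country, revenue, and user_id columns are hypothetical, chosen only to show the pattern.

```python
# Minimal data-cleaning sketch; the dataset, path, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-cleaning-sketch").getOrCreate()
df = spark.read.parquet("s3://example-curated-zone/transactions/")

# Inconsistent formatting: trim whitespace and normalize casing on a text column
df = df.withColumn("country", F.upper(F.trim(F.col("country"))))

# Missing values: fill numeric gaps and drop rows missing the join key
df = df.fillna({"revenue": 0.0}).dropna(subset=["user_id"])

# Outliers: cap revenue at an approximate 99th percentile
p99 = df.approxQuantile("revenue", [0.99], 0.01)[0]
df = df.withColumn(
    "revenue", F.when(F.col("revenue") > p99, p99).otherwise(F.col("revenue"))
)
```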

Software Development 

  • Ability to write code in programming languages such as Python and PySpark, and shell scripts on Linux
  • Familiarity with software development methodologies such as Agile/Scrum
  • A love of learning new technologies, keeping abreast of the latest cloud technologies, and driving your organization to adopt emerging best practices

Architecture and Infrastructure

  • Architectural design experience on AWS
  • Experience delivering software with AWS EC2, S3, EMR/Glue, Lambda, Data Pipeline, CloudFormation, Redshift, etc.
  • Good knowledge of working in UNIX/Linux systems
  • Bachelor’s degree in Computer Science with 5+ years of similar experience
  • Tech stack: Python, PySpark, microservices, Docker, serverless frameworks, and Databricks
  • Familiarity with automated unit/integration test frameworks
  • Familiarity with Airflow and MLflow (a minimal Airflow sketch follows this list)
  • Good written and spoken communication skills; a team player
  • Strong analytical thought process and the ability to interpret findings
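
For orientation, here is a minimal Airflow DAG sketch showing how a daily pipeline like the one above might be scheduled. It assumes Airflow 2.x; the DAG id and the run_etl callable are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch; the DAG id and run_etl callable are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_etl(**context):
    # Placeholder: in practice this would launch the PySpark/Glue job
    print(f"Running ETL for {context['ds']}")


with DAG(
    dag_id="daily_offer_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```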

In addition, the candidate should have strong business acumen and interpersonal and communication skills, yet also be able to work independently. He/she should be able to communicate findings and explain how techniques work in a manner that all stakeholders, both technical and non-technical, can understand.