Kenvue

CDP Data Engineering Architect

11 October 2024
£59,000 - £109,000 / year

Job Description

CDP Data Engineering Architect-2407020918W

Description


Who we are

At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we’re the house of iconic brands – including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON’S® and BAND-AID® – that you already know and love. Science is our passion; care is our talent. Our global team is made up of 22,000 diverse and brilliant people, passionate about insights and innovation and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact the lives of millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage – and have brilliant opportunities waiting for you! Join us in shaping our future – and yours.


What you will do

The CDP Data Engineering Architect is responsible for analyzing and designing technology solutions that meet business needs.

Key Responsibilities

· Design and implement scalable, efficient data pipelines to support business needs

· Design and develop scalable data solutions and architectures

· Lead the development and implementation of data pipelines and ETL processes

· Collaborate with cross-functional teams to identify business requirements, provide technical solutions, and solve complex data-related problems

· Develop and maintain data architecture and data modeling best practices

· Provide technical leadership and guidance to junior team members

· Stay up to date with emerging trends and technologies in data engineering and solution architecture


What we are looking for

Required Qualifications

· 5–8 years of overall experience, with a minimum of 3 years in data engineering and solution architecture

· Excellent communication skills

· Bachelor’s degree or equivalent in Computer Science, Engineering, or a related field

· Proficiency in programming languages such as Python, Java, and SQL

· Experience with cloud-based data platforms such as AWS and Azure, query engines such as Hive and Presto, and real-time APIs

· Expertise in data modeling, data warehousing, and ETL processes

· Relevant Treasure Data experience, or equivalent Customer Data Platform experience (e.g. Salesforce Data Cloud, Adobe AEP, Tealium AudienceStream, Segment, ActionIQ)

· Proficient in configuring and implementing the Treasure Data platform

· Experience with configuring and using Treasure Data modules such as Integration Hub and Audience Studio

· Proficient in the use of Treasure Data Workflows for data integration and segmentation

· Marketing database/data warehouse experience preferred (e.g. Snowflake, Databricks, Redshift, BigQuery, SQL Server)

· Experience with data management, data transformation, and ETL, preferably using cloud-based tools/infrastructure


Desired Qualifications

· Strong problem-solving and analytical skills

· Excellent communication and collaboration skills

· Strong aptitude for communicating complex business and technical concepts using visualization and modeling aids; able to conceptualize and create sophisticated diagrams and documents

· Strong desire and aptitude to learn new technologies quickly and thoroughly

· Expertise in gathering and analyzing information related to data integration, subscriber management, and identity resolution

· Demonstrated ability to influence a group audience, facilitate solution design, and lead discussions such as implementation methodology, architectural roadmaps, enterprise transformation strategy, and executive-level requirements-gathering sessions

· Knowledge of Data Governance and Data Privacy concepts and regulations a plus

· Knowledge of big data technologies such as Hadoop, Spark, and Kafka


Primary Location

Asia Pacific-India-Karnataka-Bangalore

Job Function

Operations (IT)