Colgate-Palmolive

Sr. Technology Associate/ Specialist – Data Engineering

28 October 2024
£52000 - £97000 / year

Job Description

Relocation Assistance Offered Within Country
# 163437 – Mumbai, Maharashtra, India

Who We Are
Colgate-Palmolive Company is a caring, innovative growth company that is reimagining a healthier future for all people, their pets and our planet. Focused on Oral Care, Personal Care, Home Care and Pet Nutrition, we sell our products in more than 200 countries and territories under brands such as Colgate, Palmolive, elmex, hello, meridol, Sorriso, Tom’s of Maine, EltaMD, Filorga, Irish Spring, PCA SKIN, Protex, Sanex, Softsoap, Speed Stick, Ajax, Axion, Fabuloso, Soupline and Suavitel, as well as Hill’s Pet Nutrition.

We are recognized for our leadership and innovation in promoting sustainability and community wellbeing, including our achievements in decreasing plastic waste and promoting recyclability, saving water, conserving natural resources and improving children’s oral health.

If you want to work for a company that lives by their values, then give your career a reason to smile and join our global team!
 

The Experience:

In today’s multifaceted technology environment, it is an exciting time to be on the Information Technology team at Colgate!

Our highly technical and innovative team is dedicated to driving growth for Colgate-Palmolive in this evolving landscape.

What role will you play as a member of Colgate’s Information Technology team?

We are seeking a skilled and experienced Data Engineer who will develop and manage data pipelines and data lakes on cloud platforms.

The individual will develop and own the solution end to end, which entails planning, design, development and maintenance of our data repositories, pipelines and analytical solutions. The best fit for this position is someone who is self-driven and looking to build high-quality tools that deliver actionable insights to our business partners and facilitate data-driven decision making. The candidate needs to be a self-starter, eager to learn new technologies, to extract insight and value from data, and to apply technology to deliver those insights through efficient data pipelines.

Additionally, knowledge of cloud environments and automation/optimisation skills would be an advantage. The candidate should be experienced with ETL pipelines, aware of code standards and optimisation techniques, and able to play a lead role in the evolution of technology deployments across our global footprint.

 

Who are you…

You are a technical expert –

  • You are thrilled at the prospect of using technology to tackle data problems

  • Build data applications on Google Cloud Platform and integrate them with different data sources and tools.

  • Design, develop, and maintain an extraordinary data warehouse and analytics architecture to meet business analysis, reporting needs, and data science initiatives

  • Work directly with supply chain business users and data scientists to assist in project analysis

  • Participate in the development and maintenance of ETL jobs and data pipelines to aggregate data from various on-premise systems, cloud platforms and external data sources

  • Design and develop data marts for consumption by analytics tools and end users

  • Develop code standards and guidelines to ensure data quality and integrity

  • Optimize and scale data warehouse and data processing infrastructure

  • Evaluate new technologies and constantly work towards continuous improvements in data engineering, our platform, and the organization

 

You connect the dots –

  • Your proficiency in building data pipelines, coupled with your expertise in understanding the data, will bridge the gap between functional requirements and technical implementation.

  • You will connect the dots by aligning ETL pipeline design with overarching architecture standards and data security objectives, ensuring that our data is accessible in the best and most secure way for every kind of insight

 

You are a collaborator –

  • You bring expertise and insight as someone who can understand a functional requirement and design the underlying technical implementation for it. You will work closely with all functional business and IT teams to help them make the best use of data

 

You are an innovator –

  • You will work on ground-breaking initiatives that push the boundaries of what’s possible with cloud platform technologies and beyond. Your innovation will drive the evolution of our data environment, from introducing innovative features to devising novel approaches to data and insights.

  • Your ability to think creatively and build use cases that are secure, scalable and low-maintenance will set you apart as a true innovator.

 

What you’ll need…(Required)

  • Bachelor’s degree required; a graduate degree or equivalent experience in Computer Science, Statistics, Informatics, Information Systems or another quantitative field would be an advantage

  • A minimum of 3 years of experience working as a Data Engineer

  • At least 1 year of experience with data warehousing and analytical tools such as Tableau

  • Demonstrated expertise in data modeling, ETL development, and data warehousing

  • Knowledge of data management fundamentals and data storage principles

  • Experience with Python or similar programming languages

  • Working SQL knowledge and experience with relational databases, including query authoring (SQL) and familiarity with a variety of databases

  • Experience in CI/CD workflow approaches

  • Knowledge of Python, Terraform, Docker, Kubernetes, Cloud Functions, Snowflake/BigQuery, and dbt

  • Experience with Git for version control

  • Strong analytical, problem solving & conceptual skills

What you’ll need…(Preferred):

 

  • Experience building and optimizing “big data” data pipelines, architectures and data sets

  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.

  • Experience with data warehousing platforms/storage platforms such as SAP BW/HANA

  • Experience with cloud data warehousing environments such as Snowflake, BigQuery and Google Cloud Storage is a plus

  • Experience with object-oriented/functional scripting languages: Python, Java, etc.

  • Experience with data pipelines and streaming frameworks such as Pub/Sub, Spark, Airflow, Kafka, etc.

  • Experience with RDBMS; NoSQL experience also encouraged

  • Ability to accept, learn, and apply new technologies and tools

  • Familiarity with agile software development methodology

  • Able to work effectively with both technical and non-technical teams

 

Our Commitment to Sustainability
With the Colgate brand in more homes than any other, we are presented with great opportunities and new challenges as we work to integrate sustainability into all aspects of our business and create positive social impact. We are determined to position ourselves for further growth as we act on our 2025 Sustainability & Social Impact Strategy.

Our Commitment to Diversity, Equity & Inclusion
Achieving our purpose starts with our people — ensuring our workforce represents the people and communities we serve — and creating an environment where our people feel they belong; where we can be our authentic selves, feel treated with respect and have the support of leadership to impact the business in a meaningful way.

Equal Opportunity Employer
Colgate is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, ethnicity, age, disability, marital status, veteran status (United States positions), or any other characteristic protected by law.

Reasonable accommodation during the application process is available for persons with disabilities. Please contact [email protected] with the subject “Accommodation Request” should you require accommodation.

 #LI-Hybrid