dentsu international
Lead Data Engineer
Company Description
We Dream. We Do. We Deliver.
About Merkle
Merkle, a dentsu company, powers the experience economy. For more than 35 years, the company has put people at the heart of its approach to digital business transformation. As the only integrated experience consultancy in the world with a heritage in data science and business performance, Merkle delivers holistic, end-to-end experiences that drive growth, engagement, and loyalty. Merkle’s expertise has earned recognition as a “Leader” by top industry analyst firms, in categories such as digital transformation and commerce, experience design, engineering and technology integration, digital marketing, data science, CRM and loyalty, and customer data management. With more than 16,000 employees, Merkle operates in 30+ countries throughout the Americas, EMEA, and APAC. For more information, visit www.merkle.com.
We are looking for a savvy Lead Data Engineer to join our team of data heroes. You will be responsible for building big data architectures and pipelines for data lakehouses in the cloud, as well as optimizing and productionizing machine learning and predictive models. The ideal candidate is an experienced engineer and data wrangler who enjoys building complex platforms from the ground up using the latest cloud technologies. You will cooperate with data architects and data scientists on large data projects for the biggest international brands, and build an internal platform framework to ensure consistent and optimal delivery. You should be a versatile self-starter eager to roll out next-gen data architectures, comfortable supporting multiple technologies, teams, solutions, and clients, and a great team player able to work within our international team with a positive, startup-minded attitude.
Job Description
- Design and implement data ingestion and processing from diverse sources, leveraging ‘big data’ technologies and related AWS services: Spark, Glue, Airflow, Kafka, Data Pipeline, NoSQL databases, SageMaker, and ML Studio.
- Develop and maintain data tools for data science and analyst teams, supporting AI/ML model building, optimization, and productionization.
- Assemble large, complex data sets for a data lakehouse, emphasizing advanced data modeling techniques, including star schemas.
- Create data pipelines that deliver actionable insights in areas such as marketing automation, customer acquisition, and other key business domains.
- Guide and mentor fellow data engineers, taking the lead in assigning data engineering tasks and ensuring their successful completion within a diverse, collaborative product team.
- Collaborate with stakeholders to address data-related technical issues, support data infrastructure needs, and optimize data delivery and scalability within the AWS environment.
- Support pre-sales activities by proposing technical solutions, providing accurate effort estimates, and showcasing expertise in AWS cloud technologies.
Qualifications
- Proven experience building and productionizing big data architectures, pipelines, and datasets within AWS.
- Solid understanding of AWS ‘big data’ concepts and patterns, including data lakes, lambda architecture, stream processing, DWH, BI, and reporting.
- Deep hands-on development experience in the AWS environment, with a focus on services such as Glue, EC2, EMR, RDS, Redshift, and SageMaker.
- 4+ years of experience in a Data Engineer role, demonstrating expertise in big data tools (Spark, Kafka), object-oriented/scripting languages (Python, Scala, Java, R, C++), and AWS cloud services.
- Experience implementing large-scale, data-oriented pipelines/workflows using ETL tools, along with extensive experience with relational (MS SQL, PostgreSQL, Redshift) and NoSQL (Cassandra, MongoDB, DynamoDB) databases.
- Proficiency in data modeling techniques, including dimensional modeling and star schemas.
- Strong analytical skills for working with both structured and unstructured datasets.
- Ample experience with CI/CD automation tools.
- Good project management and organizational skills.

Preferred Skills:
- Experience delivering business intelligence projects using tools like Power BI, Tableau, Qlik Sense, and Keboola.
- Working knowledge of message queuing, stream processing, and highly scalable real-time data processing using technologies like Kafka.
- Experience with data pipeline/workflow management tools such as Airflow, Kestra, and AWS Glue.
Additional Information
With us, you will become part of:
- An amazing international team where you can gain new and relevant experience
- A dynamic, supportive environment where you will never fall into a routine
- The opportunity to grow in line with your skills and interests
- An agile, start-up atmosphere
- A friendly international team of creative minds
We obviously offer even more:
⛺ 5 weeks of vacation + 3 wellness days
❤️ 2 Volunteering days to share the kindness of your heart with others
⏰ Flexible working hours and home office
🎯 Fully covered certifications in AWS, Adobe, Microsoft, Snowflake, etc.
🎓 Full access to Dentsu Academy, on-site learning sessions
🐶 Pet friendly offices
💌 Edenred meal and cafeteria points
🍹 Team events: company parties, monthly breakfasts, and pub quizzes
🥪 Snacks and drinks at the office
💸 Referral bonus programme
💻 Laptop + equipment
📞 Corporate mobile phone subscription