dentsu international
Senior Data Engineer
Company Description
We Dream. We Do. We Deliver.
As a full-service, data-driven customer experience transformation agency, we partner with Top 500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created out of a merger of Namics and Isobar – two leading full-service digital agencies.
Our 1,200+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.
Job Description
- Use CI/CD tools to facilitate deployment of code to staging and production environments (a minimal sketch follows this list).
- Participate in architecting end-to-end solutions for our customers on AWS, Azure and other cloud platforms.
- Maintain Git repositories using the Gitflow branching model.
- Collaborate on feature deliverables to meet milestones and quality expectations.
- Communicate with stakeholders, vendors and technology subject matter experts.
- Document implemented logic in a structured manner using Confluence; plan your activities using Agile methodology in Jira.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, such as optimizing existing data delivery and re-designing infrastructure for greater scalability.
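To give a flavour of the CI/CD work above, here is a minimal sketch in Python of a deployment step that ships a build artifact and triggers a release. The bucket, object key and pipeline names are hypothetical placeholders for illustration, not our actual environment:

```python
"""Illustrative only: ship a build artifact and trigger a release."""
import boto3

ARTIFACT_BUCKET = "example-deploy-artifacts"  # hypothetical bucket name
ARTIFACT_KEY = "builds/app-latest.zip"        # hypothetical object key
PIPELINE_NAME = "example-stage-pipeline"      # hypothetical pipeline name


def deploy(local_artifact: str) -> str:
    """Upload the artifact to S3, then start a CodePipeline execution."""
    s3 = boto3.client("s3")
    s3.upload_file(local_artifact, ARTIFACT_BUCKET, ARTIFACT_KEY)

    codepipeline = boto3.client("codepipeline")
    response = codepipeline.start_pipeline_execution(name=PIPELINE_NAME)
    return response["pipelineExecutionId"]


if __name__ == "__main__":
    execution_id = deploy("dist/app-latest.zip")
    print(f"Started pipeline execution {execution_id}")
```

In practice, a step like this would run inside the pipeline tooling itself (GitHub Actions, Azure DevOps or AWS CodePipeline) rather than by hand.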
Qualifications
- Experience in building and productionizing public cloud infrastructure.
- Experience using GitHub, Bitbucket or another code repository solution.
- Experience in setting up and using CI/CD automation tools like GitHub Actions, Azure DevOps or AWS CodePipeline.
- Experience with infrastructure-as-code frameworks like Terraform, AWS CloudFormation or ARM templates.
- Understanding of containerization concepts and container orchestration services like Docker, Fargate and Kubernetes.
- Experience with scripting languages like Python, Bash or PowerShell (a short example combining scripting with Terraform follows this list).
- Strong analytical skills related to working with structured and unstructured datasets.
- A person who is precise, well organized, has good communication skills, can adapt to changing circumstances and is not afraid of taking responsibility for their work will do great in this role.
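As an illustration of how scripting and infrastructure as code meet in day-to-day work, here is a minimal, hypothetical Python sketch that wraps the Terraform CLI to detect drift between live infrastructure and the code; the working directory is an assumed placeholder:

```python
"""Illustrative only: a small drift check wrapping the Terraform CLI."""
import subprocess
import sys


def check_drift(workdir: str = "./infrastructure") -> int:
    """Run a read-only plan; exit code 2 means the live infra has drifted."""
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false"],
        cwd=workdir,  # hypothetical path to the Terraform configuration
    )
    if result.returncode == 0:
        print("No drift: infrastructure matches the Terraform code.")
    elif result.returncode == 2:
        print("Drift detected: live infrastructure differs from the code.")
    else:
        print("terraform plan failed; see output above.", file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(check_drift())
```

The `-detailed-exitcode` flag makes `terraform plan` exit with 2 when changes are pending, which makes a check like this easy to wire into a CI job.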
Preferred Skills
- Understanding of data concepts and patterns such as big data, data lakes, lambda architecture, stream processing, DWH, and BI & reporting.
- Experience with data pipeline / workflow management tools like dbt, AWS Step Functions, AWS Glue, Azure Data Factory or Airflow (a minimal DAG sketch follows this list).
- Knowledge of SQL.
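For example, a simple daily pipeline in Airflow, one of the tools listed above, might look like the following sketch. The DAG name and task bodies are hypothetical placeholders; a real pipeline would call out to dbt, Glue, Data Factory or warehouse-specific operators instead:

```python
"""Illustrative only: a minimal Airflow 2.x DAG of the extract -> load shape."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data...")   # placeholder for a real extract step


def load():
    print("loading into the DWH...")  # placeholder for a real load step


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```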
Additional Information
With us, you will become part of:
- An amazing international team where you can gain new and relevant experience
- A dynamic and supportive environment where you will never fall into a routine
- Opportunities to grow in line with your skills and interests
- A start-up-like, agile atmosphere
- A friendly, international team of creative minds
And, of course, we offer even more:
⛺ 5 weeks of vacation + 3 wellness days
⏰ Flexible working hours and home office
🎯 Fully covered certifications in Salesforce, Adobe, Microsoft, etc.
🎓 Full access to Dentsu Academy, LinkedIn Learning, on-site learning sessions
🐶 Pet friendly offices
🍹 Team events: company parties, monthly breakfasts, and pub quizzes
🥪 Snacks and drinks at the office
💸 Referral bonus programme
💻 Laptop + equipment