AUTODOC

Middle Data Engineer (m/f/d)

22 May 2024
£106000 - £185000 / year


Company Description

AUTODOC is the largest and fastest-growing auto parts e-commerce platform in Europe.
Present across 27 countries with around 5,000 employees, AUTODOC generated revenue of over €1.3 billion in 2023, supplying more than 7.4 million active customers with its 5.8 million vehicle parts and accessories for car, truck, and motorcycle brands.

Curious minds, adventurous experts and tech-savvy professionals – one team, one billion euros revenue. Catch the ride!

Job Description

The Data Engineer plays a critical role in data processing, designing and developing database systems that transform and connect diverse data sources into analytical datasets. They are responsible for improving the efficiency, reliability, and fault tolerance of the ETL infrastructure, as well as for data quality assurance and project management.

Responsibilities:

  • Development and optimization of database processing system architecture
  • Development and support of infrastructure in the Google Cloud Platform environment
  • Improving data quality and reliability through end-to-end data cleansing processes
  • Creation of efficient CI/CD pipelines and other automated solutions
  • Maintenance and enhancement of existing infrastructure
  • Infrastructure security and monitoring

Qualifications

  • In-depth knowledge of Data Engineering methodologies
  • Ability to work with large datasets, ETL tools and databases
  • Experience with cloud tools for data warehousing and processing such as AWS, GCP or Azure
  • Ability to design, build and maintain robust and scalable data pipelines
  • Ability to interact with other departments and participate in company-level strategic decision-making in the field of data engineering
  • Ability to find innovative ways to solve problems
  • Developing solutions for system monitoring

Experience:

  • Experience in developing fault tolerance mechanisms – approaches to clustering, replication, scaling, etc.
  • Strong command of programming languages such as Python and SQL, and professional knowledge of Big Data technologies such as Hadoop and Spark
  • Experience with GCP cloud infrastructure
  • Three or more years of experience in Data Science
  • Understanding of services: GCP, JupyterHub, AirFlow, ClickHouse, Spark, MLFlow
  • Finding your way out of the Vim text editor

What do we offer?

  • Competitive salaries based on your professional experience
  • Fast growing international company with stable employment
  • Annual vacation of 24 days and 1 additional day off on your birthday
  • Monthly allowance to cover medical insurance expenses
  • Mental Wellbeing Program – free psychological counseling for you and your family members, with a 24/7 hotline and online sessions
  • Opportunities for advancement, further training (over 650 courses on soft and hard skills on our e-learning platform) and coaching
  • Free English and German language classes
  • Flexible working hours and hybrid work

Join us today and let’s create a success story together!