dormakaba

Data Engineer

14 March 2024
£92000 - £154000

Job Description

We are one of the top three companies in the global market for access and security solutions. With around 15,000 employees, we are at our customers' side in over 130 countries. Our specialized services cover access control systems, automatic doors, door closers, lodging products, barriers, interior glass systems, master key systems, and movable walls, offering you a complete solution from front to back of house or building.

 

A job that matters: Your Tasks 

 

We are looking for a Data Engineer who thrives on developing data analysis pipelines to extract business-relevant insights about the usage and activity of physical products in the field.

As a Data Engineer, you will ensure that data is collected, stored, and made accessible for analysis. You will be responsible for building, maintaining, and organizing the infrastructure that enables the organization to leverage data effectively.

 

  • Collect and aggregate data from various sources, including databases, APIs, external data providers, and streaming sources. You will design and implement efficient data pipelines to ensure a smooth flow of information into the data warehouse or storage system
  • Store and manage data: choose appropriate database systems, optimize data schemas, and ensure data quality and integrity, while considering scalability and performance to handle large volumes of data
  • Create pipelines for extracting and transforming raw data into formats suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for data scientists and analysts (see the illustrative sketch after this list)
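To give a flavour of this kind of work, below is a minimal sketch of an extract-transform-load step using Pandas and SQLAlchemy. The source file, column names, and target table are hypothetical and serve only to illustrate the cleansing, aggregation, and loading steps described above; real pipelines at dormakaba may use different tools and schemas.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical example: raw device-activity events exported as CSV.
# File name, columns, and target table are assumptions for illustration only.
raw = pd.read_csv("device_events.csv", parse_dates=["timestamp"])

# Cleansing: drop incomplete rows and obvious duplicates.
clean = raw.dropna(subset=["device_id", "timestamp"]).drop_duplicates()

# Aggregation: daily activity counts per device, ready for analysts.
daily_usage = (
    clean.groupby(["device_id", clean["timestamp"].dt.date])
    .size()
    .reset_index(name="event_count")
    .rename(columns={"timestamp": "date"})
)

# Load: write the analysis-ready table to a relational store.
engine = create_engine("postgresql://user:password@host/analytics")
daily_usage.to_sql("daily_device_usage", engine, if_exists="replace", index=False)
```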

 

An experience that matters: Your Skills

 

  • 3+ years of experience in developing and maintaining data analysis pipelines
  • Experience with large-scale databases
  • Experience accessing data across different paradigms, relational and non-relational: deep understanding of relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for unstructured or semi-structured data
  • Pipelining and ETL tools: Apache Spark, AWS Glue
  • Data analysis tools: Pandas
  • Experience with any of the following cloud platforms: AWS (preferred), Azure, or Google Cloud
  • In-depth knowledge of principles such as SOLID, OOP, clean code, and software design patterns
  • Expertise in Python, R, Java, Scala, or similar programming languages
  • Experience using cloud platforms to build scalable and cost-effective data solutions over distributed architectures
  • Experience in real-time data processing using streaming technologies such as Kafka or Kinesis is a plus
  • Experience in ensuring the security of data and its byproducts
  • University degree, an equivalent technical college qualification, or relevant related experience
  • Self-organized, with an agile mindset
  • Proactive and independent working style
  • Fluent English
  • Flexibility to travel (not extensively)

 

A workplace that matters: Our offering

 

  • Excellent opportunities in a globally operating company that values diversity, inclusion, sustainability, and mutual trust
  • Attractive remuneration package
  • Flexible hybrid work with on-site time for interacting with the team
  • 25 days paid annual leave
  • Additional health insurance
  • 200 BGN food vouchers
  • Public transportation card
  • Multisport card
  • Training and mentorship programs
  • 24/7 access to over 15,000 LinkedIn Learning courses to assist in your professional development and to expand on your individual interests