KION Group

Principal Data Architect (Remote)

22 May 2024
$94,000 - $198,000 / year

Job Description

About Us

At Dematic, we are gearing up to revolutionize our data landscape by building a cutting-edge Enterprise Data Lakehouse Platform. We are forming multiple teams that will spearhead the creation of the platform’s foundational components. These teams go beyond traditional data ingestion; they are architects of a microservices-driven platform, providing abstractions that empower other teams to seamlessly extend the platform.

Role Overview

We are seeking a dynamic and highly skilled Principal Data Architect with extensive experience building enterprise-scale data platforms to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but is also at the top of their game. The ideal candidate will contribute significantly to platform development, leading several data engineering teams with diverse skill sets while remaining very hands-on with coding and actively shaping the future of our data ecosystem.

What We Offer:

  • Career Development
  • Competitive Compensation and Benefits
  • Pay Transparency
  • Global Opportunities

Learn More Here: https://www.dematic.com/en-us/about/careers/what-we-offer

Dematic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

The base pay range for this role is estimated to be $94,000.00 – $198,000.00 at the time of posting. Final compensation will be determined by various factors such as work location, education, experience, knowledge and skills.

Tasks and Qualifications:

This is What You Will Do in This Role:

  • As the sole hands-on enterprise data architect, you will be responsible for the ideation, architecture, design, and development of the new enterprise data platform. You will collaborate with other cloud and security architects to ensure seamless alignment with our overarching technology strategy.
  • Architect and design core components with a microservices architecture, abstracting away platform and infrastructure intricacies.
  • Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
  • Design and develop connector frameworks and modern connectors to source data from disparate systems both on-prem and cloud.
  • Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
  • Architect and design best-in-class security patterns and practices.
  • Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
  • Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
  • Design and develop advanced analytics and machine learning capabilities on the data platform.
  • Design and develop observability and data governance frameworks and practices.
  • Stay up to date with the latest data engineering trends, technologies, and best practices.
  • Drive the deployment and release cycles, ensuring a robust and scalable platform.

What We Are Looking For:

  • 15+ years of proven experience in modern data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
  • Prior experience architecting and building successful enterprise-scale data platforms in a greenfield environment is a must.
  • Proficiency in building end-to-end data platforms and data services in GCP is a must.
  • Proficiency in tools and technologies: BigQuery, SQL, Python, Spark, DBT, Airflow, Kafka, Kubernetes, Docker.
  • Solid experience designing and developing distributed, microservices-based data architectures.
  • Proven experience architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
  • Proficiency with IoT architectures.
  • Solid experience architecting and implementing metadata management, including data catalogues, data lineage, data quality, and data observability for big data workflows.
  • Hands-on experience with GCP ecosystem and data lakehouse architectures.
  • Strong experience with container technologies such as Docker, Kubernetes.
  • Strong understanding of data modeling, data architecture, and data governance principles.
  • Excellent experience with DataOps principles and test automation.
  • Excellent experience with observability tooling: Grafana, Datadog.
  • Previous experience working with engineers of all levels – Principal, Senior, and Junior.

What Will Set You Apart:

  • Experience with Data Mesh architecture.
  • Experience building Semantic layers for data platforms.
  • Experience building scalable IoT architectures.

#LI-DP1