Devexperts

Middle/Senior Data Engineer

22 May 2024
£89,000 – £150,000 / year

Company Description

Devexperts has been consulting for and developing software for the financial industry for nearly two decades. We solve complex technological challenges facing the most well-respected financial institutions worldwide.

By becoming a part of Devexperts, you’ll become a part of a company that fosters self-improvement and actively seeks out-of-the-box ideas. Our teams work together to create the next generation of financial software solutions. We welcome all candidates who believe, as we do, that innovation is grounded in education.

Job Description

We are looking for a Middle/Senior Data Engineer with a Java / Scala / Python background to join a project for a top-5 US retail broker (by number of users). The project focuses on the trading experience, financial reporting, and risk management.

You will join a cross-functional team that excels at taking features from zero to production.

Key responsibilities:

1. Data Pipeline Development:

  • Design, develop, and maintain robust data pipelines using Java within AWS infrastructure.
  • Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
  • Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
  • Ensure fast and interactive querying capabilities through the use of Presto.

2. Infrastructure Management:

  • Containerise applications using Docker for streamlined deployment and scaling.
  • Orchestrate and manage containers effectively with Kubernetes in production environments.
  • Implement infrastructure as code using Terraform for provisioning and managing AWS resources.

3. Collaboration and Communication:

  • Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals.
  • Ensure data quality and reliability through robust testing methodologies and monitoring solutions.
  • Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.

Qualifications

1. Education and Experience:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Minimum 4 years of hands-on experience in Java / Scala / Python development, emphasising object-oriented principles.

2. Technical Proficiency:

  • Proficient in Apache Spark or PySpark for large-scale data processing.
  • Experience with Airflow for workflow orchestration in production environments.
  • Familiarity with Docker for containerisation and Kubernetes for container orchestration.
  • Knowledge of Terraform for infrastructure as code implementation in AWS environments.
  • Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift.
  • Strong background in SQL and relational databases, with proficiency in technologies like Postgres.
  • Experience with streaming platforms such as Kafka for real-time data processing is preferred.

3. Communication Skills:

  • Excellent English language communication skills, both verbal and written.
  • Ability to collaborate effectively with technical and non-technical stakeholders.

Additional Information

  • Paid vacation 20 + 5 days
  • Free MultiSport card
  • Medical insurance – premium package
  • Modern office space
  • Panoramic view of Vitosha mountain
  • Gym & billiard in the office
  • Parking spot or public transport card
  • Mentorship program
  • Training, courses, workshops
  • Paid pro certifications
  • Subscriptions to pro sources
  • Participation in conferences
  • English courses
  • Trading contest within the company
  • Tech meetup dxTechTalk
  • Speaker’s club
  • Opportunity to develop your personal brand as a speaker
  • Internal referral program
  • Remote work / Hybrid mode
  • Flexible schedule
  • Work & Travel program
  • Relocation opportunities