Everfield

Senior Data Engineer

7 March 2025
£83,000 / year

Job Description

As a Senior Data Engineer at Everfield, you will leverage your expertise and experience to: 

  • Continuously improve scalability and stability of our data platform called the Lakehouse 

  • Create new data pipeline solutions on various data sources (ERP, CRMs, etc.) 

  • Improve existing solutions to increase their value for various stakeholders 

This role is hybrid, based in either the Amsterdam or the Cracow area. 

 

About Everfield 

Everfield buys, builds, and grows European vertical market and specialist software companies, providing them with the tools they need to move to the next level. Our mission is to foster ambition, fuel growth, and unlock opportunities for Europe’s software ecosystem. 

Companies in the Everfield ecosystem follow a decentralised model, maintaining their team, brand, and offices while focusing on what they do best – building products and supporting customers. Everfield provides support in talent acquisition and HR, and its team of experts in building and growing European B2B SaaS companies consults on financial and operational topics. Founded in 2022, Everfield has an ecosystem presence in 7 countries, and growing. 

 

About the team: 

You’ll join the Business Intelligence (BI) team as its fourth member. Currently the team consists of a Lead Data Engineer, a Data Analyst, and a hands-on Team Lead. The BI team creates value for various stakeholders (the CEO, Finance, the COO, Sales, Marketing, and Everfield’s portfolio companies) by creating insights, delivering analysis, automating reporting processes, and improving data quality. 

 

The team builds pipelines with Azure Synapse Analytics to ingest data into our data platform, uses Azure Databricks for data transformation, and uses Power BI for data visualization. For analytics the team also uses Python and SQL.

 

The scope of the team is expected to scale with the company’s growth and maturity. Everfield will scale from 20 portfolio companies today to approximately 90 companies over the next 4 years. 

 

Our way of working: 

  • Sprints of 2 weeks 

  • Ops duty every other week 

  • On average 2 releases per sprint 

  • Weekly refinement sessions 

  • Evaluation/retrospective every month 

 

What you will do: 

Adhere to the DevOps methodology: 

  • Collaborate with the Lead Data Engineer and Team Lead in Azure DevOps to plan and prioritize tasks. 

  • Develop data pipelines in Azure Synapse Analytics, along with data ingestion and process scripts using PySpark and Pandas, while managing version control through Git. 

  • Create unit tests, test your own work and review work of colleagues (via Pull Requests), to ensure quality control. 

  • Enhance and maintain deployment pipelines to facilitate smooth releases to end users. 

  • Monitor the data platform (including data pipelines, ingestion/processing jobs, and alerts), while continuously improving logging and monitoring practices. 
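To give a flavour of the day-to-day work described above, here is a minimal, self-contained sketch of a unit-tested Pandas ingestion step. All column names and the cleaning logic are hypothetical illustrations, not Everfield's actual pipeline code:

```python
import pandas as pd

def ingest_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean a raw orders extract: drop rows without an ID,
    parse dates, and derive a revenue column.
    (Hypothetical columns, for illustration only.)"""
    df = raw.dropna(subset=["order_id"]).copy()
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["revenue"] = df["quantity"] * df["unit_price"]
    return df

# A small unit test of the kind the role calls for:
raw = pd.DataFrame({
    "order_id": [1, 2, None],
    "order_date": ["2025-01-01", "2025-01-02", "2025-01-03"],
    "quantity": [2, 3, 1],
    "unit_price": [10.0, 5.0, 7.0],
})
clean = ingest_orders(raw)
assert len(clean) == 2                              # row without an ID is dropped
assert clean["revenue"].tolist() == [20.0, 15.0]    # derived column is correct
```

In a PySpark pipeline the same step would use `DataFrame.dropna` and `withColumn` on a Spark DataFrame, but the test-first structure is the same.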

 

What we are looking for: 

Hard Skills 

Required: 

  • 5+ years of experience as a Data Engineer.

  • Expert in PySpark and Pandas for data processing. 

  • Experience with Azure Synapse Analytics, Azure Data Factory, or Microsoft Fabric (Data Factory) for managing data pipelines. 

  • Strong knowledge of SQL and data modeling (especially star schema). 

  • Familiarity with financial systems such as ERP, NetSuite, SAP, or similar. 

  • Experience working in an Agile Scrum environment. 
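For candidates less familiar with the term, the star-schema modeling mentioned above can be sketched with a minimal in-memory SQLite example: a central fact table that references surrounding dimension tables. Table and column names here are invented for illustration, not Everfield's actual model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Two dimension tables and one fact table referencing them.
CREATE TABLE dim_company (company_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_revenue (
    company_key INTEGER REFERENCES dim_company(company_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
INSERT INTO dim_company VALUES (1, 'Acme SaaS'), (2, 'Beta BV');
INSERT INTO dim_date VALUES (20250101, '2025-01-01');
INSERT INTO fact_revenue VALUES (1, 20250101, 100.0), (2, 20250101, 50.0);
""")

# A typical BI query: join the fact table to a dimension and aggregate.
total = con.execute("""
    SELECT d.iso_date, SUM(f.amount)
    FROM fact_revenue f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.iso_date
""").fetchone()
assert total == ("2025-01-01", 150.0)
```

The same shape of query is what Power BI generates behind the scenes when a report slices fact measures by dimension attributes.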

Nice to Have: 

  • Experience with Databricks for advanced analytics. 

  • Knowledge of Power Query and Power BI for data visualization. 

  • Experience deploying Azure resources such as Azure SQL Server and Azure Web Apps.

  • Experience with CRM systems and customer data.

Soft Skills 

  • Independent Worker: Ability to operate autonomously while contributing effectively to team goals. 

  • Business Acumen: Strong understanding of business and economic principles, with an ability to apply them in a data context. 

  • Analytical Mindset: Excellent problem-solving and critical thinking skills. 

  • Collaborative: Actively seeks alignment with team members and participates in planning sessions. 

  • Quality-Focused: Committed to reviewing both your own work and that of others to maintain high standards. 

  • Strategic Planner: Considers alternative solutions and refines features before implementation. 

  • Purpose-Driven: Demonstrates a deep understanding of tasks, ensuring they align with business objectives. 

What’s in it for you? 

  • Grow your career in a fast-paced and growing organization. 

  • Enable exponential growth by designing a world-class data platform. 

  • Working with the latest Data Engineering & Data Modeling technologies. 

  • Long-term career opportunities. 

  • Working from home for up to 4 days per week. 

  • Flexible work environment with offices around Europe (Netherlands, UK, Spain, France, Poland, Germany). 

Recruitment process: 

Here’s what you can expect and who you will speak to during the interview process. 

  • Initial call with Talent Acquisition Lead Maria 

  • First interview with Hiring Manager Tom 

  • Second interview with Lead Data Engineer Pim 

  • Third interview: a case study assessment, preferably on site 

  • Final interview with CFO Franklin 

This role is hybrid, based in either the Amsterdam or the Cracow area. Including you, the team will be four people. 

We look forward to reviewing your application!