Pagos
Data Engineer
Job Description
About Us
At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.
About the Role
As a Data Engineer, you’ll play a key part in designing, building, and maintaining the platform that powers our products. By collaborating with backend engineers, data analysts, and other engineers, you’ll build and own new features, modules, and extensions of our systems. We’re seeking an action-oriented and collaborative problem solver who thrives in ambiguity and can take on new challenges with optimism in a fast-paced environment. We value team members who are not only skilled in their area of expertise but are also perpetual learners who are committed to growth and contributing to our collective success.
In this role, you will:
- Craft high-quality code for scale, availability, and performance
- Design, develop, and maintain scalable data pipelines and processes to extract, process, and transform large volumes of data, both real-time and batched (ELT)
- Build and maintain integrations with data providers using various data transfer protocols
- Drive engineering projects from start to finish with a high level of ownership and autonomy
- Ensure the quality of our products and data through both manual and automated testing, as well as code reviews
What We’re Looking For
We’re looking for someone with:
- 5+ years of software engineering experience with an emphasis on Data Engineering
- Bachelor’s degree or higher in Computer Science or a related technical discipline (or equivalent experience)
- Advanced experience with complex SQL queries and database/lakehouse technologies such as Redshift, Delta Lake, and Postgres
- Deep experience with big data technologies and frameworks such as Apache Spark and dbt, as well as data quality tools like dbt tests
- Familiarity with cloud platforms like AWS, GCP, or Azure, and common data-related services (e.g. S3, Redshift, EMR, Glue, Kinesis, Athena)
- A bias for action, where no task is too small, and an eagerness to learn and grow with our industry
Nice to have:
- Experience with real-time streaming frameworks like Apache Kafka, Apache Flink, or Apache Storm
- Experience with Great Expectations and/or Soda
- Comfort and/or past experience working with and managing big data and ELT pipelines
- Comfort and/or past experience working with Apache Airflow or similar orchestration tools
- Experience working in high-growth, venture-backed startup(s)
Pagos does not accept unsolicited resumes from third-party recruiting agencies. All interested candidates are encouraged to apply directly.