TELUS International

Data Engineer

11 April 2024
£89,000 - £150,000 / year

Job Description

Join our team and what we’ll accomplish together

As a Data Engineer, you will be responsible for designing, building and supporting the data pipelines that enable innovative, customer-centric digital experiences. You will work as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility and design to achieve the highest quality of service for our customers. Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js.
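
To give a flavour of this stack, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client; the project, dataset and table names are hypothetical placeholders, not anything specific to the role.

```python
# Minimal BigQuery query from Python (sketch; all names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
    SELECT customer_id, COUNT(*) AS event_count
    FROM `example-project.example_dataset.events`  -- hypothetical table
    GROUP BY customer_id
    ORDER BY event_count DESC
    LIMIT 10
"""

# Iterating the returned job waits for completion and yields result rows.
for row in client.query(query):
    print(f"{row.customer_id}: {row.event_count}")
```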

You will be part of the team building data pipelines that support our marketing, finance, campaign and Executive Leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that deliver the insights behind our business partners' analytics and campaigns. You are a fast-learning, highly technical and passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

What you’ll do

  • Learn new skills & advance your data development practice
  • Analyze and profile data
  • Design, develop, test, deploy, maintain and improve batch and real-time data pipelines (a minimal example is sketched after this list)
  • Assist with design and development of solution prototypes
  • Support data consumers in understanding data outcomes and the technical design
  • Collaborate closely with multiple teams in an agile environment
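
Since Dataflow pipelines are written with the Apache Beam SDK, a batch pipeline of the kind described above might look like the following minimal sketch; the bucket and file paths are hypothetical placeholders.

```python
# Minimal Apache Beam batch pipeline (sketch; paths are hypothetical).
import apache_beam as beam

with beam.Pipeline() as pipeline:  # DirectRunner locally; DataflowRunner in production
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events.csv")
        | "KeyByCustomer" >> beam.Map(lambda line: (line.split(",")[0], 1))
        | "CountPerCustomer" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda customer, count: f"{customer},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/customer_counts")
    )
```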

Qualifications

What you bring

  • You are a developer with 3+ years of hands-on technical experience implementing IT platforms
  • Bachelor's degree in Computer Science, Engineering or equivalent
  • Understanding of data warehouses/cloud architectures and ETL processes
  • Working knowledge of SQL and query authoring, experience with relational databases, and familiarity with a variety of other databases
  • Experience with Google Cloud Platform (GCP) and its related technologies (Kubernetes, Cloud SQL, Pub/Sub, Cloud Storage, Cloud Logging, Dashboards, Airflow, BigQuery, Bigtable, Python, BigQuery SQL, Dataplex, Datastream, etc.)
  • Experience with Python and software engineering best practices
  • API development using Node.js and testing with Postman/SoapUI
  • Experience working with message queues such as JMS, Kafka and Pub/Sub (a minimal consumer is sketched after this list)
  • A passion for data quality
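
As one concrete illustration of the message-queue experience mentioned above, a minimal Pub/Sub consumer in Python might look like the following; the project and subscription IDs are hypothetical placeholders.

```python
# Minimal Pub/Sub streaming pull consumer (sketch; IDs are hypothetical).
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "example-subscription")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub does not redeliver the message

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        streaming_pull_future.result(timeout=30)  # listen for 30 seconds
    except TimeoutError:
        streaming_pull_future.cancel()  # trigger shutdown
        streaming_pull_future.result()  # block until shutdown completes
```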

Great-to-haves

  • Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi and other related ETL tools
  • Familiarity with Informatica MDM (preferred); strong skills in other MDM tools are still an asset
  • Proficiency in Python and Java
  • Understanding of TMF (TM Forum) standards