Welocalize

Data Operations Engineer

28 March 2024
£125000 - £238000 / year

Job Description

As a trusted global transformation partner, Welocalize accelerates the global business journey by enabling brands and companies to reach, engage, and grow international audiences. Welocalize delivers multilingual content transformation services in translation, localization, and adaptation for over 250 languages with a growing network of over 400,000 in-country linguistic resources. Driving innovation in language services, Welocalize delivers high-quality training data transformation solutions for NLP-enabled machine learning by blending technology and human intelligence to collect, annotate, and evaluate all content types. Our team works across locations in North America, Europe, and Asia serving our global clients in the markets that matter to them. www.welocalize.com
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
MAIN PURPOSE OF JOB-
The Data Operations Engineer is a key player in the development and maintenance of our data ecosystem, specifically as it relates to BI, Analytics and business operations. This professional specializes in building data flows from key systems into our data warehouse, enabling actionable data analytics applications. They also handle a variety of data engineering activities, bridging the organization's array of third-party and in-house developed systems to support key initiatives, streamline business operations, and ensure data quality and accessibility.
MAIN DUTIES-

The following is a non-exhaustive list of responsibilities and areas of ownership of a Data Operations Engineer:

  • Leveraging an understanding of business requirements and desired outcomes, build and maintain robust ETL processes, integrating key systems to funnel data effectively into our data warehouse for BI and analytics purposes.
  • Develop and maintain scalable and reliable data pipelines that cater to operational and analytical needs, while adhering to our overall data strategy.
  • Provide support for data transformation and migration, aligning with the dynamic needs of the business and operational requirements.
  • Engage with a range of data engineering tasks, from constructing datasets and performing complex data analysis to preparing datasets for prescriptive and predictive modeling, ensuring adaptability across various interconnected systems.
  • Collaborate effectively with other technical teams to support the broader data strategy, including close interaction with data scientists and architects to realize data-driven initiatives.
  • Recommend and implement ways to improve data reliability, efficiency, and quality, promoting best practices in data management.

REQUIREMENTS-

Education Level-

  • Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent professional experience.

Experience-

  • Multiple years of experience with ETL development, data pipelines, and workflow management.
  • Proficiency with data transformation technologies (ETL) and agile methodologies.
  • Strong knowledge of relational and non-relational databases, complex data queries, and database technologies.
  • Demonstrated ability in data analytics applications and reporting systems such as Power BI.

Other relevant skills-

  • In-depth analytical skills for complex data analysis and the ability to present findings clearly.
  • Experience with a diverse array of data engineering tasks, reflecting versatility and adaptability to various system interconnections.
  • Familiarity with data integration from multiple third-party and in-house developed systems, with a focus on operational data flows rather than core architecture.

KEY COMPETENCIES REQUIRED FOR THIS ROLE-

Customer Service-

  • Proficiency in issue resolution and escalation management, with an emphasis on maintaining high service quality.

Innovation-

  • A strong sense of curiosity and inventiveness in enhancing data systems and data flow processes.

Quality-

  • A self-starter with a commitment to producing high-quality work.
  • Diligence in verifying data and avoiding assumptions to ensure accuracy and integrity.
  • Ownership and accountability for deliverables, with a focus on continuous improvement.

Global Teamwork-

  • Proven ability to manage workload independently and contribute effectively as part of a global team.
  • Self-sufficient in managing complex data-related tasks.
  • Collaborative approach to working with cross-functional teams, reflecting a global perspective.