Transaction Network Services

Senior Data Architect – 2138 (Remote)

20 November 2024
£55,000 - £102,000 / year

Job Description

CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES has been delivering services to some of the leading Fortune 500 companies across sectors including Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking. These are long-term relationships of more than 10 years, nurtured not only by our commitment to the timely delivery of quality services but also by our investments and innovations in their technology roadmaps. As an organization, we are in an exponential growth phase with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified and committed individuals to play an exceptional role in supporting our accelerated growth. You can learn more about us at: http://www.cesltd.com/
Job Role: We are seeking an experienced Senior Data Architect to join our dynamic team.

Ideal Candidate: 
As a Senior Data Architect, you will be responsible for designing, developing, and maintaining the data architecture of the organization. You will work closely with data engineers, data analysts, and business stakeholders to ensure that the data architecture supports the organization’s data-driven initiatives.  This role requires a proactive and resourceful individual with a solid understanding of data engineering, data analytics, and cloud architecture.

Design, develop, and maintain the data architecture of the organization, including data models, data storage, and data integration.

Work with data engineers to design and implement data pipelines for ingestion, transformation, and loading of data into the data warehouse.

Collaborate with data analysts and business stakeholders to understand their data needs and ensure that the data architecture supports their requirements.

Define, design, and develop data pipelines for ingestion, transformation, and loading of data into Azure Synapse Analytics. This includes understanding functional and non-functional requirements, performing source data analysis, data profiling, and defining efficient ELT processes.
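
For illustration, a minimal T-SQL sketch of the kind of load step such an ELT process might end with, using the COPY INTO statement of a Synapse dedicated SQL pool (the storage account, container path, and staging table name are hypothetical):

    -- Load raw Parquet files from the data lake into a staging table
    -- before further transformation inside Azure Synapse Analytics.
    COPY INTO dbo.stg_Sales
    FROM 'https://yourlake.blob.core.windows.net/raw/sales/2024/*.parquet'
    WITH (
        FILE_TYPE  = 'PARQUET',
        CREDENTIAL = (IDENTITY = 'Managed Identity')  -- authenticate with the workspace identity
    );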

Work with Azure Synapse Analytics to build and optimize data models, SQL queries, stored procedures, and other artifacts necessary for data processing and analysis.

Understand the characteristics of a Data Lakehouse, including its file formats, optimize data storage, and implement efficient data reading and writing mechanisms for incremental updates within Azure Synapse Analytics.
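
As a rough sketch of what reading lakehouse files in place can look like, the snippet below queries Parquet files directly with OPENROWSET from a Synapse serverless SQL pool (the storage path and partition layout are hypothetical):

    -- Query Parquet files in the lake without loading them first;
    -- the wildcard pattern picks up new partitions as they arrive.
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK   'https://yourlake.dfs.core.windows.net/lake/sales/year=*/month=*/*.parquet',
        FORMAT = 'PARQUET'
    ) AS sales;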

Ensure compliance with data governance policies and implement security measures to protect sensitive data stored in Azure. This involves encryption, masking, and access control mechanisms.
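
One hedged example of such controls is dynamic data masking combined with role-based UNMASK permissions, as supported in Synapse dedicated SQL pools (the table, column, and role names below are hypothetical):

    -- Mask sensitive columns for users who are not explicitly allowed to see them.
    ALTER TABLE dbo.DimCustomer
        ALTER COLUMN EmailAddress ADD MASKED WITH (FUNCTION = 'email()');

    ALTER TABLE dbo.DimCustomer
        ALTER COLUMN CreditCardNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)');

    -- Access control: analysts may read unmasked values, external users may not.
    GRANT UNMASK TO [DataAnalysts];
    DENY  UNMASK TO [ExternalUsers];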

Continuously optimize data pipelines and storage configurations to improve performance, scalability, and reliability. This includes identifying bottlenecks, query tuning, and leveraging Azure Synapse Analytics features for parallel processing.
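
A minimal sketch of one such optimization, assuming a Synapse dedicated SQL pool: rebuilding a fact table with CTAS so that its distribution and partitioning match the dominant join and filter patterns (table and column names are hypothetical):

    -- Rebuild the fact table with a distribution and partition scheme
    -- chosen to reduce data movement and enable partition elimination.
    CREATE TABLE dbo.FactSales
    WITH (
        DISTRIBUTION = HASH(CustomerKey),   -- co-locate rows that join on CustomerKey
        CLUSTERED COLUMNSTORE INDEX,        -- columnar storage for analytical scans
        PARTITION (OrderDateKey RANGE RIGHT FOR VALUES (20230101, 20240101, 20250101))
    )
    AS
    SELECT * FROM dbo.FactSales_staging;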

Define monitoring solutions to track data pipeline performance, data quality, and system health. Troubleshoot issues related to data ingestion, transformation, or storage, and provide timely resolutions.

Personal Attributes:

•    Education: Bachelor’s degree in Computer Science or Engineering
•    Minimum of 15 years of experience in a relevant data engineering role.
•    Proficiency with one or more of the following database platforms and storage formats, e.g. Oracle, Microsoft SQL Server, PostgreSQL, MySQL/MariaDB, MongoDB, Parquet

•    Strong SQL skills, including experience with complex SQL queries, stored procedures, and performance optimization techniques. Familiarity with T-SQL for Azure Synapse Analytics is a plus.

•    Proven experience in building ELT pipelines and data integration solutions using tools like Azure Data Factory, Oracle GoldenGate, or similar platforms. Ability to handle a variety of legacy data sources and file formats efficiently.

•    Familiarity with dimensional modeling, star schemas, and data warehousing principles. Experience in designing and implementing data models for analytical workloads; a brief star-schema sketch follows this list.

•    Strong analytical skills with the ability to understand complex data requirements, troubleshoot technical issues, and propose effective solutions to meet business needs.

•    Excellent communication skills with the ability to collaborate effectively with cross-functional teams, including Data Scientists, Reporting Analysts, and DevOps professionals.
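
For context, a minimal star-schema sketch of the kind referenced above, written for a Synapse dedicated SQL pool (all table and column names are hypothetical):

    -- Small dimension replicated to every distribution; large fact table hash-distributed.
    CREATE TABLE dbo.DimProduct
    (
        ProductKey   INT            NOT NULL,
        ProductName  NVARCHAR(200)  NOT NULL,
        Category     NVARCHAR(100)  NOT NULL
    )
    WITH (DISTRIBUTION = REPLICATE, CLUSTERED COLUMNSTORE INDEX);

    CREATE TABLE dbo.FactSales
    (
        OrderDateKey INT            NOT NULL,  -- joins to a date dimension
        ProductKey   INT            NOT NULL,  -- joins to DimProduct
        CustomerKey  INT            NOT NULL,  -- joins to a customer dimension
        Quantity     INT            NOT NULL,
        SalesAmount  DECIMAL(18,2)  NOT NULL
    )
    WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX);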