GlobalFoundries

Lead Data Engineer

15 November 2024
£28000 - £67000 / year

Job Description

About GlobalFoundries

GlobalFoundries is a leading full-service semiconductor foundry providing a unique combination of design, development, and fabrication services to some of the world’s most inspired technology companies. With a global manufacturing footprint spanning three continents, GlobalFoundries makes possible the technologies and systems that transform industries and give customers the power to shape their markets. For more information, visit www.gf.com.

Introduction:

The Data Solutions Group is responsible for integrating manufacturing and engineering data from a wide variety of source systems used in the semiconductor engineering and production process. The data warehouse solutions are used in the GlobalFoundries fabs in Dresden, the US and Singapore, and the Data Solutions Group is responsible for conceptualizing and providing timely, high-quality solutions that address the analysis needs of engineers in a leading-edge semiconductor foundry.

Your Job:  

  • Understand the business case and translate it into a holistic solution involving AWS Cloud Services, PySpark, EMR, Python, data ingestion and Cloud DB Redshift/Postgres

  • PL/SQL development for high volume data sets.

  • Prepare data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target mapping)

  • DB query monitoring for tuning and optimization opportunities

  • Proven experience with large, complex database projects in environments producing high-volume data 

  • Demonstrated problem-solving skills; familiarity with various root cause analysis methods; experience in documenting identified problems and their resolutions

  • Makes recommendations regarding enhancements and/or improvements

  • Provides appropriate consulting, interfacing, and standards relating to database management, and monitors transaction activity and utilization.

  • Analysis and tuning of performance issues

  • Data Warehouse design and development, including logical and physical schema design.

Other Responsibilities: 

  • Perform all activities in a safe and responsible manner and support all Environmental, Health, Safety & Security requirements and programs

  • Customer/stakeholder focus. Ability to build strong relationships with application teams, cross-functional IT and global/local IT teams

Required Qualifications:

  • Bachelor's or Master's degree in Information Technology, Electrical Engineering or a similar relevant field.

  • Proven experience (3 years minimum) with ETL development, design, performance tuning and optimization

  • Very good knowledge of data warehouse architecture approaches and trends, and a strong interest in applying and further developing that knowledge, including an understanding of Dimensional Modelling and ERD design approaches

  • Working experience in Kubernetes and Docker administration is an added advantage

  • Good experience with AWS services, big data, PySpark, EMR, Python and Cloud DB Redshift

  • Proven experience with large, complex database projects in environments producing high-volume data

  • Proficiency in SQL and PL/SQL

  • Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target mapping)

  • Experience in developing streaming applications, e.g. SAP Data Intelligence, Spark Streaming, Flink, Storm, etc.

  • Excellent conceptual abilities paired with very good technical documentation skills, e.g. the ability to understand and document complex data flows as part of business/production processes and infrastructure

  • Familiarity with SDLC concepts and processes

Additional Skill:

  • Experience using and developing on AWS services.

  • Experience in the semiconductor industry

  • Knowledge of semi-structured datasets

  • Experience with reporting data solutions and business intelligence tools

  • Experience in collecting, structuring and summarizing requirements in a data warehouse environment

  • Knowledge of statistical data analysis and data mining

  • Experience in test management, test case definition and test processes

Preferred Qualifications:

  • Bachelor's or Master's degree with a minimum of 10 years' experience

  • AWS Cloud Services, PySpark, EMR, Python, Cloud DB Redshift/Postgres and data ingestion

  • Experience in preparing Data Warehouse design (ETL framework design, data modeling, source-target-mapping)

GlobalFoundries is an equal opportunity employer, cultivating a diverse and inclusive workforce. We believe having a multicultural workplace enhances productivity, efficiency and innovation while ensuring our employees feel truly respected, valued and heard.

As an affirmative action employer, all qualified applicants are considered for employment regardless of age, ethnicity, marital status, citizenship, race, religion, political affiliation, gender, sexual orientation and medical and/or physical abilities.

All offers of employment with GlobalFoundries are conditioned upon the successful completion of background checks, medical screenings as applicable and subject to the respective local laws and regulations.

To ensure that we maintain a safe and healthy workplace for our GlobalFoundries employees, please note that offered candidates who have applied for jobs in India will have to be fully vaccinated prior to their targeted start date. For new hires, the appointment is contingent upon the provision of a copy of their COVID-19 vaccination document, subject to any written request for medical or religious accommodation.

Information about our benefits can be found here: https://gf.com/about-us/careers/opportunities-asia