Ford Motor Company
Data Engineer (FCE Data Engineering)
Job Description
This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyse and manipulate large datasets, activating data assets to support Enabling Platforms and Analytics on GCP, and you will be responsible for designing transformation and modernization solutions on GCP. Experience operationalizing large-scale data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.
Key Role Responsibilities:
- Develop technical solutions for the Cloud Data Platforms team in North America, working between 1 PM and 10 PM IST to enable more overlap time.
- Work closely with teams in the US as well as in Europe to ensure a robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions that deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal illustrative sketch follows this list).
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP, enabling Teradata platform decommissioning by end of 2025, with a strong focus on ensuring continued, robust, and accurate regulatory reporting capability.
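For illustration only, the minimal sketch below shows one such pipeline pattern under assumed placeholder names: a streaming Apache Beam job, runnable on Dataflow, that reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. The project, region, subscription, and table identifiers are hypothetical, not actual Ford Credit Europe resources.

    # Minimal illustrative sketch (not a Ford implementation): a streaming
    # Apache Beam pipeline, runnable on Dataflow, that reads JSON events from a
    # Pub/Sub subscription and appends them to a BigQuery table. All project,
    # subscription, and table names are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        options = PipelineOptions(
            streaming=True,
            project="example-project",   # hypothetical project id
            region="europe-west1",       # hypothetical region
            runner="DataflowRunner",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/example-sub")
                | "ParseJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="example-project:analytics.events",  # hypothetical dataset.table
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

In practice, a production pattern of this kind would also carry schema management, dead-letter handling, and the automated data lineage metadata referenced above.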
Position Opportunities:
The Data Engineer role within FCE Data Engineering offers successful individuals the following opportunities:
- Be a key player in a high-priority program to unlock the potential of Data Engineering Products and Services and to secure operational resilience for Ford Credit Europe
- Explore and implement leading-edge technologies, tooling, and software development best practices
- Gain experience of managing data warehousing and product delivery within a financially regulated environment
- Gain experience of collaborative development practices within an open-plan, team-designed environment
- Gain experience of working with third-party suppliers and supplier management
- Continued personal and professional development with support and encouragement for further certification
Qualifications for Candidates
Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience
- 3+ years of Cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch and real time), leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow (Cloud Composer), and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner (see the orchestration sketch following this list)
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large global and diverse team
- Experience developing with microservice architectures on a container orchestration framework.
- Experience designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines
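As a purely illustrative companion to the batch-processing and orchestration items above, the sketch below shows a minimal Cloud Composer (Airflow) DAG that runs a daily BigQuery load. The DAG id, project, dataset, and table names are hypothetical placeholders, not actual resources.

    # Illustrative sketch only: a minimal Cloud Composer (Airflow) DAG that runs
    # a daily batch load into BigQuery. All identifiers are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="example_daily_curated_load",   # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_curated = BigQueryInsertJobOperator(
            task_id="load_curated_events",
            configuration={
                "query": {
                    # Hypothetical source and destination tables.
                    "query": (
                        "SELECT * FROM `example-project.raw.events` "
                        "WHERE DATE(event_ts) = '{{ ds }}'"
                    ),
                    "destinationTable": {
                        "projectId": "example-project",
                        "datasetId": "curated",
                        "tableId": "events_daily",
                    },
                    "writeDisposition": "WRITE_APPEND",
                    "useLegacySql": False,
                },
            },
        )

A pattern like this would typically be deployed through Cloud Build and Terraform rather than by hand, in line with the tooling listed above.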
Desired:
- Professional Certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages and frameworks such as Python, Java, and/or Apache Beam
- Experience of coaching and mentoring Data Engineers
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy