Pacific Gas and Electric Company

Data Engineer

10 March 2024
$98,000 - $146,000

Job Description

Requisition ID # 155570 

Job Category: Information Technology 

Job Level: Individual Contributor

Business Unit: Information Technology

Work Type: Hybrid

Job Location: Oakland

Department Summary

The IT Data, Analytics, & Insights organization is an enterprise team responsible for working collaboratively across various lines of business (e.g., Gas Operations, Electric Operations, Safety, Energy Procurement) and is focused on unlocking the value of PG&E’s data to support the company’s Wildfire Safety Program and True North Strategy. As part of that focus, we deliver data- and AI/ML-centric products to support these initiatives. Products being delivered include Remote Inspections, AI-Enabled Inspections, Vegetation Management through LiDAR capabilities, Transmission Line Asset Master, Electric Distribution Asset Master, Asset Risk Modeling, and a Cloud Native Foundational Platform.

A critical part of how we operate is applying design thinking, following the Agile development methodology, and working co-located. Through these principles, we work as product teams to deliver valuable products to our business.

Position Summary

The Data Analytics and Insights team is seeking an experienced and talented Data Engineer to join our growing team of analytics experts. As a key member of the team, you will play an essential role in the design, development, and maintenance of data pipelines and analytic products, including data applications, reports, and dashboards. You should be a proactive, detail-oriented, and motivated individual who can thrive in a fast-paced environment and help us scale our analytic product development to meet our clients’ ever-evolving needs. In this role, you will collaborate with a cross-functional team of solution architects, data pipeline engineers, data analysts, and data scientists on mission-critical initiatives and will ensure optimal delivery of analytic products.

You will have a unique opportunity to be at the forefront of the utility industry and gain a comprehensive view of the nation’s most advanced smart grid. It is the perfect role for someone who wants to keep building on their professional experience while helping advance PG&E’s sustainability goals.

This role is hybrid, working primarily from your remote office with in-person collaboration at the Oakland General Office one to two times per month, or as business needs require.

PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity. Although we estimate the successful candidate hired into this role will be placed between the entry point and the middle of the range, the decision will be made on a case-by-case basis related to these factors. This job is also eligible to participate in PG&E’s discretionary incentive compensation programs.

A reasonable salary range is:

Bay Area Minimum: $98,000

Bay Area Maximum: $146,000

Job Responsibilities

  • Assembles large, complex sets of data that meet non-functional and functional business requirements.
  • Builds high-performance data pipelines and prototypes that enable business use of the data.
  • Builds infrastructure for optimal extraction, transformation, and loading of data from various data sources (a minimal sketch follows this list).
  • Understands business requirements and applies them to complex software engineering and analysis.
  • Communicates recommendations, both oral and written, to peers inside the department.
  • Partners with team members to understand and incorporate standards information and requirements into work procedures.
  • Identifies and analyzes departmental standards, norms, and new goals/objectives.
  • Assists data, design, product, and executive teams with data-related technical issues.
  • Understands the infrastructure that allows big data to be accessed and analyzed.
  • Utilizes department standard issue tracking, source control, and documentation tools.
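
The pipeline-building work described above might look, in minimal form, like the PySpark sketch below; Spark is one of the tools named in the qualifications. The file paths, column names, and defect-score threshold are illustrative assumptions for this posting, not actual PG&E data sources.

# Minimal extract-transform-load sketch in PySpark. All names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("inspection-etl-sketch").getOrCreate()

# Extract: read raw asset-inspection records from an assumed CSV source.
raw = spark.read.csv("raw/inspections.csv", header=True, inferSchema=True)

# Transform: normalize the date column, drop rows missing an asset ID,
# and derive a review flag from an assumed defect_score column.
clean = (
    raw.withColumn("inspected_at", F.to_date("inspected_at", "yyyy-MM-dd"))
       .filter(F.col("asset_id").isNotNull())
       .withColumn("needs_review", F.col("defect_score") > 0.8)
)

# Load: write partitioned Parquet for downstream reports and dashboards.
clean.write.mode("overwrite").partitionBy("region").parquet("curated/inspections")

spark.stop()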

Qualifications

Minimum:

  • Bachelor’s degree in computer science, engineering, or a related field, or equivalent work experience
  • 3 years of experience with the data engineering/ETL ecosystem, such as Palantir Foundry, Spark, Informatica, SAP BODS, or OBIEE

Desired:

  • 3 years of experience with the data engineering/ETL ecosystem, such as Palantir Foundry, Spark, Informatica, SAP BODS, or OBIEE
  • Experience with any of the following: Elasticsearch, dbt, Airflow, Palantir Foundry, data quality tools, Collibra, MDM, Informatica, Spark, Snowflake, Teradata, SAP Business Warehouse, the Business Objects suite, Tableau, SAS Enterprise Miner, other database and BI technologies, open-source Hadoop and related technologies, and data access languages such as SQL, SAS, R, Python, and Scala
  • Experience with data engineering and data transformations gained through a training or apprenticeship program is acceptable.
  • Knowledge of Business Intelligence and data access tools.
  • Knowledge of software engineering principles such as unit testing, CI/CD, and source control (a brief unit-test sketch follows this list).
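
To illustrate the unit-testing principle in the last bullet, here is a small, self-contained pytest sketch; the transformation and its parsing rules are hypothetical examples for this posting, not PG&E code.

# A tiny pytest sketch for a data transformation. Hypothetical example code.
import pytest

def normalize_voltage(raw: str) -> float:
    """Parse a reading such as '12.5kV' or '480V' into volts."""
    value = raw.strip().upper()
    if value.endswith("KV"):
        return float(value[:-2]) * 1000.0
    if value.endswith("V"):
        return float(value[:-1])
    raise ValueError(f"unrecognized voltage reading: {raw!r}")

def test_kilovolts_are_scaled_to_volts():
    assert normalize_voltage("12.5kV") == 12500.0

def test_plain_volts_pass_through():
    assert normalize_voltage("480V") == 480.0

def test_unparseable_reading_raises():
    with pytest.raises(ValueError):
        normalize_voltage("n/a")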