Mutual of Omaha

Data Engineer (Solution Lead) – Remote

24 October 2024
$110,000 - $145,000 / year

Job Description

As a Data Engineer, you’ll play a crucial role in enabling information delivery and data architectural governance for both internal and external stakeholders. You’ll be responsible for defining and overseeing the data strategy for our Emerging & Strategic Solutions (ESS) business segment, including oversight of other data engineers. You will partner closely with our business data product owner to drive forward the data needs that will enable ESS to double Institutional Annuity product sales by 2027, complete the technology modernization of our Special Risk Insurance systems, and align our data roadmap and team engineering skills to the Enterprise data strategy. Join our team and contribute to transforming data to enable business growth and decision-making!

WHAT WE CAN OFFER YOU:

  • Estimated Salary (Levels have variable responsibilities and qualifications):
    • Engineer II: $110,000 – $130,000, plus annual bonus.
    • Engineer III: $125,000 – $145,000, plus annual bonus.
  • Benefits and Perks, including a 401(k) plan with a 2% company contribution and 6% company match.
  • Regular associates working 40 hours a week can earn up to 15 days of vacation each year.
  • Regular associates receive 11 paid holidays in 2024, including 2 floating holidays that are added to your prorated personal time to be used at your discretion.
  • Regular associates are provided sick leave through the use of personal time. Associates working 40 hours a week can receive up to 40 hours of personal time in 2024, prorated based on start date.
  • Applicants for this position must not now, nor at any point in the future, require sponsorship for employment.

WHAT YOU’LL DO:

  • Over the next two years, key deliverables include:
    • ESS Cloud Data Strategy: To align with enterprise goals, we must refine the migration path to Snowflake and expand data visualization capabilities with Sigma or another tool. Over the next 18 months, we will migrate 7 years of Special Risk data to the cloud, supporting data visualization for a broker portal.
    • Institutional Annuity Growth Catalyst Data Strategy: Develop an ETL framework, centralize data, and enable real-time analytics. Streamline SDLC processes and establish a single source of truth for SQL changes in SSRS and Sigma. Use GitHub for documentation and Alation for metadata management. Improve the health of data across our test environments to enable faster software delivery.
    • Automated Data Testing: Implement automated data validation, reducing manual reporting processes. Leverage Snowflake or other tools for error handling and validation across data sources. Focus on automating the validation of the 600+ Special Risk reports generated monthly.
  • Design, build, and maintain efficient data systems and pipelines to support both analytic and operational data needs, including data movement, modeling, and reporting.
  • Collaborate with enterprise teams in data science, architecture, and governance to align on standards and optimize production environments for performance and security.
  • Work with business owners to enhance data collection, storage, and usage, ensuring data solutions maximize information value across supported systems.
  • Translate business requirements into data architecture solutions, implementing and optimizing these solutions within production environments.
  • Develop and deploy data orchestration pipelines, including data sourcing, cleansing, and quality control, as well as basic machine learning models in partnership with business units.

WHAT YOU’LL BRING:

  • At least 5 years of expert-level experience solving build, development, and implementation challenges with:
    Traditional data technologies and methodologies, such as RDBMS, SQL, and ETL/ELT, as well as emerging technologies such as data science, programming languages, and cloud platforms, including Python development experience.
  • Foundational knowledge in analyzing data systems for structure, relationships, and impact to create effective data pipelines that align with business needs via ETL processes and data pipeline management.
  • Cloud Data Platforms & Data Architecture: Experience with cloud data warehousing platforms (preferably Snowflake), OLAP architecture, data modeling, and building data architectures for internal and external reporting. Proficiency in version control (Git) and change management to manage data pipelines, reporting changes, and code deployments.
  • Data Governance & Visualization: Strong understanding of data governance, security, and compliance (e.g., HIPAA, GDPR). Ability to create compelling visualizations using tools like Sigma that communicate data insights and support data-driven decision-making.
  • Collaboration & Problem Solving: Excellent communication and teamwork skills, with the ability to collaborate across teams, quickly understand business context, and implement data solutions that align with organizational goals. Strong problem-solving and learning agility.
  • You promote a culture of diversity and inclusion, value different ideas and opinions, and listen courageously, remaining curious in all that you do.
  • Able to work remotely with access to a high-speed internet connection, and located in the United States or Puerto Rico.

PREFERRED:

  • Intermediate to expert knowledge of Informatica, SSRS, AWS, and Snowflake.

We value diverse experience, skills, and passion for innovation. If your experience aligns with the listed requirements, please apply! 

If you have questions about your application or the hiring process, email our Talent Acquisition area at [email protected]. Please allow at least one week from the time of application before checking on the status.
