PulsePoint
Data Engineer (Remote FTE, India)
Job Description
- Design, build, and maintain reliable, scalable, enterprise-level distributed transactional data-processing systems that scale the existing business and support new business initiatives
- Optimize jobs to use Kafka, Hadoop, Presto, Spark, and Kubernetes resources as efficiently as possible
- Monitor and provide transparency into data quality across systems (accuracy, consistency, completeness, etc.)
- Increase accessibility and effectiveness of data (work with analysts, data scientists, and developers to build/deploy tools and datasets that fit their use cases)
- Collaborate within a small team with diverse technology backgrounds
- Provide mentorship and guidance to junior team members
Team Responsibilities:
- Ingest, validate, and process internal and third-party data
- Create, maintain, and monitor data flows in Spark, Hive, SQL, and Presto for consistency, accuracy, and lag time
- Maintain and enhance the framework for jobs (primarily aggregate jobs in Spark and Hive)
- Create consumers for data in Kafka using Spark Streaming for near-real-time aggregation
- Tool evaluation/selection/implementation
- Backups/Retention/High Availability/Capacity Planning
- Review/approval – ensure that database DDL, Hive framework jobs, and Spark Streaming jobs meet our standards
Technologies We Use:
- Airflow – job scheduling
- Docker – packaged container images with all dependencies
- Graphite/Beacon – monitoring of data flows
- Hive – SQL data warehouse layer for data in HDFS
- Kafka – distributed commit-log storage
- Kubernetes – distributed cluster resource manager
- Presto – fast parallel data warehouse and data federation layer
- Spark Streaming – near-real-time aggregation
- SQL Server – reliable OLTP RDBMS
- GCP BigQuery – cloud data warehouse
Requirements
- 5+ years of software engineering experience
- Fluency in Python; experience in Scala/Java is a huge plus (polyglot programmers preferred!)
- Hive experience
- Proficiency in Linux
- Strong understanding of RDBMSs and SQL
- Passion for engineering and computer science around data
- Willing and able to work East Coast U.S. hours (9am–6pm EST)
- Willingness to participate in 24×7 on-call rotation
- Knowledge of and exposure to distributed production systems (e.g., Hadoop) is a huge plus
- Knowledge of and exposure to cloud migration is a plus
Note that this is a full-time employee role.

Selection Process:
1) Initial Screen (30 mins)
2) Hiring Manager Interview (45 mins)
3) Tech Challenge
4) Team Interview (60 mins + 3 x 45 mins) + SVP of Engineering (15 mins)
5) WebMD Sr. Director, DBA (30 mins)

WebMD and its affiliates are an Equal Opportunity/Affirmative Action employer and do not discriminate on the basis of race, ancestry, color, religion, sex, gender, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law.