Western Digital
Staff Engineer, Data Analytics Engineering
Company Description
At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible.
At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we’ve been doing just that. Our technology helped people put a man on the moon.
We are a key partner to some of the largest and highest growth organizations in the world. From energizing the most competitive gaming platforms, to enabling systems to make cities safer and cars smarter and more connected, to powering the data centers behind many of the world’s biggest companies and public cloud, Western Digital is fueling a brighter, smarter future.
Binge-watch any shows, use social media, or shop online lately? You’ll find Western Digital supporting the storage infrastructure behind many of these platforms. And that flash memory card that captures and preserves your most precious moments? That’s us, too.
We offer an expansive portfolio of technologies, storage devices and platforms for business and consumers alike. Our data-centric solutions comprise the Western Digital®, G-Technology™, SanDisk® and WD® brands.
Today’s exceptional challenges require your unique skills. It’s You & Western Digital. Together, we’re the next BIG thing in data.
Job Description
- Minimum of six years of experience developing ETL jobs using any industry-leading ETL tool.
- Ability to design, develop, and optimize Apache Spark applications for large-scale data processing.
- Ability to implement efficient data transformation and manipulation logic using Spark RDDs and DataFrames (see the illustrative Spark sketch after this list).
- Ability to design, implement, and maintain Apache Kafka pipelines for real-time data streaming and event-driven architectures (see the Kafka sketch after this list).
- Strong development skills and deep technical expertise in Python, Scala, NiFi, and SQL/stored procedures.
- Working knowledge of the Unix/Linux operating system and tools such as awk, ssh, and crontab.
- Ability to write Transact-SQL and to develop and debug stored procedures and user-defined functions in Python.
- Working experience with Postgres and/or Redshift databases is required.
- Exposure to CI/CD tools such as Bitbucket, Jenkins, Ansible, Docker, and Kubernetes is preferred.
- Solid understanding of relational database systems and their concepts.
- Ability to handle large tables/datasets of 2+ TB in a columnar database environment.
- Ability to integrate data pipelines with Splunk/Grafana for real-time monitoring, analysis, and visualization.
- Ability to create and schedule Airflow jobs (see the Airflow sketch after this list).
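For candidates gauging the day-to-day work, here is a minimal PySpark sketch of the kind of DataFrame transformation described above. The application name, bucket paths, column names, and schema are assumptions for illustration, not details from this posting:

```python
# Illustrative PySpark sketch: read raw events from columnar storage,
# clean them, aggregate per device, and write partitioned output.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Read a columnar (Parquet) source into a DataFrame.
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: drop bad rows, derive a date column, count events per device/day.
daily_counts = (
    events
    .filter(F.col("device_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("device_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write partitioned output back to columnar storage.
daily_counts.write.mode("overwrite").partitionBy("event_date") \
    .parquet("s3://example-bucket/curated/daily_counts/")
```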
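Similarly, a minimal sketch of a Kafka publish/consume loop using the kafka-python client; the broker address, topic name, and consumer group are hypothetical:

```python
# Illustrative Kafka sketch: one producer publishing JSON events and one
# consumer reading them back. Broker, topic, and group id are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("device-telemetry", {"device_id": "d-001", "temp_c": 41.5})
producer.flush()

consumer = KafkaConsumer(
    "device-telemetry",
    bootstrap_servers="localhost:9092",
    group_id="telemetry-etl",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,  # stop iterating after 10 s of silence
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # downstream transformation would go here
```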
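And a minimal Airflow sketch showing how a daily job might be defined and scheduled; the DAG id, schedule, and task callables are placeholders:

```python
# Illustrative Airflow sketch: a daily DAG with two dependent Python tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")

def load():
    print("load data into warehouse")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```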
Qualifications
- Minimum of a bachelor’s degree in computer science or engineering. Master’s degree preferred.
- AWS Developer certification is preferred.
- Certification in SDLC (Software Development Life Cycle) methodology, integrated source control systems, or continuous development and continuous integration is preferred.
Additional Information
Western Digital thrives on the power and potential of diversity. As a global company, we believe the most effective way to embrace the diversity of our customers and communities is to mirror it from within. We believe the fusion of various perspectives results in the best outcomes for our employees, our company, our customers, and the world around us. We are committed to an inclusive environment where every individual can thrive through a sense of belonging, respect and contribution.
Western Digital is committed to offering opportunities to applicants with disabilities and ensuring all candidates can successfully navigate our careers website and our hiring process. Please contact us at [email protected] to advise us of your accommodation request. In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying.