JPMorganChase
Software Engineer III – Databricks and PySpark Developer
Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III – Databricks and PySpark Developer at JPMorgan Chase within Commercial and Investment Banking, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Supports review of controls to ensure sufficient protection of enterprise data, and advises on and makes custom configuration changes in one to two tools to generate a product at the business's or customer's request
- Updates logical or physical data models based on new use cases; uses SQL and understands NoSQL databases and their niche in the marketplace
- Analyzes, designs, develops, and drives performance enhancements, focusing on significantly increasing default ingestion speeds to meet substantial data demands and ensure systems operate at peak efficiency
- Implements automation, optimization, performance tuning, and scaling techniques to ensure efficient pipeline performance
- Handles new and complex challenges, continuously seeking innovative solutions to improve data processing and performance
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Adds to team culture of diversity, opportunity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Proficiency in Databricks, AWS, and PySpark for data processing and analytics, including Databricks cluster configuration and Unity Catalog repository configuration
- Strong programming skills in Python, with experience writing complex SQL queries
- Advanced SQL skills (e.g., joins, aggregations, and window functions)
- Experience in data engineering and cloud architecture, specifically with Databricks and AWS
- Proven experience and ability to migrate data load models developed on an ETL framework to multi-node Databricks compute
- Understanding of system architectures and design patterns, and the ability to design and develop applications using these principles
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Solid understanding of agile methodologies and practices such as CI/CD, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills
- Experience in data modeling
- Experience with AI/ML models
- Working understanding of NoSQL databases
- Experience with statistical data analysis and the ability to determine appropriate tools and data patterns to perform analysis
- Excellent problem-solving skills to structure the right analytical solutions, with a strong sense of teamwork, ownership, and accountability
- Ability to work in a fast-paced environment with tight schedules
- Strong understanding of developing data warehouses, data marts, etc.
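The "advanced SQL" expectation above (joins, aggregations, window functions) can be illustrated with a minimal, self-contained sketch. It uses Python's built-in sqlite3 module (which supports window functions with SQLite 3.25+) purely for demonstration; the table and column names are invented for the example and are not part of this role's actual systems.

```python
import sqlite3

# In-memory database purely for illustration; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (trade_id INTEGER, desk TEXT, notional REAL);
    INSERT INTO trades VALUES
        (1, 'rates', 100.0), (2, 'rates', 250.0),
        (3, 'fx',    75.0),  (4, 'fx',    300.0), (5, 'fx', 50.0);
""")

# Window functions alongside an aggregation: each trade's desk total
# and its rank within the desk by notional.
rows = conn.execute("""
    SELECT trade_id,
           desk,
           notional,
           SUM(notional) OVER (PARTITION BY desk)            AS desk_total,
           RANK() OVER (PARTITION BY desk
                        ORDER BY notional DESC)              AS desk_rank
    FROM trades
    ORDER BY desk, desk_rank
""").fetchall()

for r in rows:
    print(r)
# First row: (4, 'fx', 300.0, 425.0, 1) — largest fx trade, desk total 425.
```

The same pattern carries over directly to Spark SQL or the PySpark DataFrame API (`pyspark.sql.Window`), where window functions are a common interview and day-to-day topic for this kind of role.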
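The performance-tuning and scaling responsibilities listed above often come down to how data is partitioned across a cluster. As a hedged, framework-free sketch of the underlying idea (plain Python standing in for Spark's default hash partitioning; the function and key names are invented for illustration):

```python
from collections import Counter

def hash_partition(keys, num_partitions):
    """Assign each key to a partition the way a hash partitioner does:
    hash(key) modulo the partition count."""
    return {k: hash(k) % num_partitions for k in keys}

# A skewed workload: one hot key accounts for 90% of the records.
records = ["hot"] * 900 + [f"key{i}" for i in range(100)]
assignment = hash_partition(set(records), 8)

# Count records (not distinct keys) per partition to expose the skew:
# every "hot" record lands in the same partition, pinning one worker.
load = Counter(assignment[k] for k in records)
print(sorted(load.values(), reverse=True))
```

In Spark this is the motivation for techniques such as `repartition()`, salting hot keys before a join, or relying on Adaptive Query Execution's skew handling; the sketch only shows why an unsalted hot key concentrates load on a single task.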
EWJD3