What are the responsibilities and job description for the Lead Big Data Software Engineer - Spark, Java position at JPMorgan Chase?
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you.
As a Lead Software Engineer at JPMorgan Chase within Corporate Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for design, development, and technical troubleshooting across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
- Develops secure, high-quality production code, and reviews and debugs code written by others
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5 years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability, including AWS cloud computing
- Advanced proficiency in one or more programming languages (Java, Python, Scala), plus strong SQL and database skills
- Experience with data structures, algorithms, and efficient coding practices
- Experience with Spark and distributed computing
- Demonstrated proficiency in code optimization and refactoring to enhance scalability and performance
- Ability to collaborate well with global teams in geographically distributed locations across time zones
- Advanced understanding of agile methodologies, including CI/CD, application resiliency, and security
- Experience with automation and continuous delivery methods
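The Spark and distributed-computing requirements above center on partitioned map/reduce-style aggregation. As a minimal illustration of that pattern in Java, the sketch below uses parallel streams as a stand-in for Spark's partitioned datasets (Spark itself requires a cluster runtime and the Spark libraries; the class and method names here are hypothetical, chosen only for the example):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Word count: the canonical map/reduce aggregation that Spark generalizes.
// parallelStream() stands in for Spark's partitioned RDD/Dataset processing.
public class WordCount {
    public static Map<String, Long> count(List<String> lines) {
        return lines.parallelStream()                                    // "partitions"
                .flatMap(line -> Arrays.stream(line.split("\\s+")))      // map phase: line -> words
                .collect(Collectors.groupingBy(w -> w,
                        Collectors.counting()));                         // reduce phase: word -> count
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("to be or not to be"));
        System.out.println(counts.get("to")); // prints 2
    }
}
```

In Spark proper the same shape appears as `flatMap` followed by `reduceByKey` (or a `groupBy`/`count` on a Dataset), with the runtime handling partitioning and shuffles across the cluster.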
Preferred qualifications, capabilities, and skills
- Exposure to cloud technologies (AWS)