What are the responsibilities and job description for the AWS Software Engineer III-Databricks/Python/PySpark position at JPMorgan Chase?
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As an AWS Software Engineer III-Databricks/Python/PySpark at JPMorgan Chase within the Corporate Sector-Global Finance Team, you will be a key member of an agile team, tasked with designing and delivering cutting-edge products that are secure, stable, and scalable. Your role involves implementing essential technology solutions across diverse technical domains to support the firm's business goals effectively.
Job responsibilities
- Developing and optimizing data pipelines and workflows to support data integration, transformation, and analysis (see the PySpark sketch after this list)
- Implementing best practices for data management, ensuring data quality, security, and compliance
- Writing secure, high-quality production code following AWS best practices and deploying it efficiently using CI/CD pipelines
- Creating architecture and design documents for complex applications, ensuring software code meets design constraints
- Identifying hidden issues and patterns in data to enhance coding practices and system architecture
- Contributing to software engineering communities, promoting diversity, opportunity, inclusion, and respect within the team
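As a loose illustration of the first responsibility above, here is a minimal PySpark sketch of an integrate-transform-write pipeline. The bucket paths, column names, and aggregation are hypothetical placeholders for illustration, not details taken from the posting.

```python
# Minimal PySpark sketch: integrate raw data, transform it, write a curated output.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-daily-pipeline").getOrCreate()

# Integrate: read raw transaction data landed in S3 (hypothetical path)
raw = spark.read.parquet("s3://example-raw-bucket/transactions/")

# Transform: deduplicate, apply a basic data-quality guard, derive a date column
cleaned = (
    raw.dropDuplicates(["transaction_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("trade_date", F.to_date("trade_ts"))
)

# Analyze: a simple daily aggregate for downstream consumers
daily = cleaned.groupBy("trade_date").agg(
    F.sum("amount").alias("total_amount"),
    F.count("*").alias("txn_count"),
)

# Write the curated output, partitioned for efficient downstream queries
daily.write.mode("overwrite").partitionBy("trade_date") \
     .parquet("s3://example-curated-bucket/daily_totals/")
```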
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3 years of applied experience
- Experience with Spark and SQL
- Expertise in Lakehouse/Delta Lake architecture, system design, application development, testing, and operational stability (see the Delta Lake and SparkSQL sketch after this list)
- Strong programming skills in Python/PySpark
- Proficient in orchestration using Airflow (see the Airflow DAG sketch after this list)
- In-depth knowledge of Big Data and data warehousing concepts
- Proficient in SQL/SparkSQL
- Experience with CI/CD processes
- Thorough understanding of the Software Development Life Cycle (SDLC)
- Solid understanding of agile methodologies, including DevOps practices, application resiliency, and security measures
- Proven expertise in software applications and technical processes within a specialized technical domain
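To ground the Lakehouse/Delta Lake and SparkSQL items, here is a minimal sketch of a Delta MERGE (upsert) followed by a SparkSQL query, assuming a Databricks or Delta-enabled Spark runtime. The table, path, and column names are hypothetical.

```python
# Minimal Delta Lake sketch: upsert (MERGE) into a Delta table, then query it
# with SparkSQL. Assumes a Delta-enabled runtime and that the target table
# already exists; all names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incoming changes (hypothetical path)
updates = spark.read.parquet("s3://example-raw-bucket/positions_delta/")

# Upsert into the existing Delta table keyed on position_id
target = DeltaTable.forName(spark, "finance.positions")
(target.alias("t")
    .merge(updates.alias("u"), "t.position_id = u.position_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())

# SparkSQL over the same table
spark.sql("""
    SELECT book, SUM(market_value) AS total_mv
    FROM finance.positions
    GROUP BY book
""").show()
```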
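For the Airflow orchestration item, here is a minimal DAG sketch chaining a Spark job and a downstream publish step. The DAG id, schedule, and commands are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal Airflow sketch: a daily DAG that runs a Spark job, then a publish step.
# DAG id, schedule, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_positions_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit --deploy-mode cluster jobs/daily_positions.py",
    )
    publish_report = BashOperator(
        task_id="publish_report",
        bash_command="python jobs/publish_report.py",
    )
    # The publish step runs only after the Spark job succeeds
    run_spark_job >> publish_report
```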
Preferred qualifications, capabilities, and skills
- Experience in full-stack development with strong proficiency in Python
- Experience with Databricks and the AWS cloud ecosystem
- Familiarity with Snowflake, Terraform, and LLMs
- Exposure to cloud technologies such as AWS Glue, S3, SQS/SNS, and Lambda (see the Lambda sketch after this list)
- Familiarity with Data Observability, Data Quality, Query Optimization, and Cost Optimization
- AWS certifications such as Solutions Architect Associate (SAA), Developer Associate, or Data Analytics Specialty, or a Databricks certification
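As a small example of the AWS-native skills listed above, here is a hedged Lambda handler sketch that reacts to an S3 object-created event and publishes a notification to SNS. The topic ARN, bucket, and event wiring are hypothetical.

```python
# Minimal AWS Lambda sketch (Python runtime): handle an S3 object-created event
# and forward a notification to SNS. The topic ARN is a hypothetical placeholder.
import json

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:example-topic"  # hypothetical

def handler(event, context):
    # S3 event notifications deliver one or more records per invocation
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps({"bucket": bucket, "key": key}),
            Subject="New object landed",
        )
    return {"statusCode": 200}
```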