What are the responsibilities and job description for the Senior AWS/Snowflake Data Engineer position at Heitmeyer Consulting?
Heitmeyer Consulting has a banking client with a need within its Chief Data Office for a Senior AWS / Snowflake Data Engineer in a contract-to-hire role to help design, build, and operate modern cloud-based data solutions. This is a hands-on engineering role focused on AWS-native data pipelines, Apache Iceberg lakehouse patterns, and Snowflake analytics platforms.
The ideal candidate is a strong individual contributor who enjoys building production-ready data solutions, collaborating with analytics and business teams, and helping shape scalable data platforms that support critical business decisions. Role must be based in Dallas, TX.
Top Required Skills:
- Strong hands-on experience as a Data Engineer in AWS
- Proven experience with Apache Iceberg (table design, schema evolution, partitioning, maintenance)
- Solid experience designing and operating Snowflake data models for analytics and consumption
- Proficiency in Python, Java, or Scala
- Advanced SQL skills (including Snowflake SQL)
- Experience building ETL / ELT pipelines using native AWS services (S3, Glue, Lambda, Step Functions, EventBridge)
- Experience developing secure, scalable REST APIs for data access
- Strong understanding of data quality, monitoring, and error handling
- Ability to work independently while collaborating across engineering, analytics, and business teams
Preferred Skills:
- Apache Spark and big-data ecosystem experience
- Workflow orchestration tools (Airflow or AWS-native orchestration)
- Terraform / Infrastructure as Code experience
- CI/CD and DevOps practices for data platforms
- Experience optimizing cloud cost and performance
- Financial services or other regulated-industry experience
- AWS or cloud?data certifications
Responsibilities:
- Design and build AWS lakehouse architectures using S3 and Apache Iceberg
- Develop and support end-to-end data pipelines using AWS-native services
- Engineer and maintain Iceberg datasets, including schema evolution and table maintenance
- Design and implement Snowflake schemas, tables, and views for analytics and data products
- Build and maintain secure data APIs for internal applications and analytics teams
- Optimize data solutions for performance, scalability, and cost efficiency
- Implement data quality checks, validation, and observability
- Partner with analysts, data scientists, and engineers to deliver reusable data products
- Troubleshoot and resolve data pipeline and platform issues
- Maintain clear technical documentation