What are the responsibilities and job description for the AWS Data Architect position at Princeton IT Services, Inc.?
Job Title: AWS Data Architect
Location: Neenah, WI or Oconomowoc, WI (Onsite)
Job type: Contract
Position Overview
We are looking for a highly skilled AWS Data Architect to design, build, and optimize cloud-based data platforms that support scalable analytics and business intelligence. The ideal candidate will have strong expertise in AWS services, data modeling, data lakes, ETL pipelines, and modern big data ecosystems. Open to candidates willing to relocate to Wisconsin.
Key Responsibilities
- Design and implement end-to-end data architectures on AWS, including data lakes, data warehouses, and real-time streaming solutions.
- Define data ingestion, transformation, and storage strategies using AWS-native services such as Glue, Lambda, EMR, S3, Redshift, and Athena.
- Architect and optimize ETL/ELT pipelines, ensuring security, reliability, and high performance.
- Collaborate with data engineers, analysts, and business teams to translate business requirements into scalable technical solutions.
- Establish and maintain data governance, security, and compliance frameworks (IAM, KMS, Lake Formation).
- Improve data systems for performance, cost-efficiency, and scalability.
- Lead data migration projects from on-prem or other cloud platforms into AWS.
- Provide technical leadership, guidance, and mentorship to data engineering teams.
Required Skills & Experience
- 10 years of experience in Data Architecture, Data Engineering, or Cloud Architecture.
- Strong hands-on experience with AWS services:
  - Storage & Compute: S3, EC2, Lambda, ECS, EKS
  - Data Processing: Glue, EMR, Kinesis, Step Functions
- Expertise in data modeling, data lake design, and distributed processing.
- Strong understanding of ETL/ELT frameworks and big data ecosystems.
- Experience with security, governance, and AWS best practices.
- Excellent communication and stakeholder management skills.