What are the responsibilities and job description for the AWS Data Engineer position at Aptonet?
Title: AWS Data Engineer
Location: Seattle, WA
Role Summary
Aptonet is seeking an AWS Data Engineer to design, build, and optimize large-scale data pipelines and analytics solutions on AWS. The role owns the end-to-end data lifecycle: data ingestion, transformation, storage, and delivery for analytics, machine learning, and operational systems.
Key Responsibilities
- Design, build, and optimize ETL/ELT workflows that ingest data from multiple sources, using AWS services such as S3, Redshift, Lake Formation, Glue, and Lambda.
- Implement data cleansing, enrichment, and standardization processes.
- Develop and automate batch and real-time streaming data pipelines using Kinesis, MSK, Lambda, Glue, EMR, and Step Functions.
- Ensure pipelines are optimized for scalability, performance, and fault tolerance.
- Optimize SQL queries, data models, and pipeline performance.
- Design and implement data architecture across data lakes, data warehouses, and lakehouses.
- Optimize data storage strategies including partitioning, indexing, and schema design.
- Integrate data from diverse sources such as databases, APIs, IoT, and third-party systems.
- Collaborate with Data Scientists, Analysts, and BI developers to deliver structured data.
- Document data assets and processes for discoverability.
- Train internal staff to maintain infrastructure and pipelines.
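As a sketch of the cleansing and standardization work described above, the following minimal Python example trims whitespace, normalizes a timestamp, and drops unusable records. The field names, the meter-reading shape, and the assumed `MM/DD/YYYY` input format are all hypothetical and not taken from the posting:

```python
from datetime import datetime
from typing import Optional


def standardize_record(record: dict) -> Optional[dict]:
    """Cleanse and standardize one raw reading (hypothetical schema).

    Rules: trim whitespace, normalize the timestamp to ISO 8601,
    coerce the numeric field, and drop records missing an ID.
    """
    meter_id = (record.get("meter_id") or "").strip()
    if not meter_id:
        return None  # unusable record: no identifier
    # Assumed source format MM/DD/YYYY, normalized to ISO 8601
    ts = datetime.strptime(record["read_date"].strip(), "%m/%d/%Y")
    return {
        "meter_id": meter_id,
        "read_date": ts.date().isoformat(),
        "kwh": float(record.get("kwh") or 0),
    }


raw = {"meter_id": " M-1001 ", "read_date": "07/04/2024", "kwh": "12.5"}
print(standardize_record(raw))
# → {'meter_id': 'M-1001', 'read_date': '2024-07-04', 'kwh': 12.5}
```

In a real pipeline this kind of function would run inside a Glue or EMR job over S3 data rather than on single dicts.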
Required Technical Skills
- Strong experience with AWS services: S3, Redshift, Lake Formation, Glue, Lambda, Kinesis, MSK, EMR, Step Functions.
- Proficiency in SQL, Python, or Scala for data transformation and processing.
- Hands-on experience with Databricks on AWS.
- Working knowledge of DevOps and CI/CD practices.
- Experience designing data pipelines for batch and streaming architectures.
- Experience with utility industry data (meter data, customer data, grid/asset data, work management, outage data).
- Familiarity with IEC CIM standards and utility integration frameworks.
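The batch-versus-streaming distinction in the skills list can be illustrated with a toy tumbling-window aggregator in plain Python. This is a sketch with no AWS dependencies; a Kinesis or MSK consumer applies the same fixed-window grouping idea continuously rather than over an in-memory list:

```python
from collections import defaultdict


def tumbling_window_sums(events, window_seconds=60):
    """Group (epoch_ts, value) events into fixed, non-overlapping
    time windows and sum each window -- the core aggregation a
    streaming job performs as records arrive."""
    windows = defaultdict(float)
    for ts, value in events:
        # Align each event to the start of its window
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += value
    return dict(windows)


events = [(0, 1.0), (30, 2.0), (65, 4.0), (130, 8.0)]
print(tumbling_window_sums(events))
# → {0: 3.0, 60: 4.0, 120: 8.0}
```

A batch job computes the same sums over a bounded dataset after the fact; the streaming version must emit window results incrementally and tolerate late or out-of-order events.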
Preferred / Nice-to-Have Skills
- Experience with Infrastructure as Code (IaC) tools such as Terraform.
Qualifications & Experience
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- 5 years of experience in data engineering roles.
- U.S. Citizenship or Green Card required.
- No security clearance required.
- Standard work schedule (40 hours/week); no overtime required.