What are the responsibilities and job description for the Data Engineer position at TPI Global Solutions?
Role: Data Engineer (Builder)
Location: Montgomery, AL (Onsite)
Duration: Contract to Hire
Visa: USC or GC Only
Experience Needed
- 5–7 years in data engineering or database development.
- Hands‑on experience with SQL Server ETL/ELT pipelines.
- Experience integrating pipelines with cloud services (AWS Glue, Azure Data Factory, GCP Dataflow).
- Familiarity with streaming technologies (Kafka, Kinesis).
- Experience in data modeling and architecture design.
- Proficiency in Python, Scala, or Java for pipeline development.
- Exposure to DevOps automation (Terraform, Ansible) and containerization (Docker, Kubernetes).
- Preferred: DevOps and automation certifications (HashiCorp Terraform Associate, AWS Certified DevOps Engineer – Professional).
- Preferred: Advanced programming depth with applied coursework or certifications (Python Institute PCPP, Scala Professional Certification).
- Preferred: Data modeling specialization with advanced coursework or vendor‑specific training (Snowflake, AWS Big Data Specialty).
Education
- Bachelor’s degree in Computer Science, Software Engineering, or related technical field.
Certifications (Preferred)
- AWS Certified Data Engineer – Associate
- Microsoft Certified: Azure Data Engineer Associate
- Google Professional Data Engineer
Software Use
- SQL Server (ETL/ELT pipelines, stored procedures).
- Orchestration and transformation tools (Airflow, dbt).
- Cloud integration services (AWS Glue, Azure Data Factory, GCP Dataflow).
- Observability tools (OpenLineage, Monte Carlo).
- DevOps automation tools (Terraform, Ansible).
- Containerization platforms (Docker, Kubernetes).