What are the responsibilities and job description for the ETL Developer position at Alignity Solutions?
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
Requirements
We are looking for a skilled and versatile ETL Developer with strong expertise in ETL/ELT pipeline development, data modeling and design, cloud technologies (especially AWS), and automation practices (CI/CD). The ideal candidate will have experience across all stages of the data lifecycle, from ingestion and transformation to orchestration and deployment, using tools such as IICS (Informatica Intelligent Cloud Services), Python/PySpark, and shell scripting.
Key Responsibilities:
Design, build, and maintain robust ETL/ELT data pipelines using IICS, Python, or PySpark to support large-scale data processing and analytics (see the PySpark sketch after this list).
Collaborate with data architects and analysts to design scalable data models and processing solutions.
Develop and maintain shell scripts for task automation, job orchestration, and system monitoring.
Work closely with DevOps teams to implement CI/CD pipelines for data solutions, ensuring fast and reliable deployments.
Deploy and manage data workflows and infrastructure on AWS cloud services (e.g., S3, Lambda, Glue, EMR, Redshift, Athena).
Ensure data quality, integrity, and compliance through testing, validation, and monitoring frameworks.
Participate in performance tuning and optimization of ETL jobs and data processing applications.
Troubleshoot data pipeline failures and perform root cause analysis and resolution.
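To give a concrete flavor of the pipeline work described above, here is a minimal PySpark sketch of a batch extract-transform-load flow. It is illustrative only; the bucket names, paths, and column names are hypothetical and not tied to any specific project.

```python
# Minimal illustrative PySpark batch pipeline: read raw events from S3,
# apply basic cleansing, and write the result back as partitioned Parquet.
# All bucket names, paths, and columns below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Extract: load raw JSON events from a (hypothetical) landing bucket.
raw = spark.read.json("s3://example-landing-bucket/events/")

# Transform: deduplicate, filter out malformed rows, derive a partition key.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# Load: write partitioned Parquet to a (hypothetical) curated zone.
(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))
```

In a production setting, a job like this would typically be parameterized, scheduled through an orchestrator, and monitored, rather than run ad hoc.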
Required Skills:
ETL/ELT Tools: Hands-on experience with Informatica IICS or similar platforms.
Programming: Strong proficiency in Python and/or PySpark for data transformation and processing.
Scripting: Advanced knowledge of Shell scripting in Unix/Linux environments.
Cloud: Experience working with AWS services such as S3, EC2, Glue, Redshift, and Lambda (see the boto3 sketch after this list).
CI/CD: Familiarity with tools like Jenkins, GitLab CI, or AWS CodePipeline.
Data Modeling & Design: Ability to translate business requirements into scalable and efficient data architectures.
Strong problem-solving and communication skills, with an ability to collaborate across technical and business teams.
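As a small illustration of the AWS and automation skills listed above, the following sketch starts an AWS Glue job from Python with boto3 and polls for completion. The job name and region are hypothetical placeholders.

```python
# Illustrative sketch: trigger a (hypothetical) AWS Glue job with boto3
# and poll until it reaches a terminal state.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Kick off the job run and capture its run ID.
run = glue.start_job_run(JobName="example-etl-job")
run_id = run["JobRunId"]

# Poll the run status every 30 seconds until it finishes.
while True:
    status = glue.get_job_run(JobName="example-etl-job", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"Job finished with state: {state}")
        break
    time.sleep(30)
```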
Nice to Have:
Exposure to data governance, metadata management, or data cataloging tools.
Knowledge of SQL tuning and performance optimization techniques.
Experience with monitoring tools (e.g., CloudWatch, DataDog).
Understanding of Agile/Scrum methodologies.
Educational Qualifications:
Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Benefits