What are the responsibilities and job description for the ETL Engineer position at eAspire Technolabs Inc.?
Role: ETL Engineer
Location: Windsor, CT (Onsite)
Visa: H-1B
Experience: 12 Years
Job Summary:
We are seeking a skilled ETL Engineer to design, develop, and maintain scalable data integration solutions. The ideal candidate will be responsible for extracting data from various sources, transforming it into usable formats, and loading it into data warehouses or analytical systems. This role requires strong experience with ETL tools, SQL, data modeling, and performance optimization.
Key Responsibilities:
- Design, develop, and maintain ETL workflows and data pipelines for large-scale data integration.
- Extract, transform, and load data from various structured and unstructured sources into target systems (e.g., Data Warehouse, Data Lake).
- Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
- Optimize ETL processes for performance, scalability, and reliability.
- Ensure data quality, accuracy, and integrity across all stages of data processing.
- Monitor ETL jobs, troubleshoot issues, and perform root cause analysis for failures or delays.
- Implement automation and scheduling for ETL processes using tools such as Airflow, Control-M, or similar.
- Support data migration, cleansing, and validation activities as part of ongoing data initiatives.
- Create and maintain technical documentation for ETL processes and data flows.
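The extract-transform-load cycle described in the responsibilities above can be sketched as a minimal Python pipeline. This is an illustration only, not part of the role description: the table names, columns, and cleaning rule are hypothetical, and SQLite stands in for the relational databases named below.

```python
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL sketch: extract raw rows, transform them, load into a target table."""
    cur = conn.cursor()

    # Extract: pull raw rows from the (hypothetical) source table.
    rows = cur.execute("SELECT order_id, amount_cents FROM raw_orders").fetchall()

    # Transform: drop invalid rows and convert cents to dollars.
    cleaned = [
        (order_id, amount_cents / 100.0)
        for order_id, amount_cents in rows
        if amount_cents is not None and amount_cents >= 0
    ]

    # Load: write the transformed rows into the (hypothetical) warehouse table.
    cur.executemany(
        "INSERT INTO orders_fact (order_id, amount_usd) VALUES (?, ?)", cleaned
    )
    conn.commit()
    return len(cleaned)


if __name__ == "__main__":
    # In-memory database so the sketch is self-contained and runnable.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER)")
    conn.execute("CREATE TABLE orders_fact (order_id INTEGER, amount_usd REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)", [(1, 1250), (2, None), (3, 300)]
    )
    print(run_etl(conn))
```

In practice each stage would be a separate, monitored task (for example an Airflow DAG step) with logging and data-quality checks, rather than a single function as shown here.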
Required Skills and Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 12 years of experience in ETL development and data integration.
- Hands-on experience with ETL tools such as Informatica, Talend, DataStage, SSIS, Pentaho, or similar.
- Strong proficiency in SQL and experience working with relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL).
- Experience with data warehousing concepts, data modeling, and dimensional modeling (Star/Snowflake schema).
- Knowledge of Python, Shell scripting, or other scripting languages for automation.
- Familiarity with cloud-based ETL and data platforms (e.g., AWS Glue, Azure Data Factory, Google Cloud Dataflow, Snowflake).
- Strong analytical, problem-solving, and communication skills.