What are the responsibilities and job description for the Lead SDET with ETL position at APLOMB Technologies?
Role: Lead SDET with ETL
Location: Fort Mill, SC
Must have financial/banking experience
Key Responsibilities
- Design & execute test plans for ETL/ELT workflows, data feeds, and batch extracts, ensuring comprehensive data validation
- Write and maintain manual and automated tests for data accuracy, completeness, and consistency against mapping documents and specification rules
- Develop SQL-based validation logic—supporting complex queries, joins, aggregates, and stored procedures—to verify data correctness
- Collaborate with data engineering, BI, and architecture teams to identify and resolve data quality issues across data pipelines
- Develop or support CI/CD pipelines for automated data quality checks and data-flow validations
- Analyze test results, document defects, track resolution, and report status to stakeholders; act as advocate for quality in an Agile delivery process
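The SQL-based validation work described above can be sketched in Python, the posting's primary automation language. This is a minimal illustration, not the company's actual framework: the table names, the `amount` column, and the checksum rule are all hypothetical, and an in-memory SQLite database stands in for a real source/target pair.

```python
import sqlite3

# Hypothetical source and target tables standing in for an ETL load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, amount REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def validate_load(source, target):
    """Compare row counts and an amount checksum between two tables
    with the same schema, as a basic completeness/accuracy check."""
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {source}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {target}").fetchone()
    return src_count == tgt_count, src_sum == tgt_sum

counts_ok, sums_ok = validate_load("src_trades", "tgt_trades")
print(counts_ok, sums_ok)  # True True
```

In practice such checks would run against the actual warehouse (e.g. Snowflake or Redshift) and compare against mapping documents rather than an identical copy.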
Required Skills & Experience
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
- 5–6 years of experience in QA, SDET, or data quality roles, preferably in data engineering/test automation in the financial services industry
- Strong programming and automation experience in Python or Java, with focus on testing frameworks and automation tools (e.g. Selenium, TestNG, Appium)
- Advanced SQL skills, capable of building complex queries to validate ETL, data loads, and data transformations
- Hands‑on with ETL/Database testing—manual & automated—covering batch pipelines, API integrations, and data feeds
- Experience with AWS-based data environments (e.g. S3, RDS, DynamoDB, Lambda, Snowflake, Redshift) and familiarity with data lakes
- Knowledge of CI/CD tools (Jenkins, TeamCity, GitHub Actions, Octopus), test automation in pipelines, and Agile methodologies including Scrum ceremonies
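The automated data-quality checks mentioned above (run inside a CI/CD pipeline) might look like the sketch below. The record shape, field names, and rules are illustrative assumptions, not taken from the posting; the idea is simply that each check returns failures and the script exits non-zero so the pipeline stage fails.

```python
# Hypothetical batch of loaded records to validate (illustrative data).
records = [
    {"account_id": "A1", "balance": 1200.00, "currency": "USD"},
    {"account_id": "A2", "balance": 0.00, "currency": "USD"},
    {"account_id": "A3", "balance": 87.50, "currency": "EUR"},
]

def check_completeness(rows, required_fields):
    """Flag rows missing any required field or containing None values."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                failures.append((i, field))
    return failures

def check_uniqueness(rows, key):
    """Flag duplicate key values, e.g. primary-key violations after a load."""
    seen, dupes = set(), []
    for row in rows:
        value = row[key]
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

# A failed assert exits non-zero, which marks the CI stage as failed.
assert not check_completeness(records, ["account_id", "balance", "currency"])
assert not check_uniqueness(records, "account_id")
print("data quality checks passed")
```

In a Jenkins or GitHub Actions pipeline this would be one step, invoked after the ETL job completes and before downstream consumers are notified.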
Preferred Qualifications
- Familiarity with data quality tools, metrics frameworks, rule validation engines, or profiling platforms
- Exposure to big data technologies, such as Hadoop, Snowflake, DBT, Kafka, Airflow, FiveTran, and PySpark, in data testing contexts
- Understanding of dimension modeling (star schema, SCD), data governance standards, metadata management, and change data capture patterns
- Financial services domain experience and familiarity with regulatory/operational compliance testing contexts