What are the responsibilities and job description for the ETL developer position at OVA.Work?
Job Title: ETL Developer
Location: TX
Job Type: Full-Time
Department: Data Engineering / IT
Reports To: Data Engineering Lead / Project Manager
Job Summary
We are looking for a detail-oriented and experienced ETL Developer to design, develop, and maintain robust ETL processes for our data integration and warehousing solutions. The ideal candidate will have strong technical skills in ETL tools, databases, and SQL, along with the ability to analyze business requirements and transform them into scalable data pipelines.
Key Responsibilities
- Design, develop, and maintain ETL workflows for data extraction, transformation, and loading.
- Collaborate with business analysts, data engineers, and stakeholders to understand data requirements.
- Optimize ETL processes for performance, scalability, and reliability.
- Perform data validation, cleansing, and quality checks to ensure accuracy.
- Develop and maintain data models, schemas, and metadata.
- Troubleshoot and resolve issues related to data integration and ETL jobs.
- Document ETL processes, workflows, and best practices.
- Support production deployments and provide ongoing maintenance.
Qualifications
- Bachelor's degree in Computer Science, IT, or a related field.
- 2-5 years of experience as an ETL Developer or in a related data engineering role.
- Strong proficiency in SQL and relational databases (Oracle, SQL Server, MySQL, PostgreSQL, etc.).
- Hands-on experience with ETL tools (Informatica, Talend, SSIS, DataStage, Pentaho, or equivalent).
- Knowledge of data warehousing concepts, star/snowflake schemas, and dimensional modeling.
- Experience with scripting languages (Python, Shell, etc.) for automation.
- Familiarity with cloud-based data platforms (AWS Redshift, Azure Data Factory, GCP BigQuery) is a plus.
- Strong analytical, problem-solving, and debugging skills.
Nice to Have
- Experience with big data technologies (Hadoop, Spark).
- Familiarity with real-time data streaming (Kafka, Kinesis).
- Exposure to DevOps tools for CI/CD of data pipelines.
- Understanding of data governance and security best practices.
What We Offer
- Competitive compensation and benefits.
- Opportunity to work on large-scale data engineering projects.
- Professional development and training opportunities.
- Collaborative and growth-oriented environment.