What are the responsibilities and job description for the ETL Developer position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, INNOVIT USA INC, is seeking the following. Apply via Dice today!
Hiring: W2 Candidates Only
Visa: Open to any visa type with valid work authorization in the USA
Key Responsibilities
- Design, develop, and implement ETL workflows and data pipelines using tools such as Informatica, Talend, SSIS, DataStage, or similar.
- Analyze source systems, define data extraction methods, and map data transformations based on business requirements.
- Optimize ETL processes for performance, scalability, and reliability.
- Develop and maintain data models, data dictionaries, and technical documentation.
- Collaborate with data architects, analysts, and business stakeholders to ensure alignment with data requirements.
- Perform data validation, cleansing, and quality checks to ensure accuracy and completeness.
- Troubleshoot ETL job failures, performance bottlenecks, and data inconsistencies.
- Support data warehouse development, including staging, integration, and presentation layers.
- Work with database technologies such as SQL Server, Oracle, MySQL, PostgreSQL, or cloud databases.
- Assist with deployment, scheduling, and monitoring of ETL jobs in production environments.
Qualifications
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 8 years of experience developing ETL processes and data pipelines.
- Strong SQL skills, including complex queries, joins, and performance tuning.
- Hands-on experience with at least one major ETL tool (e.g., Informatica, SSIS, Talend, DataStage).
- Understanding of data warehousing concepts (star schema, fact/dimension tables, normalization).
- Experience with version control and CI/CD practices.
- Strong analytical and problem-solving skills.
- Experience with cloud data platforms (AWS Redshift, Snowflake, Azure Synapse, Google BigQuery).
- Knowledge of Python or other scripting languages for data manipulation.
- Familiarity with data governance, metadata management, and data quality frameworks.
- Experience working in Agile environments.
- Exposure to big data tools (Spark, Hadoop, Databricks) is a plus.