What are the responsibilities and job description for the ETL Architect position at Coforge?
- Role: ETL Architect
- Skills: ETL/ELT Architecture, Azure, Databricks, Data Sources (Oracle, SQL Server, SAP, Salesforce), Data Modeling, Cloud Platforms
- Experience: 15 years
- Location: Chicago, IL (onsite)
We at Coforge are seeking an ETL Architect to design and lead scalable data integration solutions for an enterprise Lakehouse built on Azure Databricks. This role requires expertise in Azure and Databricks, along with hands-on experience working with diverse data sources and ETL/ELT technologies. The architect will ensure efficient data movement, transformation, and governance across heterogeneous systems to support analytics and business intelligence initiatives.
Key Responsibilities:
- Design and implement ETL/ELT architecture with Databricks as the enterprise Lakehouse (a minimal pipeline sketch follows this list).
- Integrate data from diverse sources (RDBMS, APIs, SaaS apps, flat files, streaming platforms, cloud services) into the Lakehouse.
- Define data integration best practices, including reusability, scalability, and cost optimization.
- Lead and mentor ETL/ELT developers in building robust pipelines.
- Establish data quality, governance, and lineage frameworks.
- Collaborate with data architects, BI developers, and business stakeholders for end-to-end data delivery.
- Evaluate and implement ETL/ELT tools and automation frameworks suited for multiple source systems.
- Troubleshoot integration issues and define long-term solutions.
- Stay up to date with Databricks features and emerging data integration technologies.
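To make these responsibilities concrete, below is a minimal PySpark sketch of a batch extract-load-transform flow into a Databricks-style Lakehouse. It assumes a Spark environment with Delta Lake and an Oracle JDBC driver available; every connection detail, schema, and table name is a hypothetical placeholder, not part of the actual role or Coforge's stack.

```python
# Illustrative batch ETL sketch: land an Oracle table in a Delta "bronze"
# layer, then apply a light transformation into "silver".
# All connection details and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a source table over JDBC (Oracle driver assumed available).
orders_raw = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//source-host:1521/ORCLPDB")  # placeholder
    .option("dbtable", "SALES.ORDERS")                              # placeholder
    .option("user", "etl_user")                                     # placeholder
    .option("password", "***")  # in practice, pull from a secret scope
    .load()
)

# Load: persist the raw extract as a Delta table (bronze layer).
orders_raw.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")

# Transform: deduplicate and add audit metadata (silver layer).
orders_silver = (
    spark.table("bronze.orders")
    .dropDuplicates(["ORDER_ID"])
    .withColumn("ingested_at", F.current_timestamp())
)
orders_silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

In a production pipeline the load would typically be incremental (for example, a MERGE on a watermark column) rather than a full overwrite; the sketch keeps it simple for illustration.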
Required Skills & Qualifications:
- 10+ years of experience in ETL/ELT architecture and development.
- Strong expertise in Databricks (SQL warehouses, Structured Streaming, Workflows, notebooks, Delta Sharing).
- Strong SQL and performance optimization skills.
- Experience working with varied data sources: Oracle, SQL Server, SAP, Salesforce, REST APIs, flat files, cloud-native systems.
- Solid understanding of data modeling (star schema, snowflake schema, data vault) and data warehousing principles (see the query sketch after this list).
- Hands-on experience with cloud platforms (AWS/Azure/GCP) for data integration.
- Strong leadership and communication skills for onsite stakeholder management.
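As an illustration of the star-schema modeling skills listed above, here is a minimal Spark SQL sketch joining a fact table to two dimensions. All table and column names are invented for the example.

```python
# Minimal star-schema sketch: aggregate a fact table across two dimensions.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

revenue_by_region = spark.sql("""
    SELECT d.calendar_month,
           r.region_name,
           SUM(f.net_amount) AS total_revenue
    FROM   silver.fact_orders f
    JOIN   silver.dim_date    d ON f.date_key   = d.date_key
    JOIN   silver.dim_region  r ON f.region_key = r.region_key
    GROUP BY d.calendar_month, r.region_name
""")
revenue_by_region.show()
```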
Nice to Have:
- Experience with real-time/streaming data integration (Kafka, Databricks Structured Streaming, Azure Event Hubs); a minimal sketch follows this list.
- Familiarity with data governance and catalog tools (Collibra, Unity Catalog).
- Knowledge of big data ecosystems (Spark, Hadoop).
- Exposure to BI/Analytics platforms (Power BI, Tableau).
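For the streaming integration item above, here is a minimal Spark Structured Streaming sketch that lands a Kafka topic in a Delta table. It assumes the Kafka connector and Delta Lake are on the classpath; the broker address, topic, checkpoint path, and table name are hypothetical placeholders.

```python
# Illustrative streaming ingestion sketch: consume a Kafka topic with
# Structured Streaming and append it to a Delta table.
# Broker, topic, and checkpoint path are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
    .option("subscribe", "order-events")               # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/order_events")  # placeholder
    .outputMode("append")
    .toTable("bronze.order_events")  # starts the stream
)
# query.awaitTermination()  # block the driver in a standalone job
```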