What are the responsibilities and job description for the ETL Data Architect position at Sanrose Information Services Inc.?
Job Details
This is a 100% onsite position based in Tallahassee, FL. Apply only if you are willing to relocate.
Education: A bachelor’s degree from an accredited college or university in Computer Science, Information Systems, or a related field is required. Alternatively, equivalent work experience, including experience in Service-Oriented Architecture (SOA) and Microsoft Azure Cloud Solutions, can be substituted for the educational requirement on a year-for-year basis, when applicable.
Certifications:
1. Microsoft certifications as an Azure Data Engineer Associate, Azure Solutions Architect Expert, and Azure Fundamentals; and
2. Databricks certification as a Data Engineer Associate.
Required Experience:
· 12 years of experience in ETL development and data engineering;
· 6 years of experience working with Azure Data Factory (ADF), Azure Cloud, Azure Databricks, Azure Synapse Analytics, and Azure Purview;
· Proven track record of building and optimizing large-scale ETL pipelines for high-performance, high-availability environments;
· Extensive expertise in Spark, Python, and/or Scala for large-scale data transformations (a minimal sketch follows this list);
· Strong Structured Query Language (SQL) proficiency and experience working with complex data structures;
· In-depth knowledge of data governance, security protocols, and role-based access control (RBAC) within the Azure ecosystem; and
· Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory standards.
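As a rough illustration of the Spark and SQL experience listed above, the sketch below shows a minimal PySpark extract-transform-load pass. All paths, tables, and columns (orders, customers, customer_id, amount) are hypothetical, not part of the posting.

```python
# Minimal PySpark ETL pass: extract, transform with the DataFrame API and
# SQL, then load. Paths, tables, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-transform-sketch").getOrCreate()

# Extract: read raw source data (placeholder paths).
orders = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders")
customers = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/customers")

# Transform: join and aggregate with the DataFrame API.
daily_revenue = (
    orders.join(customers, "customer_id")
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"))
)

# The same engine also runs plain SQL against a temporary view.
orders.createOrReplaceTempView("orders")
top_customers = spark.sql(
    "SELECT customer_id, SUM(amount) AS total "
    "FROM orders GROUP BY customer_id ORDER BY total DESC LIMIT 100"
)

# Load: write curated results to the target zone (placeholder path).
daily_revenue.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/daily_revenue"
)
```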
Responsibilities:
ETL Pipeline Design and Development:
· Lead the design and development of high-performing ETL processes to integrate and transform data across disparate sources;
· Deliver efficient, reliable pipelines that meet business needs and maintain the highest standards of security; and
· Utilize ADF to automate and streamline data workflows, ensuring smooth transitions from source to target.
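One hedged sketch of this kind of ADF automation, using the azure-mgmt-datafactory Python SDK: the subscription, resource group, factory, and dataset names below are placeholders, and model constructors can vary slightly across SDK versions.

```python
# Sketch: define a one-activity ADF copy pipeline in code.
# Subscription, resource group, factory, and dataset names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_step = CopyActivity(
    name="CopySourceToTarget",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="TargetDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

pipeline = PipelineResource(activities=[copy_step])
adf.pipelines.create_or_update("my-rg", "MyFactory", "NightlyCopy", pipeline)

# Trigger a run on demand; in production a schedule or event trigger
# would start the pipeline instead.
adf.pipelines.create_run("my-rg", "MyFactory", "NightlyCopy", parameters={})
```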
Data Integration and Transformation:
· Build and manage complex ETL workflows that extract, transform, and load data for downstream analytics and reporting, ensuring data is accurate, timely, and secure; and
· Take ownership of data quality and validation, creating resilient ETL processes that ensure only trusted data reaches its destination.
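The data-quality gate described above might look like the following minimal PySpark sketch; the validation rules and column names are illustrative, not the Department's actual checks.

```python
# Sketch: a validation gate so only trusted rows reach the destination.
# Rules and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()
orders = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders")

# Rows must carry a customer and a non-negative amount to be trusted.
# The isNotNull checks keep the predicate boolean, so every row lands in
# exactly one of the two branches below.
rules = (
    F.col("customer_id").isNotNull()
    & F.col("amount").isNotNull()
    & (F.col("amount") >= 0)
)
valid = orders.filter(rules)
rejected = orders.filter(~rules)

# Quarantine rejects for review rather than silently dropping them.
rejected.write.mode("append").parquet(
    "abfss://quarantine@account.dfs.core.windows.net/orders"
)
valid.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/orders"
)
```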
Cloud Platform Expertise:
· Leverage the full power of the Azure ecosystem—ADF, Databricks, Synapse, and Purview—to manage and process high volumes of structured and unstructured data, delivering solutions that are scalable and performance-optimized; and
· Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data-driven insights that support the Department’s mission.
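A minimal sketch of the Synapse loading step, assuming the Azure Synapse connector built into Azure Databricks (com.databricks.spark.sqldw); the JDBC URL, staging path, and table name are placeholders.

```python
# Sketch: bulk-load a curated DataFrame into Azure Synapse Analytics from
# Azure Databricks. The JDBC URL, staging path, and table are placeholders;
# `spark` is the session a Databricks notebook provides.
df = spark.read.parquet("abfss://curated@account.dfs.core.windows.net/daily_revenue")

(df.write
   .format("com.databricks.spark.sqldw")  # built-in Synapse connector
   .option("url", "jdbc:sqlserver://myserver.sql.azuresynapse.net:1433;database=mydw")
   .option("dbTable", "dbo.daily_revenue")
   # Staging area the connector uses for the PolyBase/COPY bulk load.
   .option("tempDir", "abfss://staging@account.dfs.core.windows.net/tmp")
   .option("forwardSparkAzureStorageCredentials", "true")
   .mode("overwrite")
   .save())
```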
Performance Optimization:
· Continuously optimize ETL jobs to minimize latency and maximize throughput; and
· Ensure the architecture supports fast, reliable data access for end-users and systems, meeting stringent performance metrics.
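A few common Spark-side levers for the latency and throughput goals above are sketched below; the settings and partition count are illustrative starting points, not tuned recommendations.

```python
# Sketch: common Spark-side levers for latency and throughput.
# Settings and the partition count are illustrative starting points.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("perf-sketch").getOrCreate()

# Adaptive query execution coalesces shuffle partitions and mitigates skew.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

facts = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/facts")
dims = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/dims")

# Broadcast the small dimension table to avoid shuffling the large side.
joined = facts.join(broadcast(dims), "dim_id")

# Cache only when several downstream steps reuse the same result.
joined.cache()

# Fewer, larger output files keep downstream reads fast.
joined.repartition(64).write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/facts_joined"
)
```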
Security and Compliance:
· Embed security and compliance best practices in every step of the ETL process;
· Protect sensitive data by adhering to industry standards and ensuring compliance with the Department’s data governance policies; and
· Use Azure Purview to enforce data governance, track data lineage, and ensure that data handling meets the highest standards of integrity.
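One common pattern for protecting sensitive columns in flight is sketched below, using salted hashing of direct identifiers in PySpark; the column names are hypothetical, and real controls would follow the Department's governance policies, with secrets held in Azure Key Vault.

```python
# Sketch: hash direct identifiers before data leaves the secure zone.
# Columns are hypothetical; real controls follow the Department's policies.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-mask-sketch").getOrCreate()
people = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/people")

# In practice the salt comes from Azure Key Vault, never source code.
SALT = "example-salt"

# Replace direct identifiers with salted SHA-256 digests.
masked = (
    people
    .withColumn("ssn", F.sha2(F.concat(F.col("ssn"), F.lit(SALT)), 256))
    .withColumn("email", F.sha2(F.concat(F.col("email"), F.lit(SALT)), 256))
)
masked.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/people_masked"
)
```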
Collaboration and Stakeholder Engagement:
· Partner with cross-functional teams (e.g., data engineers, analysts, business stakeholders, and security experts) to design and implement ETL solutions that meet the Department’s evolving needs; and
· Act as a technical leader and mentor, helping guide junior team members and providing expert guidance on data processing and transformation best practices.