What are the responsibilities and job description for the Lead Data Architect position at Cygnus Professionals Inc.?
Data Lead (Supply Chain)
Philadelphia PA onsite
Long-term contract
W2 Contract
Data Architecture & Design
Responsible for designing a Supply Chain Anomaly Detection and Revenue Assurance platform on top of the order-processing data platform.
Define and own end-to-end supply chain data architecture, including source ingestion, transformation, storage, and consumption layers.
Design data models for supply chain domains such as inventory, logistics, fulfillment, and supplier performance.
Establish architecture standards, patterns, and design guidelines aligned with the enterprise data strategy.
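The anomaly-detection responsibility above can be illustrated in miniature. This is a hedged, pure-Python sketch of a simple z-score rule over daily order volumes; the function name, field names, and sample data are all illustrative assumptions, and a production platform would apply the same idea with PySpark over the full order-processing feed.

```python
from statistics import mean, stdev

def detect_anomalies(daily_order_counts, threshold=1.5):
    """Flag days whose order volume deviates more than `threshold`
    standard deviations from the mean (a simple z-score rule)."""
    counts = list(daily_order_counts.values())
    mu = mean(counts)
    sigma = stdev(counts)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [day for day, count in daily_order_counts.items()
            if abs(count - mu) / sigma > threshold]

# Hypothetical order-processing feed: day -> order count.
orders = {"2024-01-01": 1020, "2024-01-02": 980, "2024-01-03": 1005,
          "2024-01-04": 995, "2024-01-05": 12000}  # spike: possible duplicate ingest
print(detect_anomalies(orders))  # -> ['2024-01-05']
```

A real revenue-assurance check would combine several such rules (volume, value, and reconciliation deltas) rather than a single threshold.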
Data Engineering & Platforms
Architect and guide development of scalable data pipelines using:
PySpark and Spark-based processing
Python for transformation, orchestration, and data services
Enterprise ETL/ELT frameworks
Advanced SQL for data modeling and analytics
Support both batch and near-real-time data processing use cases.
Optimize pipelines for data quality, performance, scalability, and cost.
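The transform-plus-quality-gate pattern behind these pipeline responsibilities can be sketched as follows. This is a simplified pure-Python example (at scale the same pattern would be expressed in PySpark); `transform_orders` and the record fields are hypothetical names, not part of the role's actual codebase.

```python
def transform_orders(raw_rows):
    """Normalize raw order records and split out rows that fail
    basic data-quality checks (missing ID, non-positive quantity)."""
    clean, rejected = [], []
    for row in raw_rows:
        if not row.get("order_id") or row.get("qty", 0) <= 0:
            rejected.append(row)  # route bad records to a quarantine path
            continue
        clean.append({
            "order_id": str(row["order_id"]).strip(),
            "sku": str(row.get("sku", "")).upper(),  # normalize SKU casing
            "qty": int(row["qty"]),
        })
    return clean, rejected

raw = [
    {"order_id": "A1", "sku": "widget-9", "qty": 3},
    {"order_id": "",   "sku": "widget-9", "qty": 2},  # missing ID -> rejected
    {"order_id": "A3", "sku": "bolt-7",   "qty": 0},  # non-positive qty -> rejected
]
clean, rejected = transform_orders(raw)
print(len(clean), len(rejected))  # -> 1 2
```

Keeping rejected rows instead of silently dropping them is what makes the quality, lineage, and audit requirements later in this posting satisfiable.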
Supply Chain Analytics Enablement
Enable downstream consumption for:
Supply chain planning and forecasting
Inventory optimization and demand analytics
Vendor and procurement performance reporting
Operational KPIs and executive dashboards
SKU Management
Partner with analytics and data science teams to ensure data is fit for advanced analytics platforms.
Cloud & Data Storage
Design and oversee implementation of data solutions leveraging cloud-native data platforms.
Ensure secure, compliant, and resilient data storage and access patterns.
Data Governance & Quality
Partner with governance and security teams to ensure:
Data quality, consistency, and reliability
Data lineage, metadata management, and documentation
Compliance with data privacy, security, and internal policies
Leadership & Collaboration
Collaborate with product owners, supply chain leaders, engineering teams, and vendors.
Translate business and operational needs into technical architecture solutions.
Mentor data engineers and architects on best practices and design principles.
Qualifications
Data Engineering: 10 years of experience building data pipelines with Kafka/CDC and ETL tooling.
Streaming Expertise: Hands-on experience with stream processing using Spark Streaming, Kafka Streams, etc.
SQL & BI: Strong SQL/analytics skills and experience building dashboards.
Data Governance: Familiarity with lineage/audit tools (e.g., OpenLineage), data privacy, and regulatory controls.
Communication: Strong cross‑functional collaboration and experience presenting to executives.
Education: Bachelor’s degree in CS/Engineering, or equivalent practical experience.