What are the responsibilities and job description for the Senior Enterprise Data Architect position at NAM Info Inc?
Senior Enterprise Data Architect
Full Time Position
Onsite at Cranbury NJ
About NAM
NAM Info is an IT application and implementation services company with its US headquarters in Cranbury, New Jersey, and development centers headquartered in Bangalore, India. NAM's distinctive service line offerings include Professional Services and Managed Services, covering re-engineering and modernization work that largely involves emerging technologies.
NAM is also home to a next-generation Data Intelligence Platform that enables enterprises to automate and accelerate their journey from data to insights. Our platform simplifies and unifies data engineering, governance, and analytics, empowering organizations to achieve end-to-end data intelligence at scale. As we expand our product capabilities, we are seeking a Senior Enterprise Data Architect to help ensure that the Inferyx platform delivers world-class performance, accuracy, and reliability.
About the Role
As an Enterprise Data Architect, you will be responsible for driving end-to-end Data Lake implementation projects using Snowflake or Databricks on AWS / Azure / GCP platforms. You will lead cross-functional teams, manage delivery from design to deployment, and serve as the main point of contact for clients and internal stakeholders.
This role demands hands-on technical experience, strong project management expertise, and a proven record of 3–4 full lifecycle Data Lake implementations. The ideal candidate should also bring pre-sales experience / ability to craft RFP/RFQ responses and Statements of Work (SOWs).
Data product-based implementation experience is most preferred; application-based implementation experience is acceptable.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using AWS services and Snowflake.
- End-to-End Delivery: Lead and execute 3–4 full lifecycle Data Lake implementations, from architecture design and development to deployment and post-go-live support.
- Architecture & Implementation: Design and build scalable, secure, and efficient Data Lake solutions using Snowflake or Databricks on AWS, following Medallion Architecture (Bronze, Silver, Gold layers); an illustrative sketch follows this list.
- AWS Integration: Implement and optimize Data Lake solutions using AWS services such as S3, Glue, Lambda, CloudFormation, EC2, IAM, and Redshift where applicable.
- Pre-Sales & Client Engagement: Participate in pre-sales discussions, prepare technical proposals, respond to RFPs/RFQs, and draft Statements of Work (SOWs) in collaboration with sales and architecture teams.
- Leadership & Team Management: Lead a team of data engineers, architects, and analysts, ensuring high-quality deliverables, mentoring team members, and fostering a culture of excellence.
- Project Governance: Oversee planning, resource allocation, risk management, and execution to ensure projects are delivered on time, within scope, and with high customer satisfaction.
- Agile Delivery: Facilitate Agile ceremonies (daily stand-ups, sprint planning, retrospectives) and maintain delivery dashboards and KPIs.
- Data Quality & Compliance: Enforce best practices in data security, access controls, and data quality management across all implementations.
- Continuous Improvement: Drive innovation and process efficiency across delivery frameworks and technical practices.
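For reference, the snippet below is a minimal, hypothetical PySpark sketch of the Medallion Architecture flow (Bronze, Silver, Gold) referenced above. All paths, table names, and columns are illustrative assumptions only, not part of any NAM or client environment, and the Delta format assumes a Databricks or Delta-enabled runtime.

```python
# Minimal, illustrative Medallion Architecture flow (Bronze -> Silver -> Gold).
# All paths, table names, and columns below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw source data as-is, adding ingestion metadata.
bronze = (
    spark.read.json("s3://example-lake/raw/orders/")  # hypothetical raw zone
         .withColumn("ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("s3://example-lake/bronze/orders/")

# Silver: cleanse and conform (deduplicate, type, apply basic quality filters).
silver = (
    spark.read.format("delta").load("s3://example-lake/bronze/orders/")
         .dropDuplicates(["order_id"])
         .filter(F.col("order_amount") > 0)
         .withColumn("order_date", F.to_date("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("s3://example-lake/silver/orders/")

# Gold: business-level aggregates ready for analytics consumption.
gold = (
    silver.groupBy("order_date", "region")
          .agg(F.sum("order_amount").alias("daily_revenue"),
               F.countDistinct("order_id").alias("order_count"))
)
gold.write.format("delta").mode("overwrite").save("s3://example-lake/gold/daily_revenue/")
```

The layering keeps raw data replayable in Bronze, standardized and quality-checked in Silver, and consumption-ready in Gold, which is the separation of concerns this role is expected to design and enforce.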
Mandatory Requirements
- Proven hands-on experience with Medallion Architecture.
- Experience delivering 3–4 full lifecycle Data Lake projects using Snowflake or Databricks on AWS.
- Strong AWS / Azure Cloud implementation experience with data integration and orchestration services.
- Proven experience in pre-sales, including RFP/RFQ responses and writing SOWs for clients.
- Solution scoping and conducting full-fledged POCs, including complete architectural workflow layout.
- Price scoping.
- Resource scoping.
- Deliverable milestone scoping.
Required Skills & Experience
- 15–20 years of overall experience in data engineering, analytics, or cloud delivery roles.
- Should have served as an Enterprise Architect for the most recent five to seven years.
- Strong understanding of data platforms, ETL/ELT pipelines, data warehousing, and analytics lifecycle.
- Deep knowledge of Snowflake or Databricks architecture, performance tuning, and optimization.
- Hands-on proficiency in SQL, Python, or Unix Shell scripting.
- Sound knowledge of data security, access management, and governance frameworks.
- Excellent communication, stakeholder management, and presentation skills.
- Strong project management capabilities with Agile/Scrum or hybrid methodologies.
- Proven ability to manage multiple projects and cross-functional teams simultaneously.
- Top-priority skills: hands-on Databricks/Snowflake on AWS, Medallion Architecture, and automation (Python, PySpark, Terraform).
- Candidate profile: a builder-leader hybrid rather than a governance-only profile.
- Delivery and CI/CD depth: ownership of the end-to-end lifecycle, pipeline automation, and DevOps practices (Airflow/Glue); a minimal orchestration sketch follows this list.
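For reference, the snippet below is a minimal, hypothetical Apache Airflow DAG illustrating the kind of pipeline orchestration and automation described above. The DAG name, task callables, and schedule are placeholders, and a real implementation would typically replace the stubbed Python tasks with provider operators (e.g., Glue, Databricks, or Snowflake operators).

```python
# Minimal, illustrative Airflow DAG for a daily lake-refresh pipeline (Airflow 2.4+ style).
# Task logic is stubbed with placeholders; names and schedule are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_files(**context):
    """Hypothetical step: trigger ingestion of raw files into the Bronze layer."""
    print("Ingesting raw files into the Bronze layer...")


def transform_to_silver(**context):
    """Hypothetical step: run cleansing/conforming jobs for the Silver layer."""
    print("Transforming Bronze data into Silver tables...")


def publish_gold_marts(**context):
    """Hypothetical step: build Gold-layer aggregates for analytics consumers."""
    print("Publishing Gold-layer marts...")


with DAG(
    dag_id="example_lake_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_files", python_callable=ingest_raw_files)
    transform = PythonOperator(task_id="transform_to_silver", python_callable=transform_to_silver)
    publish = PythonOperator(task_id="publish_gold_marts", python_callable=publish_gold_marts)

    ingest >> transform >> publish
```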
Preferred Qualifications
- Certifications in AWS (Solution Architect / Data Analytics), Snowflake, or Databricks.
- Experience with CI/CD for data pipelines, Terraform/CloudFormation, and modern orchestration tools (e.g., Airflow, dbt).
- Familiarity with data catalog/governance tools such as Collibra, Alation, or AWS Glue Data Catalog.
Kindly reply with your resume to jnehru@nam-it.com.