What are the responsibilities and job description for the AI Architect position at Infoverity?
About the Role
Infoverity is seeking a Data & AI Architect to design, build, and scale enterprise data and AI solutions. In this strategic role, you will bridge the gap between business objectives and technical execution, architecting scalable AI solutions that deliver measurable business value.
You will work across modern data platforms (Databricks, Snowflake, Microsoft Fabric, etc.) and cloud ecosystems, supporting the full lifecycle from data ingestion and modeling through AI enablement, agent deployment, and production operations. This is a hands-on, consultative role requiring strong architectural judgment, customer engagement skills, and the ability to guide solutions from proof-of-value to enterprise scale.
Key Responsibilities
Architecture, Strategy & Roadmapping
- Define Data & AI Strategy: Partner with business and technical stakeholders to translate enterprise challenges into data, analytics, and AI solution architectures, producing clear roadmaps, reference architectures, and implementation plans.
- Enterprise Integration: Architect solutions that integrate with enterprise source systems, APIs, operational platforms, and downstream consuming layers.
- Standards & Governance: Establish design patterns and technical standards across data management, analytics, and AI—covering security, scalability, performance, lineage, and compliance requirements.
Data Platform & Ecosystem Design
- Modern Data Stack Architecture: Design and implement scalable data pipelines, transformation layers, and orchestration frameworks across platforms such as Databricks, Snowflake, Microsoft Fabric, GCP or similar.
- Analytics & Semantic Layers: Support dimensional modeling, feature stores, and semantic layers to enable BI, advanced analytics, and AI workloads.
- Cloud Infrastructure: Design secure and best practice architectures on AWS, Azure or GCP, leveraging managed services, serverless patterns, and containerized workloads (Docker/Kubernetes) where appropriate.
AI Engineering & Agent Deployment
- AI Enablement on Data Platforms: Enable AI and ML workloads using curated, governed enterprise data—supporting model training, inference, and retrieval-based architectures (e.g., RAG).
- Agent-Based Architectures: Design and deploy agentic or workflow-driven AI solutions, incorporating orchestration, tool/function calling, guardrails, and observability to support multi-step business processes.
- Prototyping to Production: Lead Proof-of-Concept and pilot initiatives, validating business value and technical feasibility before scaling to production-grade solutions.
Delivery, Leadership & Enablement
- Technical Leadership: Review solution designs, contribute to critical implementation components, and mentor engineers across data engineering, analytics, and AI best practices.
- Cross-Team Collaboration: Partner with delivery leads and stakeholders to manage technical risks, trade-offs, and dependencies throughout project lifecycles.
- Operational Readiness: Support deployment, monitoring, and optimization of data and AI solutions to ensure reliability, performance, and long-term maintainability.
Qualifications
Experience & Background
- Professional Experience: 5 years in data engineering, software engineering, or solution architecture, with experience spanning data platforms and analytics or AI workloads.
- Cloud Platforms: Strong hands-on experience with AWS or Azure, including data, analytics, and AI-related services.
- Data Platforms: Proven experience designing or implementing solutions on Databricks, Snowflake, Microsoft Fabric, or similar platforms.
Technical Skills
- Data Engineering: Strong understanding of data ingestion, transformation, orchestration, and modeling patterns.
- AI & Agentic Architecture: Familiarity with Generative AI concepts (LLMs, embeddings, RAG), ML workflows, and how AI integrates with enterprise data platforms, including when to use AI agents versus workflows and best practices for business process automation.
- MLOps / DataOps: Experience with CI/CD, model or pipeline deployment, monitoring, lineage, and governance tooling (e.g., MLflow, orchestration frameworks).
- Architecture Patterns: Experience with microservices, APIs, event-driven architectures, and containerization.
- Programming: Proficiency in SQL, Python, Spark, and distributed data processing frameworks.
- Problem Solving & Communication: Excellent problem-solving, communication, and client-facing consulting skills.
- Work Style: Ability to work independently and as part of a team.
Competencies & Soft Skills
- Business Alignment: Ability to connect data and AI capabilities to measurable business outcomes.
- Communication: Comfortable explaining architectural decisions to both technical and non-technical audiences.
- Enterprise Mindset: Strong appreciation for security, governance, compliance, and operational rigor in large organizations.
Preferred Qualifications
- Degree in Computer Science, Data Engineering, Data Science, or related field.
- Cloud or platform certifications (AWS, Azure, Databricks, Snowflake).
- Experience working in consulting or customer-facing delivery environments.
Highlights of our Benefits Package
- Infoverity's benefits program is designed to provide comprehensive, affordable medical, dental, and vision coverage and support for you and your family. Infoverity employees pay only 15% of the cost!
- In addition to these benefits, Infoverity is also proud to offer life and disability insurance, a wide-ranging Employee Assistance Program (EAP), and pet insurance!
- Infoverity offers a 401(k) plan with competitive employer contributions.
- Donate to charitable endeavors? That's great! Infoverity matches a portion of your donations.
- Infoverity provides ample opportunities for professional development and upskilling.
- Infoverity offers flexible hours and hybrid/remote work.
- Finally...Infoverity is a 100% Employee Owned company (ESOP)!
Equal Employment Opportunity
Infoverity is an equal opportunity employer. We celebrate diversity and are committed to fostering, cultivating, and preserving a culture of inclusion. We believe that our people are our most valuable asset. The collective sum of the individual differences, life experiences, knowledge, and unique capabilities that our employees invest in their work represents a significant part of not only our culture, but our reputation and achievement as well.