What are the responsibilities and job description for the AI Engineer position at NTT DATA North America?
Onsite in Warren, New Jersey
Contract
Must be able to work on NTT DATA payroll. No C2C available at this time.
Company Overview:
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking an AI Engineer to join our team in Warren, New Jersey (US-NJ), United States (US).
Job Description:
Position Overview
This role sits at the intersection of Generative AI, MLOps, and Intelligent Agent development — and is responsible for designing, building, and deploying AI-powered solutions that directly support our P&C insurance operations.
You will work closely with our data engineering, analytics, and business teams to deliver LLM-powered applications, automated AI agents, and production-ready ML pipelines across claims, underwriting, and actuarial domains. This is a hands-on, delivery-focused role for an engineer who is comfortable moving from architecture whiteboard to working code.
Key Responsibilities
Generative AI & LLM Engineering
Design, fine-tune, and deploy Large Language Models (LLMs) for insurance-specific use cases including document intelligence, claims summarization, policy interpretation, and underwriting Q&A.
Build Retrieval-Augmented Generation (RAG) pipelines using vector databases (e.g., Azure AI Search, Pinecone, ChromaDB) to ground LLM outputs in enterprise knowledge bases.
Develop prompt engineering frameworks and systematic evaluation pipelines to ensure LLM output quality, consistency, and safety in regulated insurance contexts.
Integrate LLM capabilities with internal data platforms via LangChain, LlamaIndex, or Semantic Kernel.
Evaluate and benchmark foundation models (OpenAI GPT-4o, Azure OpenAI, Claude, Mistral, Llama) against insurance-specific tasks to guide platform selection.
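To give candidates a concrete sense of the RAG work described above, here is a minimal, self-contained sketch of the retrieve-then-ground pattern. The bag-of-words "embedding", the toy knowledge base, and all function names are illustrative stand-ins; a production pipeline would use a real embedding model and a vector store such as Azure AI Search or Pinecone.

```python
import math
from collections import Counter

# Toy stand-in for an embedding model: bag-of-words vectors.
# In production this would be an embedding API (e.g. OpenAI or
# SentenceTransformers) backed by a vector database.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, embed(d)), reverse=True)
    return ranked[:k]

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    # Ground the LLM call by injecting retrieved context into the prompt.
    context = "\n".join(retrieve(query, docs, k=2))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

knowledge_base = [
    "FNOL stands for first notice of loss, the initial claim report.",
    "Underwriting guidelines cap coastal property exposure at $2M.",
    "Claims over $50,000 require adjuster review before settlement.",
]

prompt = build_grounded_prompt("What does FNOL mean?", knowledge_base)
```

The same shape, with the toy pieces swapped for real components, is what frameworks such as LangChain or LlamaIndex assemble for you.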
AI Agents & Automation
Architect and implement autonomous AI agents capable of multi-step reasoning, tool use, and decision-making for workflows such as FNOL triage, claims routing, policy lookup, and compliance checks.
Build agentic frameworks using patterns such as ReAct, Chain-of-Thought, and Tool-Augmented Agents to handle complex, multi-turn insurance workflows.
Design human-in-the-loop (HITL) checkpoints and escalation logic to ensure AI agents operate within defined risk and compliance boundaries.
Integrate agents with internal APIs, data platforms, and enterprise systems using orchestration tools such as Azure Logic Apps, Apache Airflow, or Databricks Workflows.
Develop guardrails, monitoring, and audit logging for all deployed agents to meet regulatory and governance standards.
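The agent responsibilities above can be sketched in a few lines. This is an illustrative tool-using triage loop with a human-in-the-loop escalation gate, in the spirit of the ReAct pattern; the scripted rules, tool registry, policy IDs, and the $10,000 threshold are all hypothetical, and a real system would have an LLM drive the reason/act steps.

```python
# Hypothetical tool registry: an agent "acts" by calling tools like this.
TOOLS = {
    "policy_lookup": lambda pid: {
        "P-100": {"status": "active", "limit": 25000},
    }.get(pid, {}),
}

ESCALATION_LIMIT = 10000  # illustrative HITL threshold: above this, a human reviews

def triage_claim(policy_id: str, claim_amount: float) -> str:
    policy = TOOLS["policy_lookup"](policy_id)      # Act: invoke a tool
    if not policy or policy["status"] != "active":  # Observe + reason on the result
        return "deny: no active policy"
    if claim_amount > ESCALATION_LIMIT:
        return "escalate: human review required"    # HITL checkpoint
    if claim_amount > policy["limit"]:
        return "escalate: exceeds policy limit"
    return "route: fast-track settlement"           # Autonomous decision
```

Guardrails in practice mean every branch above is logged for audit, and the escalation paths are hard checks the model cannot talk its way around.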
MLOps & Model Deployment
Build and maintain end-to-end MLOps pipelines covering model training, versioning, validation, deployment, and monitoring using MLflow, Azure ML, and Databricks.
Implement CI/CD pipelines for ML models using Azure DevOps or GitHub Actions, enabling reliable, repeatable model releases.
Deploy models as REST APIs or batch inference services on Azure Kubernetes Service (AKS) or Azure Container Apps, ensuring scalability and low-latency response.
Establish model monitoring frameworks to detect data drift, model degradation, and prediction anomalies in production.
Manage the model registry and lineage tracking to maintain governance and auditability of all AI assets.
Collaborate with data engineering teams to ensure feature pipelines are production-grade, versioned, and integrated with the Feature Store on Databricks or Azure ML.
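As one concrete example of the monitoring responsibility above, here is a minimal drift check that compares a live feature's mean to its training baseline with a simple z-score. The threshold and the stdlib-only implementation are illustrative assumptions; production pipelines would typically use Evidently AI or Azure ML's drift monitors instead.

```python
import statistics

def mean_shift_zscore(baseline: list[float], live: list[float]) -> float:
    # z-score of the live mean against the baseline distribution,
    # scaled by the standard error for the live sample size.
    mu, sigma = statistics.mean(baseline), statistics.stdev(baseline)
    n = len(live)
    return abs(statistics.mean(live) - mu) / (sigma / n ** 0.5)

def drift_alert(baseline: list[float], live: list[float],
                threshold: float = 3.0) -> bool:
    # Flag drift when the shift exceeds the (illustrative) threshold.
    return mean_shift_zscore(baseline, live) > threshold

baseline = [100.0, 102.0, 98.0, 101.0, 99.0, 100.0]
stable   = [100.5, 99.5, 101.0, 98.5, 100.0, 100.5]
shifted  = [130.0, 128.0, 132.0, 131.0, 129.0, 130.0]
```

Wired into a scheduled job, a check like this gates whether a model stays in serving or triggers retraining.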
Collaboration & Delivery
Work closely with business analysts, actuaries, underwriters, and claims professionals to translate domain requirements into AI solution designs.
Participate in Agile/Scrum ceremonies including sprint planning, standups, and retrospectives as an active delivery contributor.
Produce clear, well-structured technical documentation including solution designs, API specs, model cards, and deployment runbooks.
Mentor junior engineers and contribute to internal AI engineering best practices and standards.
Required Qualifications
Education
Bachelor's degree in Computer Science, Data Science, Machine Learning, Software Engineering, or a related quantitative field. Master's degree is a plus.
Experience
3–5 years of professional experience in AI/ML engineering, with demonstrated delivery of production-grade AI systems.
Hands-on experience building and deploying LLM-powered applications using frameworks such as LangChain, LlamaIndex, or Semantic Kernel.
Proven experience implementing MLOps pipelines in cloud environments (Azure preferred).
Experience developing AI agents or automation workflows using agentic frameworks.
Prior experience in financial services, insurance, or regulated industries is strongly preferred.
Technical Skills
Generative AI & LLMs
OpenAI / Azure OpenAI (GPT-4o, GPT-4 Turbo), Claude, Mistral, or open-source LLMs (Llama 3, Falcon)
RAG architectures, vector search, embeddings (OpenAI, Cohere, SentenceTransformers)
LangChain, LlamaIndex, Semantic Kernel
Prompt engineering, few-shot learning, instruction tuning, RLHF concepts
AI Agents & Automation
Agentic frameworks: ReAct, Tool-Augmented Agents, LangGraph, AutoGen, CrewAI
Workflow orchestration: Apache Airflow, Databricks Workflows, Azure Logic Apps
API design and integration: REST, GraphQL, Webhooks
MLOps & Model Serving
MLflow (experiment tracking, model registry, model serving)
Azure Machine Learning, Databricks AutoML & Feature Store
Docker, Kubernetes (AKS), Azure Container Apps
CI/CD: Azure DevOps, GitHub Actions
Model monitoring: Evidently AI, Azure ML monitoring, or equivalent
Programming & Data Engineering
Python (expert level): PyTorch, Hugging Face Transformers, scikit-learn, Pandas, NumPy
PySpark and Delta Lake for large-scale data processing
SQL (T-SQL / Spark SQL) for feature engineering and data validation
Git for version control and collaborative development
Cloud & Platform
Microsoft Azure (Azure OpenAI, Azure AI Search, AKS, Azure Data Factory, Azure Key Vault)
Databricks (Unity Catalog, Delta Live Tables, Workflows)
Microsoft Fabric / OneLake (familiarity a strong plus)
Preferred Qualifications
Experience with P&C insurance workflows such as FNOL processing, claims triage, underwriting decisioning, or actuarial modeling.
Familiarity with insurance regulatory requirements including NAIC guidelines and data privacy standards (CCPA, GDPR).
Experience implementing responsible AI principles — fairness, explainability, and bias mitigation — in regulated environments.
Microsoft certifications: Azure AI Engineer Associate (AI-102) or Azure Data Scientist Associate (DP-100) preferred.
Exposure to Data Mesh patterns and publishing AI model outputs as domain data products.
Familiarity with Databricks Model Serving and Mosaic AI capabilities.
About NTT DATA:
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Where required by law, NTT DATA provides a reasonable range of compensation for specific roles. The starting hourly range for this role is $75–$80 per hour. This range reflects the minimum and maximum target compensation for the position across all US locations. Actual compensation will depend on several factors, including the candidate's actual work location, relevant experience, technical skills, and other qualifications.
This position is eligible for company benefits that will depend on the nature of the role offered. Company benefits may include medical, dental, and vision insurance, flexible spending or health savings account, life and AD&D insurance, short- and long-term disability coverage, paid time off, employee assistance, participation in a 401k program with company match, and additional voluntary or legally required benefits.