What are the responsibilities and job description for the Sr. Azure Data Engineer with AI - Seattle WA (Onsite) - Full time position at TEK Hire Solutions?
Role: Senior Azure Data Engineer with AI
Location: Seattle WA (Onsite)
Role Overview
You will serve as a primary bridge between the firm's Data & Analytics team and the broader organization, translating business needs into reliable, usable analytics and AI-powered solutions. Success in this role depends as much on listening, communication, and judgment as on technical skill.
The Team Is At An Inflection Point
- Transitioning from traditional BI toward a purpose-built AI ecosystem
- Undergoing a major ERP replacement
- Scaling analytics capabilities to meet growing firm-wide demand
This is a hands-on engineering role. You will write code, architect systems, configure cloud infrastructure, and deploy production AI applications. This role offers genuine influence over how data and AI are designed, governed, and used across the firm.
Experience & Education:
- Bachelor's degree in Data Science, Information Systems, Computer Science, Business Analytics, or equivalent experience
- 5-8 years of experience in data analytics, analytics engineering, business intelligence, or related roles with increasing scope and ownership
- Advanced SQL skills, including query optimization and data modeling
- Proficient Python developer: ETL/ELT pipelines, REST API backends, automation, and AI application development; experience with pandas, SQLAlchemy, FastAPI, and LangChain or similar
- Hands-on Power BI experience (data modeling, DAX, deployment, and maintenance)
- Experience architecting and building multi-system API integrations, including authentication flows (OAuth 2.0 / SSO/SAML) and webhook patterns
- Hands-on Git-based development workflows: branching, code review, CI/CD pipelines, testing, and production deployment
- Experience designing and implementing data lake or lakehouse architectures using structured and unstructured data sources (e.g., Delta Lake, Azure Data Lake Storage, S3, or equivalent)
- Hands-on cloud deployment experience (Azure preferred, AWS, Snowflake, Databricks): compute, storage, managed databases, container services, and identity/access management
- Experience working with messy, real-world data and legacy systems
- Hands-on experience building and deploying LLM applications: RAG pipelines, vector store management (Pinecone, pgvector, or Weaviate), embedding workflows, and API integration using Anthropic, OpenAI, or Azure OpenAI SDKs
- Experience implementing AI evaluation frameworks (e.g., LangSmith, Arize, or custom evals) to monitor model outputs, detect drift, and enforce output quality in production
- Experience deploying AI applications to cloud infrastructure (Azure, AWS, or GCP), including containerization (Docker), serverless functions, and SSO/identity integration for enterprise tool rollout
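For candidates newer to the RAG terminology above, the retrieval step can be sketched in a few lines. This is a toy illustration only, with hand-made stand-in vectors and in-memory cosine similarity; production work in this role would use real embedding models and a managed vector store such as Pinecone or pgvector:

```python
import math

# Toy in-memory "vector store": document text -> stand-in embedding.
# Real pipelines would generate these vectors with an embedding model.
DOCS = {
    "q3 revenue grew 12 percent": [0.9, 0.1, 0.0],
    "new office opened in seattle": [0.1, 0.8, 0.1],
    "erp migration scheduled for fall": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, k=1):
    """Return the k documents whose embeddings are most similar to the query."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query embedding near the ERP document's vector retrieves that document,
# which would then be passed to an LLM as grounding context.
print(retrieve([0.05, 0.1, 0.95]))
```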
Nice to Have:
- Experience with ERP systems in professional services or project-based organizations
- Familiarity with data governance frameworks or formal data quality practices
- Experience working in AEC, design, or creative professional environments
- Background in internal training, enablement, or change management
Salary: $100,000 - $130,000