What are the responsibilities and job description for the Sr Data Engineer position at KamisPro?
Senior Data Engineer
Location: Washington, DC (onsite)
Compensation: $165,000 – $185,000 plus benefits
Overview
The Senior Data Engineer will be a key contributor in designing and building a modern data analytics ecosystem leveraging the Databricks platform. This environment will support a high-performing analytics team responsible for evaluating and improving a high-profile federal government digital product.
In this role, the engineer will integrate data from multiple source systems and modernize how information is collected, processed, and consumed across the organization. The position requires deep expertise in developing scalable data pipelines, maintaining reliable data infrastructure, and improving data quality, performance, and accessibility. Solutions may include real-time streaming, batch processing workflows, and API-based integrations.
The Senior Data Engineer will operate within an agile development framework, working closely with product managers, analysts, and engineers to deliver data solutions that enable informed, data-driven decisions.
Core Expertise Areas
- Data Engineering and Data Platform Development
- ETL / ELT Pipeline Design and Implementation
- Apache Kafka and Event-Driven Streaming Architectures
- AWS Cloud Architecture and Services
- Databricks Unified Data Analytics Platform
Responsibilities
- Design, build, and maintain resilient data pipelines supporting both batch and streaming workloads
- Optimize data infrastructure to ensure scalability, reliability, and performance
- Enhance data ingestion and transformation processes to improve accuracy and availability
- Integrate data from diverse internal and external systems using modern interface patterns
- Collaborate with cross-functional teams in an agile environment to deliver analytics-ready data solutions
- Contribute to architectural decisions and technical standards for enterprise data platforms
Requirements
- Ability to successfully complete a federal government Public Trust background investigation
- Bachelor’s degree
- 12 years of experience delivering data engineering solutions that provide measurable value to end users or customers
- Advanced hands-on experience with Databricks, Apache Kafka, ETL development, workflow orchestration, and large-scale data processing
- Strong understanding of data architecture principles, pipeline design patterns, and infrastructure optimization techniques
- Demonstrated experience working in customer-focused, agile development teams
- Excellent verbal and written communication skills, with the ability to clearly articulate technical concepts and tradeoffs in terms of business and customer impact to both technical and non-technical audiences