What are the responsibilities and job description for the Sr. Data Engineer - Remote position at Lensa?
Lensa is a career site that helps job seekers find great jobs in the US. We are not a staffing firm or agency. Lensa does not hire directly for these jobs, but promotes jobs on LinkedIn on behalf of its direct clients, recruitment ad agencies, and marketing partners. Lensa partners with DirectEmployers to promote this job for Sharecare. Clicking "Apply Now" or "Read more" on Lensa redirects you to the job board/employer site. Any information collected there is subject to their terms and privacy notice.
Job Description
Sharecare is a digital healthcare company that delivers software and tech-enabled services to stakeholders across the healthcare ecosystem to help improve care quality, drive better outcomes, and lower costs. Through its data-driven AI insights, evidence-based resources, and comprehensive platform – including benefits navigation, care management, home care resources, health information management, and more – Sharecare helps people easily and efficiently manage their healthcare and improve their well-being. Across its three business channels, Sharecare enables health plan sponsors, health systems and physician practices, and leading pharmaceutical brands to drive personalized and value-based care at scale. To learn more, visit www.sharecare.com.
Job Summary
Sharecare is seeking a Senior Data Engineer to help build and evolve our next-generation data platform supporting high-profile partners and customers. In this role, you will collaborate closely with Product, Account Management, QA, Analytics, and Architecture teams to deliver scalable, reliable, and secure data solutions.
This is a hands-on, high-impact position within a fast-paced, agile environment, offering the opportunity to develop breakthrough solutions in the health information and digital health space. The ideal candidate is self-driven, analytical, and detail-oriented, with deep expertise in Python-based data engineering, modern orchestration frameworks, and cloud-native architectures.
Essential Job Functions
- Design, build, and maintain scalable data pipelines using Python, Apache Airflow, and Apache Spark
- Analyze business and technical requirements and translate them into reliable, future-proof data solutions
- Develop, validate, deploy, and support complex ETL/ELT pipelines at scale
- Build clean, secure, and maintainable REST APIs following company standards
- Implement real-time and batch processing solutions for diverse data sources
- Develop reusable data engineering and AI frameworks for enterprise-wide adoption
- Define and manage domain-based “source of truth” data models, ensuring scalability and end-to-end data lineage
- Implement data governance practices, automate data quality checks, and enable pipeline testing
- Optimize data infrastructure for performance, cost efficiency, and reliability
- Manage source control, CI/CD pipelines, and production deployments
- Partner with data scientists and analysts to support AI/ML initiatives
Skills
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration skills across technical and non-technical teams
- Ability to work independently as well as within cross-functional, agile teams
- Strong organizational and time-management skills
- Ability to balance big-picture thinking with attention to technical detail
- Flexibility and adaptability in a rapidly evolving environment
Qualifications
- Bachelor’s degree (or higher) in Computer Science, Data Engineering, or a related field
- 10 years of experience in data engineering or related roles
- Strong proficiency in Python and related libraries (Pandas, SQLAlchemy, Boto3, Paramiko, Flask, FastAPI)
- Advanced SQL skills with experience analyzing healthcare datasets (e.g., claims, provider directories)
- Hands-on experience with orchestration tools such as Apache Airflow and Databricks Workflows
- Strong experience with Apache Spark; exposure to Apache Flink is a plus
- Experience integrating with .NET applications and working with SQL Server
- Cloud platform experience (AWS, Azure, or GCP required)
- Experience with data warehousing solutions such as Amazon Redshift or Vertica
- Solid understanding of distributed systems and functional programming concepts
- Proficiency with Git and modern CI/CD practices
- Exposure to streaming technologies, containerization, and ML pipelines preferred
- Familiarity with AI tools and large language models is a plus
Sharecare is an Equal Opportunity Employer and does not discriminate on the basis of race, color, sex, national origin, sexual orientation, gender identity, religion, age, disability, genetic information, protected veteran status, or any other non-merit factor.
If you have questions about this posting, please contact support@lensa.com