What are the responsibilities and job description for the Data Engineer position at Kforce Inc?
Responsibilities
Kforce has a client that is seeking a Data Engineer in Westlake, TX. Key Tasks:
- Architect and design scalable, secure, and high-performance data pipelines for on-prem and cloud environments
- Develop and enhance automated ETL/ELT workflows leveraging modern frameworks and best practices (a minimal sketch follows this list)
- Modernize legacy systems by migrating data pipelines to cloud-native solutions
- Implement CI/CD pipelines for data workflows, ensuring robust version control and automated deployments
- Apply DevOps practices to improve the reliability, observability, and maintainability of data infrastructure
- Troubleshoot and resolve issues in data pipelines, identifying root causes and implementing permanent fixes
- Analyze business processes to identify inefficiencies and opportunities for automation and improvement
- Translate business insights into technical solutions by creating actionable technical plans aligned with business objectives
- Communicate effectively with both technical teams and business stakeholders to ensure clarity and alignment
- Execute implementation plans with urgency, high quality, and accountability for outcomes
- Proactively leverage all available resources to deliver projects on time and within scope
- Optimize the performance of large-scale data processing systems and troubleshoot pipeline failures
- Ensure data quality and governance through validation frameworks, lineage tracking, and compliance with security standards
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions
- Mentor and guide junior engineers, promoting best practices and technical excellence
- Drive innovation by evaluating emerging technologies and recommending improvements to data architecture
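For illustration only: the posting itself includes no code, but the ETL/ELT and CI/CD items above typically map to orchestrated pipeline definitions of roughly the following shape. This is a minimal sketch assuming Apache Airflow (one of the orchestrators named in the requirements below); the DAG id, task names, and the orders data are hypothetical placeholders, not details from the posting.

```python
# Minimal illustrative Airflow DAG for a daily extract-and-load job.
# All names here (daily_orders_etl, extract_orders, load_orders) are
# hypothetical; the stubs stand in for real source/target logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull one day of source records (stub standing in for a real extract)."""
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(**context):
    """Load whatever the extract task returned (stub standing in for a real load)."""
    records = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"Loading {len(records)} records")


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval=
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load  # load runs only after a successful extract
```

Versioning a DAG file like this in Git and deploying it through an automated pipeline is one common way the CI/CD and DevOps bullets above are put into practice.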
Requirements:
- Data Engineering expertise: Advanced experience in building and maintaining automated data pipelines; Strong knowledge of ETL/ELT processes and orchestration tools (Airflow, Prefect, Dagster)
- Programming & Querying: Proficiency in Python, Java, or Scala; Advanced SQL skills for complex queries and optimization (see the sketch after this list)
- Cloud Platforms: Hands-on experience with AWS, Azure, or GCP data services (Glue, Data Factory, Dataflow); Familiarity with object storage and IAM/security best practices
- Cloud Certification: AWS Certified Data Analytics or equivalent certification in Azure/GCP
- DevOps & CI/CD: Advanced experience with CI/CD tools (GitHub Actions, Jenkins, Azure DevOps); Infrastructure-as-Code (Terraform, CloudFormation)
- Containerization & Orchestration: Docker and Kubernetes for deploying data workflows
- Data Modeling & Storage: Expertise in relational and NoSQL databases; Advanced knowledge of data lake and data warehouse architectures (Snowflake, Redshift, BigQuery)
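Again for illustration, the SQL and data-quality expectations above often boil down to small, repeatable validation checks. The sketch below assumes Python's standard-library sqlite3 as a stand-in for a real warehouse connection (Snowflake, Redshift, BigQuery, etc.); the orders table and its columns are hypothetical.

```python
# Minimal sketch of a SQL-based data-quality check (null business keys).
# sqlite3 stands in for a warehouse client; table/column names are hypothetical
# and assumed to come from trusted configuration, not user input.
import sqlite3


def check_no_null_keys(conn: sqlite3.Connection, table: str, key_column: str) -> bool:
    """Return False if any row is missing its business key."""
    query = f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    (null_count,) = conn.execute(query).fetchone()
    if null_count:
        print(f"FAILED: {null_count} rows in {table} have NULL {key_column}")
        return False
    print(f"PASSED: no NULL {key_column} values in {table}")
    return True


if __name__ == "__main__":
    # Tiny in-memory example: one valid row, one row with a missing key.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (None, 3.0)])
    ok = check_no_null_keys(conn, "orders", "order_id")
    raise SystemExit(0 if ok else 1)  # non-zero exit lets a CI step fail the build
```

Checks like this are typically wired into the orchestration layer so that a failing validation blocks downstream loads instead of silently passing bad data along.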
We offer comprehensive benefits including medical/dental/vision insurance, HSA, FSA, 401(k), and life, disability & AD&D insurance to eligible employees. Salaried personnel receive paid time off. Hourly employees are not eligible for paid time off unless required by law. Hourly employees on a Service Contract Act project are eligible for paid sick leave.
Note: Pay is not considered compensation until it is earned, vested and determinable. The amount and availability of any compensation remains in Kforce's sole discretion unless and until paid and may be modified in its discretion consistent with the law.
This job is not eligible for bonuses, incentives or commissions.
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
By clicking “Apply Today” you agree to receive calls, AI-generated calls, text messages or emails from Kforce and its affiliates, and service providers. Note that if you choose to communicate with Kforce via text messaging the frequency may vary, and message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You will always have the right to cease communicating via text by using key words such as STOP.
Salary: $72 - $82