What are the responsibilities and job description for the Data Engineer position at Kelly Science, Engineering, Technology & Telecom?
Important information: To be immediately considered, please send an updated version of your resume to somp767@kellyservices.com
Title: Senior Software Engineer (Senior Cloud Data Engineer – AI & Analytics)
Location: Santa Clara, CA or Des Moines, IA
Duration: Until October 2026 (possibility of extension)
W2 Contract (No C2C)
Pay rate: $64 - $71 per hour
Description:
We are seeking a highly technical and self-directed Senior Software Engineer to contribute to the development of data processing pipelines for a new AI-enabled data analytics product targeted at Large Ag customers.
Ideal candidates will have:
- 5 years of professional software development experience using Python
- 2 years of hands-on experience with AWS and Databricks in production environments
We are looking for mid-career professionals with a proven track record of deploying cloud-native solutions in fast-paced software delivery environments.
In addition to technical expertise, successful candidates will demonstrate:
- Strong communication skills, with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders (this is extremely important; please vet accordingly)
- The ability to work effectively with limited supervision in a distributed team environment
- A proactive mindset, adaptability, and a commitment to team success
Key Responsibilities:
- Design and implement AWS/Databricks solutions to process large geospatial datasets for real-time API services
- Develop and maintain REST APIs and backend processes using AWS Lambda (a brief handler sketch follows this list)
- Build infrastructure as code using Terraform
- Set up and maintain CI/CD pipelines using GitHub Actions
- Optimize system performance and workflows to improve scalability and reduce cloud costs
- Enhance monitoring and alerting across systems using Datadog
- Support field testing and customer operations by debugging and resolving data issues
- Collaborate with product managers and end users to understand requirements, build the backlog, and prioritize work
- Work closely with data scientists to productionize prototypes and proof-of-concept models
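For a sense of the Lambda-backed API work above, here is a minimal, hypothetical sketch of a Python handler behind an API Gateway proxy integration; the parameter name, response shape, and data-lookup step are illustrative assumptions, not the team's actual code.

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway proxy handler: validate a query parameter
    and return JSON. Hypothetical sketch; the real service's routes,
    payloads, and geospatial processing are not described in this posting."""
    params = event.get("queryStringParameters") or {}
    field_id = params.get("field_id")  # hypothetical parameter name
    if field_id is None:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "field_id is required"}),
        }
    # The real service would look up processed geospatial data here
    # (e.g., results written to S3 or a Databricks table).
    return {
        "statusCode": 200,
        "body": json.dumps({"field_id": field_id, "status": "ok"}),
    }
```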
Required Skills & Experience:
- Excellent coding skills in Python with experience deploying production-grade software
- Strong foundation in test-driven development
- Solid understanding of cloud computing, especially AWS services such as IAM, Lambda, S3, RDS
- Professional experience building Databricks workflows and optimizing PySpark queries (see the sketch below)
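As an illustration of the kind of PySpark query tuning this role covers, a minimal sketch, assuming an active Spark session on Databricks; the table and column names (ag_analytics.field_observations, yield_estimate, etc.) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical source table; column names are illustrative only.
df = spark.read.table("ag_analytics.field_observations")

# Common optimization pattern: prune columns and push filters down
# before the expensive shuffle/aggregation.
result = (
    df.select("field_id", "observed_at", "yield_estimate")
      .filter(F.col("observed_at") >= "2025-01-01")
      .groupBy("field_id")
      .agg(F.avg("yield_estimate").alias("avg_yield"))
)

result.write.mode("overwrite").saveAsTable("ag_analytics.field_yield_summary")
```

Selecting only the needed columns and filtering before the groupBy keeps shuffle sizes small, which is representative of the query-level tuning the role calls for.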
Preferred Experience:
- Experience working with geospatial data and related libraries/tools (a brief example follows this list)
- Experience building and operating APIs using AWS Lambda
- Familiarity with data lake architectures and Delta Lake
- Experience with event-driven architectures and streaming data pipelines (e.g., Kafka, Kinesis)
- Exposure to ML Ops or deploying machine learning models in production
- Prior experience in cross-functional teams involving product, data science, and backend engineering teams
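For the geospatial item above, a small example using Shapely, one common Python library for this kind of work; the coordinates and buffer distance are illustrative only.

```python
from shapely.geometry import Polygon

# Illustrative field boundary in projected coordinates (meters).
field = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])

# Buffer the boundary by 10 m, e.g., to model an application margin.
margin = field.buffer(10)

print(f"Field area: {field.area:.0f} m^2")      # 6000 m^2
print(f"Buffered area: {margin.area:.0f} m^2")  # boundary plus the 10 m margin
```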