What are the responsibilities and job description for the Data Application Engineer (Sunnyvale, CA) position at The Mom Project?
- Please note that our customer is currently not considering applicants from the following locations: Alabama, Arkansas, Delaware, Florida, Indiana, Iowa, Louisiana, Maryland, Mississippi, Missouri, Oklahoma, Pennsylvania, South Carolina, and Tennessee.
We are seeking a Data Application Engineer on a contract basis to support our Customer's business needs. This role is on-site in Sunnyvale, CA.
The Data Application Engineer designs, builds, and maintains data-driven applications and pipelines that enable seamless data integration, transformation, and delivery across systems. This role focuses on creating scalable, efficient, and reliable data solutions while collaborating with analysts and stakeholders to meet business objectives.
Responsibilities
- Develop Data Applications: Build and maintain data-centric applications, tools, and APIs to support both real-time and batch data processing.
- Data Integration: Design and implement data ingestion pipelines integrating data from multiple sources, including databases, APIs, and file systems.
- Data Transformation: Create reusable ETL/ELT pipelines to transform raw data into structured, consumable formats using tools such as Snowflake, DBT, or Python.
- Collaboration: Partner with analysts and stakeholders to define requirements and deliver scalable, production-ready data solutions.
- Documentation: Maintain comprehensive documentation for applications, workflows, and processes to ensure long-term maintainability and transparency.
Qualifications
- Education: Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Programming: Proficiency in Python, C#, and ASP.NET Core.
- Databases: Strong understanding of SQL and database design, with hands-on experience in relational platforms such as Snowflake and SQL Server.
- ETL/ELT Tools: Hands-on experience with orchestration frameworks such as Apache Airflow (DBT experience is a plus).
- Cloud Platforms: Familiarity with AWS, Azure, or Google Cloud and data services (e.g., S3, AWS Lambda).
- Data Pipelines: Experience with both real-time (Kafka, Spark) and batch data processing.
- APIs: Experience designing and integrating RESTful APIs for data access and application communication.
- Version Control: Proficiency with Git or similar systems for source code management.
- Analytical Skills: Strong problem-solving and troubleshooting ability for complex data challenges.
- Excellent communication and collaboration abilities across technical and non-technical teams.
- Strong organizational and time-management skills in a fast-paced environment.
- High attention to detail with a commitment to delivering high-quality, reliable results.
- Experience with containerization tools (Docker) and orchestration platforms (Kubernetes).
- Exposure to business intelligence tools such as Tableau, Power BI, or Looker.
Contractor benefits are available through our third-party Employer of Record (available upon completion of a waiting period for eligible engagements). Benefits include medical, dental, vision, and 401(k).
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.