What are the responsibilities and job description for the Data Engineer (Remote) position at Da Vinci Software?
Overview:
Our client is seeking an experienced Data Engineer who will be responsible for designing and maintaining data pipelines and services using best practices for ETL/ELT, data management, and data governance.
Key Responsibilities:
- Contribute to the design, configuration, and support of data, analytics, and AI environments across Google Cloud Platform (GCP) and Microsoft Azure, including Microsoft Fabric
- Build and maintain data pipelines to ingest, cleanse, transform, and curate structured and unstructured data
- Support batch and near–real-time data ingestion and transformation workflows
- Use Infrastructure as Code (IaC) tools (e.g., Terraform) to help automate cloud environment provisioning and configuration
- Configure and support cloud services related to data ingestion, integration, messaging, CI/CD, and data processing
- Assist with data modeling and performance optimization in cloud data warehouses (e.g., partitioning and clustering in BigQuery)
- Support the setup and tuning of operational databases or data-serving layers based on defined use cases
- Implement and maintain monitoring, logging, and alerting for data pipelines and platforms
- Write and maintain data transformations using SQL and Python
- Collaborate with engineers, analysts, and product teams in an iterative, product-focused environment
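As a flavor of the cleanse-and-transform work described above, here is a minimal, illustrative Python sketch. The record shape, field names, and helper function are hypothetical, not part of the job description.

```python
# Hypothetical cleanse step: drop rows missing a primary key,
# strip whitespace, and normalize email casing before loading.

def cleanse_records(records):
    """Return cleaned copies of records that have an 'id' field."""
    cleaned = []
    for row in records:
        if not row.get("id"):
            continue  # skip rows without a primary key
        cleaned.append({
            "id": row["id"],
            "name": (row.get("name") or "").strip(),
            "email": (row.get("email") or "").strip().lower(),
        })
    return cleaned

raw = [
    {"id": 1, "name": "  Ada ", "email": "Ada@Example.com"},
    {"id": None, "name": "ghost", "email": "x@y.z"},  # dropped: no id
]
print(cleanse_records(raw))
# → [{'id': 1, 'name': 'Ada', 'email': 'ada@example.com'}]
```

In practice, transformations like this would run inside the SQL- and Python-based pipelines the role mentions (e.g., orchestrated jobs writing to BigQuery or Fabric), rather than as standalone scripts.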
Position Requirements:
- Bachelor's degree in Computer Science or Engineering from an accredited University or College.
- OR: Associate's degree in Computer Science or Engineering from an accredited University or College, with two (2) years of experience.
- Fully Remote
- W2 Candidates Only