What are the responsibilities and job description for the Principal Engineer position at Anblicks?
Job Duties
The Principal Engineer is responsible for leading the design, development, and implementation of advanced engineering solutions to support enterprise-scale systems and applications. Specific duties include, but are not limited to, the following:
- Lead the end-to-end software architecture design and solution development process across complex, cross-functional projects using technologies such as Python, C, SQL, and JavaScript.
- Design and implement scalable, secure, cloud-native applications and microservices using Flask, deployed in Docker containers and orchestrated via AWS EC2 and GCP services, including BigQuery, Dataflow, Cloud Storage, Composer, and Airflow.
- Develop and optimize data pipelines and ETL workflows using Hadoop, GCP Dataflow, and Airflow, with seamless integration to structured and unstructured data sources such as PostgreSQL and MongoDB.
- Build REST APIs and JSON-based service integrations using Flask, supporting real-time data exchange across distributed systems.
- Integrate email services and notifications via SendGrid to enhance application communication layers.
- Design and execute automated test suites with Pytest, ensuring high reliability and code coverage across APIs, backend services, and data workflows.
- Establish CI/CD pipelines using Git, Jenkins, and GCP Composer, driving continuous integration and automated deployments with robust testing.
- Collaborate closely with product managers, QA engineers, and DevOps teams to deliver full-stack solutions across Windows and Linux environments.
- Build intuitive, responsive front-end components using HTML, CSS, Bootstrap, and JavaScript, enabling seamless user experiences integrated with backend services.
- Work with software engineers to promote best practices in secure coding, test automation, and clean architecture design.
- Conduct rigorous code reviews, system performance assessments, and tuning to ensure enterprise-grade performance and maintainability.
- Document application architectures, system interfaces, and cloud integration patterns using standardized technical documentation.
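As a flavor of the pipeline and testing work described above, the following is a minimal, illustrative extract-transform-load sketch in plain Python with a Pytest-style assertion. The record fields and function names here are hypothetical examples, not part of the posting or any Anblicks system; a real pipeline would read from sources such as Cloud Storage and write to a warehouse such as BigQuery.

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON records (e.g., staged in Cloud Storage)."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Keep records with the (hypothetical) 'email' field and normalize it."""
    return [
        {**r, "email": r["email"].lower()}
        for r in records
        if r.get("email")
    ]

def load(records):
    """Stand-in for writing rows to a warehouse table; returns the row count."""
    return len(records)

# Pytest-style end-to-end check of the pipeline on sample data.
raw = ['{"id": 1, "email": "A@X.COM"}', '{"id": 2}', ""]
rows = transform(extract(raw))
assert rows == [{"id": 1, "email": "a@x.com"}]
assert load(rows) == 1
```

In practice each stage would be an Airflow task, with the assertions promoted to a Pytest suite run in CI before deployment.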
Stay current with evolving technologies in cloud computing, containerization, data processing, and software engineering methodologies to drive continuous improvement and innovation.
Job Requirements
Bachelor's degree in Computer Science, Computer Information Systems, Data Engineering, or an Engineering-related or Technical-related field, plus 5 years of progressively responsible post-baccalaureate experience. A foreign degree equivalent is acceptable. In lieu of the above, we will also accept a Master's degree in Computer Science, Computer Information Systems, Data Engineering, or an Engineering-related or Technical-related field, plus 2 years of experience. We will also accept any suitable combination of education, training, and/or experience. Experience should include a minimum of 2 years of working with Python; SQL; MongoDB; AWS EC2; Flask; SendGrid; Docker; Git; Hadoop; GCP services, including BigQuery, Airflow, Dataflow, Cloud Storage, and Composer; CI/CD; JSON; data pipelines; REST APIs; and Pytest.
HOURS: M-F, 8:00 a.m. – 5:00 p.m.
JOB LOCATION: Dallas, Texas. Travel is not required, but candidates must be willing to relocate to unanticipated locations across the country based on company contract demands.
CONTACT: Email resume referencing job code# PE03132026ANB to Maruthi Technologies Inc. DBA Anblicks at recruitment@anblicks.com