What are the responsibilities and job description for the Sr. Software Engineer position at BizTek People, Inc. | APA International Placement Consultants?
Responsibilities:
- Design and develop a technical solution for our analytics platform on the AWS cloud that aligns with architectural and quality standards and integrates effectively with adjacent and tertiary solutions/technologies.
- Independently identify, define, direct, and/or perform analysis to resolve complex, first-time issues in your area(s) of expertise.
- Ensure knowledge sharing within the team and across teams.
- Bring technical knowledge into the organization from external sources and link new, emerging technologies with business needs.
- Implement a full DevOps culture of build and test automation with continuous integration and deployment.
Required Job Qualifications:
- Linux: 5 or more years of Unix systems engineering experience with Red Hat Linux, CentOS, or Ubuntu.
- Big Data: 3 or more years of operational experience with the Hadoop stack (Spark, Hive, Ranger, Sentry, HDFS).
- AWS: Working experience with and a good understanding of the AWS environment, including VPC, EMR, EC2, EBS, S3, RDS, SQS, CloudFormation, Lambda, and HBase.
- Containers: Hands-on experience with container technologies such as Docker and Kubernetes.
- Programming: Experience programming microservices or APIs in Java, Python, or Scala.
- AWS EMR: Experience in Amazon EMR
cluster configuration.
- DevOps Automation: Experience with DevOps orchestration/configuration management and CI/CD tools (Jenkins, CircleCI, Atlantis, Puppet, Troposphere, Terraform, Serverless, etc.).
- Networking: Working knowledge of TCP/IP networking; SMTP, HTTP, and HTTPS; load balancers (ELB, HAProxy); NGINX; and high-availability architecture.
- Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, SignalFx, and Splunk.
- Version Control: Working experience with one or more version control platforms (Bitbucket, GitHub).
- ETL: Experience with a job scheduler such as Airflow or AWS Data Pipeline; Airflow experience is preferred.
- Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment.
- Backup/Recovery: Experience with the design and implementation of big data backup/recovery solutions.
- Ability to keep systems running at peak performance and to apply operating system upgrades, patches, and version upgrades as required.
- Lead other admins and platform engineers through design and implementation decisions to balance strategic design with tactical needs.
- Research and deploy new tools and frameworks to build a sustainable big data platform.
- Agile/Scrum/Kanban
experience.
- Demonstrated communication
and interpersonal skills.
- Proven track record of
success in fast-moving organizations with complex technology applications.
- Collaborate with Project Managers, Product Managers, QA, and Engineering teams to deliver results.
Educational Qualifications:
- MS/BS in Computer Science
or related field.