Sr. AWS Data Architect - Terraform - Remote - 1917
Cyrten | Princeton, NJ | Full Time
$125k-159k (estimate)
Posted 1 Month Ago

Sr. AWS Data Architect - Terraform - Remote

Location: Remote

Preferred Time Zones: EST, CST, MST

Rate: DOE - W2

Length: 1 to 3 years

Manager Note: 

Top skills needed:

1. Hands-on Terraform; able to build from scratch.

2. Scripting: Unix shell, Python.

3. Infrastructure experience deploying platforms.

4. Data processing platforms such as Databricks (or Snowflake).

5. Experience with data analytics in a cloud environment.

Tip for success: Underneath the Skills section of your resume, put a Terraform section. Don't be shy; list everything you have ever done in Terraform. Terraform is such a major key to this opportunity.
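By way of illustration, "hands-on Terraform, build from scratch" typically means being able to author a working configuration starting from an empty directory. A minimal sketch (the provider version, region, and resource names below are illustrative assumptions, not details from this posting):

```hcl
# Minimal AWS footprint written from scratch.
# Region, bucket name, and tags are placeholders for illustration.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1" # assumed region
}

# An S3 bucket, e.g., a data-lake landing zone
resource "aws_s3_bucket" "landing" {
  bucket = "example-landing-zone" # placeholder name
  tags = {
    Environment = "dev"
  }
}
```

A candidate would be expected to initialize, plan, and apply configurations like this (`terraform init`, `terraform plan`, `terraform apply`) and extend them into full platform deployments.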

Qualifications:

  • Bachelor’s degree in Computer Science, Management Information Systems, Computer Engineering, or related field or equivalent work experience; advanced degree preferred
  • 7 years of experience as an AWS Data Architect designing and building large-scale enterprise solutions, in both development and deployment
  • 5 years in designing and building solutions in the cloud
  • Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB, or analogous architectures
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures
  • Expertise in Cloud Data Warehouses in Redshift, BigQuery, or analogous architectures is a plus
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
  • Refined skills using one or more scripting languages (e.g., Python, bash, etc.)
  • Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus
  • Embrace data platform thinking, design and develop data pipelines keeping security, scale, uptime, and reliability in mind
  • Expertise in relational and dimensional data modeling
  • UNIX admin and general server administration experience required
  • Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation technologies is a plus
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus
  • Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
  • Experience with leveraging CI/CD pipelines
  • Experience with Agile methodologies and ability to work in an Agile manner is preferred
  • One or more cloud certifications

Responsibilities:

  • Understand technology vision and strategic direction of business needs
  • Understand our current data model and infrastructure, proactively identify gaps and areas for improvement, and prescribe architectural recommendations focusing on performance and accessibility.
  • Partner across engineering teams to design, build, and support the next generation of our analytics systems.
  • Partner with business and analytics teams to understand specific requirements for data systems to support the development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
  • Own and develop architecture supporting translating analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing data and technology usage patterns and implementing solutions.
  • Maintain a solid grasp of the intersection between analytics and engineering, and take a proactive approach to ensure solutions demonstrate high performance, privacy, security, scalability, and reliability upon deployment.
  • Provide guidance to partners on the effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency
  • Create build automation services using Terraform, Python, and OS shell scripts.
  • Develop validation and certification processes through automation tools
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
  • Participate in developing solutions by incorporating cloud native and 3rd party vendor products
  • Participate in research, perform POCs (proofs of concept) with emerging technologies, and adopt industry best practices in the data space to advance the cloud data platform.
  • Develop data streaming, migration, and replication solutions
  • Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic, and influencing skills to gain consensus and produce the best solutions.

Please Note:

Federal client - required to be a U.S. Citizen

NO 3rd Party Candidates

NO 3rd Party Vendors

NO Sponsorship

Job Summary

Job Type: Full Time
Salary: $125k-159k (estimate)
Post Date: 04/10/2024
Expiration Date: 06/09/2024
Website: cyrten.com
Headquarters: Norristown, PA
Size: <25
