What are the responsibilities and job description for the Data Engineer (Charlotte, NC, W2 Only) position at CADRE TECHNOLOGIES SERVICES LLC?
Role: Data Engineer
Location: Hybrid in Charlotte, NC (MUST be USC, GC, OPT, or H4 EAD; not accepting H-1B candidates for this role)
Must-have skills: PySpark, AWS, Terraform, and building data pipelines
Responsibilities
Collaborate with Data Engineers, Software Engineers, Data Scientists, and Technical Leads to gather requirements and define technical solutions
Design, build, and maintain scalable data pipelines (batch & real-time) within AWS
Develop and optimize data warehouse solutions to support analytics and reporting use cases
Ensure data quality, reliability, and long-term scalability across all pipelines
Partner with Data and Solution Architects on key technical decisions
Identify data gaps and implement automated solutions to enhance data availability and usability
Manage and troubleshoot production data environments in AWS
Build and maintain ETL workflows, orchestration pipelines, and supporting infrastructure
Implement monitoring, alerting, and automated remediation for production issues
Participate in code reviews and contribute to best practices across the team
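In practice, the batch side of the pipelines described above reduces to an extract-transform-load loop. A minimal, library-free sketch (the schema and field names are invented for illustration; a production version would typically use PySpark on EMR or Glue rather than the standard library):

```python
import csv
import io
import json


def transform(row):
    # Hypothetical cleaning step: drop incomplete rows, normalize types.
    if not row.get("user_id"):
        return None
    return {"user_id": int(row["user_id"]), "amount": round(float(row["amount"]), 2)}


def run_batch(raw_csv):
    """Extract rows from CSV text, transform each, and load as JSON lines."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    records = [t for r in reader if (t := transform(r)) is not None]
    return "\n".join(json.dumps(rec) for rec in records)
```

The same extract/transform/load split maps directly onto a PySpark job: the reader becomes a DataFrame source, the transform a set of column expressions, and the JSON-lines output a sink write.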
Required Technical Skills
Experience working within AWS environments
Strong experience with AWS services, including:
Redshift, S3, EMR, Glue, Lambda, Athena
CloudWatch, CloudTrail, SNS, SQS, Step Functions, QuickSight
Experience with data warehousing solutions (Redshift, Athena)
Hands-on experience building ETL pipelines and data models
Experience with real-time and batch data processing
Proficiency in Python, Spark, PySpark, Pandas
Strong SQL skills and experience with RDBMS platforms
Experience with Kafka or messaging systems (Confluent preferred)
Familiarity with event-driven architectures
Experience with workflow orchestration tools (Airflow or Step Functions)
Experience with Infrastructure as Code (Terraform or CloudFormation)
Experience with CI/CD pipelines (Bitbucket, Concourse, or similar)
Experience with Secrets Management (Vault, AWS Secrets Manager)
Strong understanding of IAM roles, policies, and AWS security best practices
Knowledge of networking fundamentals (DNS, TCP/IP, VPN)
Experience with REST APIs and API Gateway
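To make the Lambda, SQS, and event-driven items above concrete, here is a minimal sketch of an SQS-triggered Lambda handler; the record envelope (`Records` / `body`) follows the standard SQS event shape, while the message schema and field names are invented for illustration:

```python
import json


def handler(event, context=None):
    """Hypothetical Lambda entry point for an event-driven pipeline step:
    parse each SQS message body, keep only the event type of interest,
    and return the records that would be forwarded downstream."""
    forwarded = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        if body.get("event_type") == "order_created":  # assumed schema
            forwarded.append({"order_id": body["order_id"], "status": "queued"})
    return {"batch_size": len(event.get("Records", [])), "forwarded": forwarded}
```

In a real deployment the function would be wired to the queue via an event source mapping (defined in Terraform or CloudFormation), with failures routed to a dead-letter queue and surfaced through CloudWatch alarms.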