What are the responsibilities and job description for the Data Architect position at Tror - AI for everyone?
Role: AWS Data Architect
Location: Alameda, CA (Onsite; local candidates only)
Experience: 12 Years
Job Type: Contract/W2
Duration: Long Term
Key Skills: AWS, Data Engineering, Tableau, Salesforce Integration
A hands-on technical leader who can design, build, and scale data platforms on AWS, integrate Salesforce data, and enable impactful business insights through Tableau.
Role Overview
We are looking for a highly experienced AWS Data Architect with strong expertise in data engineering, Salesforce integration, and Tableau. The ideal candidate will design and implement scalable data platforms, enable seamless data integration, and drive analytics solutions for business insights.
Key Responsibilities
- Design and implement enterprise-level data architecture on AWS, ensuring scalability, performance, and security
- Build and manage data lakes / lakehouse solutions using AWS services (S3, Glue, Redshift, Athena, EMR, Lambda, etc.)
- Develop and optimize ETL/ELT data pipelines (batch and real-time)
- Integrate and synchronize data between Salesforce (Sales Cloud / Service Cloud) and AWS platforms
- Define data models (conceptual, logical, physical) including dimensional modeling (Star/Snowflake schema)
- Establish and maintain BI architecture using Tableau, including dashboards, data models, and performance optimization
- Ensure data quality, governance, and security (IAM, encryption, masking)
- Implement CI/CD pipelines and Infrastructure as Code (Terraform/CloudFormation)
- Lead technical design, conduct code reviews, and mentor data engineering teams
- Collaborate with cross-functional teams to translate business requirements into scalable data solutions
- Drive data platform modernization and cloud migration initiatives
- Ensure monitoring, logging, and disaster recovery strategies are in place
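To make the dimensional-modeling responsibility above concrete, here is a minimal star-schema sketch: one fact table joined to two dimensions, then a typical aggregate query. The table and column names are hypothetical, and SQLite stands in for a warehouse such as Redshift.

```python
import sqlite3

# Hypothetical star schema: fact_sales plus two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,
    calendar_date  TEXT,
    fiscal_quarter TEXT
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "West"), (2, "Globex", "East")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", "Q1")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 20240101, 100.0), (2, 2, 20240101, 250.0)])

# Typical star-schema query: aggregate the fact by a dimension attribute.
cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.region
    ORDER BY c.region
""")
rows = cur.fetchall()
print(rows)  # [('East', 250.0), ('West', 100.0)]
```

The same shape scales directly to a snowflake schema by normalizing the dimension tables further.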
Required Skills
- 12 years of IT experience, including 10 years in Data Engineering / Analytics
- 6 years of hands-on experience with AWS data services (S3, Glue, Redshift, Athena, Lambda, EMR, Kinesis, RDS)
- Strong expertise in SQL and data modeling
- 3 years of experience in Salesforce integration using APIs, connectors, or middleware
- 4 years of experience with Tableau (dashboard development, data visualization, performance tuning)
- Experience with real-time and batch data processing architectures
- Strong programming skills in Python / PySpark
- Experience with CI/CD, DevOps practices, and Agile methodologies
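As a sketch of the batch ETL/ELT work described above, the extract-transform-load stages can be shown in pure Python. This is a toy stand-in for a Glue/PySpark job: all function names and the sample records are hypothetical, and the list `sink` stands in for a Redshift target.

```python
from datetime import datetime

def extract(raw_rows):
    """Extract: in production this would read from S3 or a Salesforce export."""
    return raw_rows

def transform(rows):
    """Transform: drop malformed rows and normalize field types."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "account": row["account"].strip().title(),
                "amount": float(row["amount"]),
                "closed": datetime.strptime(row["closed"], "%Y-%m-%d").date(),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad records instead
    return cleaned

def load(rows, sink):
    """Load: append to the target store (a list here, a warehouse in production)."""
    sink.extend(rows)
    return len(rows)

sink = []
raw = [
    {"account": " acme ", "amount": "100.5", "closed": "2024-01-15"},
    {"account": "globex", "amount": "not-a-number", "closed": "2024-02-01"},
]
loaded = load(transform(extract(raw)), sink)
print(loaded, sink[0]["account"])  # 1 Acme
```

Keeping the three stages as separate functions mirrors how a Glue job script is typically organized, and makes each stage unit-testable on its own.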
Preferred Skills
- Experience with streaming tools like Kafka or Kinesis
- Knowledge of data governance and compliance frameworks
- Experience working with US-based clients
- Exposure to modern data architectures (Lakehouse, Data Mesh)
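For the streaming experience listed above, the core idea behind tools like Kafka or Kinesis consumers is windowed aggregation over timestamped events. A minimal pure-Python sketch of a tumbling (fixed-size, non-overlapping) window, with hypothetical event data:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed-size window, keyed by window start time."""
    windows = defaultdict(int)
    for ts, _payload in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        windows[window_start] += 1
    return dict(windows)

# Events as (timestamp_seconds, payload) pairs.
events = [(0, "a"), (3, "b"), (5, "c"), (11, "d")]
counts = tumbling_window_counts(events, 5)
print(counts)  # {0: 2, 5: 1, 10: 1}
```

A production consumer would also handle late-arriving events and checkpoint its progress, which this sketch omits.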
Soft Skills
- Strong communication and stakeholder management skills
- Ability to translate complex technical concepts into business-friendly language
- Problem-solving mindset and analytical thinking
- Self-driven and adaptable in fast-paced environments
- Experience working in Agile/Scrum teams (Jira or similar tools)