What are the responsibilities and job description for the Data Architect - Remote / Telecommute position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Cynet Systems, is seeking the following. Apply via Dice today!
We are looking for a Data Architect - Remote / Telecommute for our client in Jersey City, NJ.
Job Title: Data Architect - Remote / Telecommute
Job Location: Jersey City, NJ
Job Type: Contract
Job Description:
Pay Range: $70/hr - $75/hr
Responsibilities:
- Data Pipeline Development: Design, build, and maintain scalable data pipelines to ingest, process, and transform structured and unstructured data.
- Data Modeling: Create optimized data models to support analytics, reporting, and machine learning workflows.
- ETL/ELT Processes: Develop and manage ETL/ELT workflows to ensure clean, reliable, and high-quality data.
- Database Management: Work with relational and NoSQL databases to ensure efficient storage and retrieval of large datasets.
- Cloud Data Solutions: Implement and optimize data solutions on cloud platforms like AWS, Azure, or Google Cloud Platform.
- Data Quality & Governance: Ensure data integrity, security, compliance, and quality across systems.
- Collaboration: Partner with data scientists, analysts, and software engineers to deliver reliable data infrastructure.
- Automation: Streamline data processes using orchestration tools and automation frameworks.
- Monitoring & Optimization: Implement monitoring, logging, and performance tuning of data systems.
- Documentation: Maintain detailed documentation of data pipelines, architecture, and workflows.
Requirements:
- Programming Skills: Proficiency in Python and SQL; familiarity with Java/Scala.
- Data Pipelines & ETL: Experience with ETL tools (Airflow, DBT, Informatica, Talend).
- Big Data Frameworks: Knowledge of Spark, Hadoop, Kafka, or Flink.
- Data Warehousing: Hands-on experience with Snowflake, Redshift, BigQuery, or Synapse.
- Cloud Platforms: Proficiency in AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or Google Cloud Platform (BigQuery, Dataflow).
- Databases: Strong experience with relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
- Data Modeling: Expertise in designing star and snowflake schemas and OLTP/OLAP systems.
- DevOps & Version Control: Familiarity with Git, CI/CD pipelines, and Infrastructure as Code (Terraform).
- Data Governance & Security: Knowledge of GDPR, HIPAA, encryption, role-based access controls.
- Analytical Skills: Strong problem-solving and optimization skills in handling big data.
- Collaboration & Communication: Ability to work in cross-functional teams and clearly document technical processes.