What are the responsibilities and job description for the GCP Data Engineer position at Black Rock Solutions INC?
Job Description
We are seeking an experienced GCP Data Engineer with strong hands-on expertise in building scalable data pipelines and cloud-based data solutions.
Responsibilities
- Design, develop, and maintain data pipelines and ETL processes on Google Cloud Platform (GCP).
- Work with Dataflow, Airflow (DAG creation; a minimal DAG sketch follows this list), Pub/Sub, and streaming platforms such as Kafka.
- Build scalable, reusable data engineering frameworks using Python, Flask, SQL, and related technologies.
- Implement end-to-end data warehouse and data mart solutions, including data modelling and optimization.
- Develop and enhance Python-based reusable frameworks to support ongoing data engineering initiatives.
- Collaborate with cross-functional teams to gather requirements, design solutions, and ensure smooth delivery.
- Implement CI/CD pipelines using Terraform, Git, and cloud automation tools.
- Ensure high performance, security, and reliability across all data engineering solutions.
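Since the role calls out Airflow DAG creation, here is a minimal sketch of what a Python ETL DAG on GCP might look like, assuming Airflow 2.4+ (e.g. Cloud Composer). The DAG id, task names, and task logic are hypothetical placeholders, not this employer's actual pipeline:

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+, e.g. Cloud Composer).
# All identifiers below are hypothetical illustrations.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step: a real pipeline might pull messages
    # from Pub/Sub or read files from a GCS bucket here.
    print("extracting source data")


def load(**context):
    # Placeholder load step: a real pipeline might write transformed
    # rows into a BigQuery table here.
    print("loading into the warehouse")


with DAG(
    dag_id="example_gcp_etl",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task       # extract runs before load
```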
Requirements
- Minimum 5 years of experience in Information Technology.
- Strong hands-on experience with GCP data engineering services and related tooling: Dataflow, Pub/Sub, Airflow (Cloud Composer), and Kafka.
- Solid background in ETL development and strong proficiency in SQL.
- Experience with Python (mandatory) and, ideally, Scala or Java, along with big-data tools such as Hive or Spark.
- Strong understanding of data warehousing concepts, data modelling, and end-to-end DW/DM implementation.
- Experience developing reusable frameworks using Python and Flask (see the sketch after this list).
- Working knowledge of CI/CD pipelines, especially with Terraform and Git.
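To illustrate the Python/Flask framework requirement, here is a minimal sketch of a Flask service that triggers registered pipeline jobs. The endpoint, job registry, and job names are hypothetical illustrations, not part of the posting:

```python
# Minimal Flask sketch in the spirit of a reusable pipeline-trigger
# service. All names below are hypothetical placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical registry mapping job names to callables; a real
# framework might dispatch to Dataflow or Airflow instead.
JOBS = {
    "daily_load": lambda: "triggered daily_load",
}


@app.route("/jobs/<name>", methods=["POST"])
def run_job(name: str):
    job = JOBS.get(name)
    if job is None:
        return jsonify(error=f"unknown job: {name}"), 404
    return jsonify(result=job()), 200


if __name__ == "__main__":
    app.run(port=8080)
```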
Salary: $52 - $60