What are the responsibilities and job description for the Senior Data Engineer position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Raas Infotek LLC, is seeking the following. Apply via Dice today!
Job Summary:
We are seeking a Senior Data Engineer with 12 years of experience designing, building, and optimizing scalable data pipelines and data architectures. The ideal candidate will have strong expertise in big data technologies, cloud platforms, and data warehousing, along with leadership capabilities.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines (ETL/ELT processes); an illustrative sketch follows this list
- Architect and implement data lakes, data warehouses, and data marts
- Work with large datasets using distributed systems (Spark, Hadoop)
- Collaborate with data scientists, analysts, and business teams for data requirements
- Optimize data flow, performance, and reliability
- Ensure data quality, governance, and security compliance
- Lead and mentor junior data engineers
- Implement real-time and batch data processing systems
- Build and manage APIs for data access and integration
- Troubleshoot and resolve complex data-related issues
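To give a concrete flavor of the pipeline work described above, here is a minimal batch ETL sketch in PySpark (one of the Big Data tools listed under Experience). The bucket paths, column names, and aggregation logic are hypothetical placeholders, not specifics of this role:

```python
# Minimal batch ETL sketch in PySpark. All paths, columns, and table
# names below are hypothetical examples, not details of this position.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events from a data lake (hypothetical path)
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: keep completed orders and aggregate daily revenue per customer
daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write to a curated zone, partitioned for downstream analytics
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_revenue/"))

spark.stop()
```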
Experience:
- 12 years of experience in the Data Engineering / Big Data domain
- Strong programming skills in Python, Java, or Scala
- Expertise in SQL & NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB)
- Hands-on experience with Big Data tools:
  - Apache Spark
  - Hadoop
  - Kafka
- Experience with ETL tools (Informatica, Talend, AWS Glue)
- Strong experience with Cloud Platforms:
  - AWS (S3, Redshift, Lambda, Glue)
  - Azure (Data Factory, Synapse)
  - Google Cloud Platform (BigQuery, Dataflow)
- Knowledge of data warehousing concepts (Snowflake, Redshift, BigQuery)
- Experience with workflow orchestration tools (Airflow); see the DAG sketch after this list
- Familiarity with CI/CD pipelines & DevOps practices
- Experience in real-time streaming (Kafka, Flink)
- Knowledge of Data Governance & Data Security
- Exposure to Machine Learning pipelines
- Experience in containerization (Docker, Kubernetes)
- Domain experience (Finance, Healthcare, Retail, etc.)
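As an illustration of the workflow-orchestration requirement above, here is a minimal Airflow DAG sketch (assumes Airflow 2.4+ for the `schedule` argument; the DAG id and task callables are hypothetical placeholders):

```python
# Minimal Airflow DAG sketch chaining extract -> transform -> load.
# DAG id, schedule, and task bodies are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and aggregate the raw data")

def load():
    print("publish curated tables to the warehouse")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages in sequence once per day
    t_extract >> t_transform >> t_load
```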