What are the responsibilities and job description for the RTOI Data Engineer position at LOGIXtech Solutions?
Job Details
Our client, a leading financial services company, is hiring a RTOI Data Engineer on a contract basis.
Job ID: 83891
Work Location:
Alpharetta, GA (Hybrid Working Model)
Summary:
The real-time operational intelligence team for the client's enterprise computing platform is responsible for streaming terabytes of data daily. The client has built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop.
Our applications run both on-premises and in the cloud. Hundreds of dashboards built for business and operations teams provide real-time insights and actionable items.
We are looking for a streaming data engineer who can:
- Understand distributed systems architecture, design, and trade-offs.
- Design and develop ETL pipelines with a wide range of technologies.
- Work the full development cycle, including requirements definition, design, implementation, testing, and deployment.
- Collaborate across teams with strong communication skills.
- Learn new technologies and work independently.
Required Qualifications:
- 5+ years of application development experience, including at least 2 years of data engineering with Kafka
- Hands-on experience writing and running applications on Linux
- 5+ years of coding experience in at least one of: Python, Ruby, Java, C/C++, or Go
- SQL and database experience
Optional:
- AWS or other cloud technologies
- Elasticsearch (ELK)
Pay Rate: $60-$70/hour (W2)