What are the responsibilities and job description for the Big Data Engineer - W2 Only position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, NGTalentTech Group LLC, is seeking the following. Apply via Dice today!
Role: Big Data Engineer
Location: Onsite - Omaha, NE
Duration: 12 Months with possible extension
Duties/Responsibilities
- Design and implement data pipelines and transformations using big data technologies such as Spark, Hadoop, and related ecosystems
- Develop and maintain complex SQL queries for data extraction and reporting, and optimize query performance and scalability
- Design and develop software components that integrate with our identity graphs, client data, and keying processes
- Participate in capacity monitoring and planning
- Develop and maintain technical documentation
- Ensure data integrity and quality throughout the development process, including conducting data research and analysis to identify trends and patterns
- Work with cross-functional teams to ensure software systems meet business requirements and are scalable for future growth, including developing predictive models and enhancing data quality
- The individual must understand all regulations and laws applicable to their assigned roles and responsibilities. Additionally, the individual will be responsible for developing, implementing, and regularly maintaining the policies and procedures that govern the work of those roles and responsibilities, including compliance with the security requirements of ePHI.
Required Skills/Qualifications
- Experience with large-scale data processing and analytics platforms, including Hadoop, Spark, or related ecosystems
- Knowledge of distributed computing concepts and experience with cloud-based infrastructure
- Experience with fuzzy logic matching and tools
- Experience with AWS infrastructure
- 8 years of experience working with Big Data technologies, including Spark and SQL databases
- Proficiency in programming languages such as Scala, Java, Python, Shell, or a combination thereof
- Expertise in data modeling, database design, and development
- Experience in building and maintaining data transformations (ETL) using SQL, Python, Scala, or Java
- Ability to analyze, troubleshoot, and performance-tune queries
- Ability to identify problems and effectively communicate solutions to peers and management
- Strong analytical and problem-solving skills to address complex software and data challenges
- Ability to work effectively with cross-functional teams and communicate technical designs and solutions to non-technical stakeholders.