What are the responsibilities and job description for the Data Engineer position at Veterans Sourcing Group?
Staff Data Engineer

- Pay Rate: $80 - $82/hr. W2
- Hybrid required; Hartford, CT preferred, or Charlotte, NC
- Hands-on experience in Snowflake; primary skill set is Snowflake
- Hybrid position: 3 days in the office in Hartford or Charlotte
- Must be prepared for an in-person interview
- Data mapping and data modeling
- Coordinate with stakeholders to understand what the new model in Snowflake will look like
- Putting together a new agile team of 5 to 6 people
- Red flag on resume: listing every technology; the team is primarily looking for someone who can explain what they worked on and the challenges they faced
- Insurance experience is an added advantage; they would choose a candidate with P&C experience over one without
- Coding assessment in interview: SQL, Snowflake

Responsibilities:

- Serve as subject matter expert and/or technical lead for large-scale data products.
- Drive end-to-end solution delivery across multiple platforms and technologies, leveraging ELT solutions to acquire, integrate, and operationalize data.
- Partner with architects and stakeholders to define and implement pipeline and data product architecture, ensuring integrity and scalability.
- Communicate risks and trade-offs of technology solutions to senior leaders, translating technical concepts for business audiences.
- Build and enhance data pipelines using cloud-based architectures.
- Design simplified data models for complex business problems.
- Champion Data Engineering best practices across teams, implementing leading big data methodologies (AWS, Hadoop/EMR, Spark, Snowflake, Talend, Informatica) in hybrid cloud/on-prem environments.
- Operate independently while fostering a collaborative, transformation-focused mindset.
- Work effectively in a lean, fast-paced organization, leveraging Scaled Agile principles.
- Promote code quality management, FinOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications:

- 5 years of data engineering experience.
- 2 years developing and operating production workloads in cloud infrastructure.
- Bachelor’s degree in Computer Science, Data Science, Information Technology, or related field.
- Hands-on experience with Snowflake (including SnowSQL, Snowpipe).
- Expert-level skills in AWS services, Snowflake, Python, Spark (certifications are a plus).
- Proficiency in ETL tools such as Talend and Informatica.
- Strong knowledge of Data Warehousing (modeling, mapping, batch and real-time pipelines).
- Experience with DataOps tools (GitHub, Jenkins, UDeploy).
- Familiarity with P&C Commercial Lines business.
- Knowledge of legacy tech stack: Oracle Database, PL/SQL, Autosys, Hadoop, stored procedures, Shell scripting.
- Experience using Agile tools like Rally.
- Excellent written and verbal communication skills to interact effectively with technical and non-technical stakeholders.
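The posting mentions an in-interview coding assessment covering SQL and Snowflake. As a purely hypothetical warm-up for the SQL portion (the actual assessment content is not described), the sketch below runs a GROUP BY / HAVING aggregation of the kind such screens often cover; it uses Python's built-in sqlite3 in place of Snowflake, and the table, columns, and P&C-flavored sample data are invented for illustration.

```python
# Hypothetical SQL practice snippet; sqlite3 stands in for Snowflake,
# and the policies table and its data are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE policies (policy_id INTEGER, line TEXT, premium REAL);
INSERT INTO policies VALUES
  (1, 'Commercial Auto',   1200.0),
  (2, 'Commercial Auto',   1900.0),
  (3, 'General Liability',  900.0),
  (4, 'General Liability', 2100.0),
  (5, 'Workers Comp',       700.0);
""")

# Total premium per line of business, keeping only lines whose
# total exceeds a threshold, highest total first.
rows = conn.execute("""
    SELECT line, SUM(premium) AS total_premium
    FROM policies
    GROUP BY line
    HAVING total_premium > 1000
    ORDER BY total_premium DESC
""").fetchall()

for line, total in rows:
    print(f"{line}: {total:.0f}")
# Commercial Auto: 3100
# General Liability: 3000
```

The same GROUP BY / HAVING / ORDER BY structure carries over directly to Snowflake SQL; only the connection setup would differ.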
Salary: $80 - $82/hr.