What are the responsibilities and job description for the Analytics Engineer position at hackajob?
hackajob is collaborating with Verisk to connect them with exceptional professionals for this role.
Description
As an Analytics Engineer, you will be responsible for transforming raw data from our applications into structured datasets for large-scale analysis and machine learning models. You will work closely with our development, data science, and business intelligence teams to ensure data integrity, quality, and accessibility.
Responsibilities
- Data Modeling: Research and work with business stakeholders to develop our data warehouse model.
- Data Transformation: Clean, transform, and enrich data to create high-quality datasets suitable for analysis and machine learning.
- Collaboration: Work closely with product teams, software developers, data scientists, and analysts to understand data needs and deliver innovative solutions.
- Data Management: Ensure data accuracy, consistency, and reliability across all datasets.
- Optimization: Optimize data processes for performance and scalability.
- Documentation: Maintain comprehensive documentation of transformation logic and lineage.
Requirements
- Educational Background: Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience: 3 years of experience as an Analytics Engineer or in a similar role.
- Communication Skills: Strong ability to work with technical and non-technical audiences to translate business requirements into data models.
- Technical Proficiency:
- Data Warehousing: Knowledge of data warehousing concepts and solutions (e.g., Redshift, Snowflake).
- Data Modeling: Experience with modern data modeling practices, ideally dimensional modeling.
- Programming Languages: Proficiency in SQL and familiarity with Python.
- Data Processing: Experience with ETL tools and frameworks (e.g., Apache Airflow, Luigi, dbt).
- Database Management: Strong knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Version Control: Proficient with version control systems (e.g., Git).
- Machine Learning: Understanding of machine learning concepts and experience working with data for ML model training.
- AI: Familiarity with and enthusiasm for cutting-edge analytical tooling such as large language models and prompt engineering.
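To give a concrete sense of the dimensional modeling and data transformation skills listed above, here is a minimal, self-contained sketch: it splits a hypothetical raw orders extract into a customer dimension and an order fact table (a simple star schema) using SQLite so the example runs anywhere. The table and column names are illustrative only, not part of any real Verisk schema.

```python
import sqlite3

# Hypothetical raw data landing in a warehouse before transformation.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE raw_orders (order_id INTEGER, customer_name TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "Acme", 120.0), (2, "Acme", 80.0), (3, "Globex", 50.0)],
)

# Dimension table: one row per distinct customer, with a surrogate key.
cur.execute("""
    CREATE TABLE dim_customer AS
    SELECT ROW_NUMBER() OVER (ORDER BY customer_name) AS customer_key,
           customer_name
    FROM (SELECT DISTINCT customer_name FROM raw_orders)
""")

# Fact table: one row per order, referencing the dimension by key.
cur.execute("""
    CREATE TABLE fact_orders AS
    SELECT r.order_id, d.customer_key, r.amount
    FROM raw_orders r
    JOIN dim_customer d ON d.customer_name = r.customer_name
""")

# An analysis query over the modeled tables: revenue per customer.
revenue = cur.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_orders f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name ORDER BY d.customer_name
""").fetchall()
print(revenue)  # [('Acme', 200.0), ('Globex', 50.0)]
```

In practice a tool such as dbt would express each of these `CREATE TABLE AS SELECT` steps as a versioned, documented model, which is where the version-control and documentation responsibilities above come in.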
Salary: $100,000 - $140,000