What are the responsibilities and job description for the Senior Data Engineer position at Delaware North?
Delaware North is searching for a Senior Data Engineer to join our team. The Data Engineering team is tasked with collecting raw data from our various systems, transforming it into canonical models, and harmonizing and enriching it to provide trusted datasets to our consumers. This team's work is the foundation of company initiatives to automate business processes, gather insights, and make more informed decisions. The Senior Data Engineer will work closely with the business to identify and specify opportunities for highly resilient integration with existing systems using Delaware North's Data Platform.
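To give a flavor of the transformation work described above, here is a minimal sketch of a source-to-target mapping in Python. The field names, mapping, and cleanup rules are hypothetical illustrations, not Delaware North's actual models:

```python
# Hypothetical source-to-target mapping: raw transaction records are
# renamed and lightly cleaned into a canonical model.
FIELD_MAP = {
    "txn_id": "transaction_id",
    "loc": "location_code",
    "amt": "amount_usd",
}

def to_canonical(raw: dict) -> dict:
    """Apply the source-to-target mapping to one raw record."""
    record = {target: raw.get(source) for source, target in FIELD_MAP.items()}
    # Harmonize: normalize the amount to a float rounded to cents.
    if record["amount_usd"] is not None:
        record["amount_usd"] = round(float(record["amount_usd"]), 2)
    return record
```

For example, `to_canonical({"txn_id": "T1", "loc": "BUF", "amt": "12.5"})` yields `{"transaction_id": "T1", "location_code": "BUF", "amount_usd": 12.5}`. In practice, a mapping document like `FIELD_MAP` is what lets translation components be generated and validated automatically rather than hand-written per source.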
Our technology stack includes Apigee, Python, Django, Apache Airflow, MongoDB, PostgreSQL, and Amazon Web Services (Lambda, EBS, S3, SQS, ECS, RDS, EC2, and Redshift).
Our preferred location is Buffalo, NY; however, we are able to consider remote candidates.
At Delaware North, we care about our team members' personal and professional journeys. These are just some of the benefits we offer:
- Health, dental, and vision insurance
- 401(k) with company match
- Performance bonuses
- Paid vacation days and holidays
- Paid parental bonding leave
- Tuition and/or professional certification reimbursement
- Generous friends-and-family discounts at many of our hotels and resorts
Responsibilities include:
- Design and implement project-based solutions
- Implement data platform improvements and new features
- Assist the support team in resolving data platform bugs
- Interface with clients, vendors, and internal users of the data platform to help them understand the data
- Participate in group design and architecture sessions
- Author documentation for standard operating procedures, knowledge base articles, etc.
- Develop integration tests to validate solutions
- Analyze data and make recommendations to optimize current business operations
- Take part in regular agile rituals (stand-ups, sprint planning/retro, technical feasibility analysis)
Qualifications:
- Bachelor's degree in computer science, information systems, or a similar STEM field preferred
- At least 4 years of experience in data and data analytics development
- Minimum 2 years of experience with ETL processes and orchestration tools
- Experience with Databricks Delta Lake highly desired
- Ability to work in a fast-paced environment and consistently meet internal and external deadlines
- Proficiency in communicating with clients, partners, and team members
- Experience designing and developing solutions in a modern programming language such as Python or JavaScript to acquire data from internal systems and third parties
- Ability to construct source-to-target mapping documents and develop the accompanying components that automate the translations
- Working knowledge of data structures and formats such as JSON and XML
- Ability to effectively collaborate with business users and vendors to address development issues
- Ability to effectively triage and track issues through to resolution
- Experience authoring technical specifications and use case (story) documentation, such as UML, domain and entity-relationship modeling, and business process notation
- Working knowledge of data persistence methods such as RDBMS (MSSQL, PostgreSQL, MySQL)
- Working knowledge of CI/CD practices
- Passion for delivering high-quality and meaningful results
- A passion for data, continued learning, and collaboration are must-have attributes!
- Must be authorized to work in the US without sponsorship
Nice-to-have skills and experience:
- RESTful Web Services and microservices
- Integration testing and automation using a continuous integration (CI) platform
- Cloud computing analysis and implementation experience (bonus if it relates specifically to Amazon Web Services)
- NoSQL engines (MongoDB)
- Redshift
- Unix/Linux