What are the responsibilities and job description for the Senior Data Engineer position at E.A. Buck Financial Services?
E.A. Buck is a fast-growing financial services company with dual headquarters in Denver and Honolulu. We’re seeking a Senior Data Engineer to join our team in Denver. The Senior Data Engineer will assist in the development and implementation of our enterprise data platform and data warehouse. This includes building scalable data pipelines that integrate data primarily from Salesforce, Excel, and other disparate sources into a unified, cloud-based analytics platform, while establishing the robust data governance frameworks and security controls essential for financial services compliance. The ideal candidate will bring 3 years of data engineering experience with expertise in modern cloud data platforms (Snowflake, AWS, Azure), SQL, Python, and ETL/ELT tools, combined with a strong understanding of data governance, RBAC implementation, data masking, and regulatory requirements (SEC, FINRA, SOC 2).
No agencies please.
Salary / Benefits Package:
- Base salary range of $110,000 to $130,000 per year, based on experience.
- Competitive benefits package, including 100% employer-paid health, dental, and vision insurance.
- Eligibility to enroll in a health savings account (HSA), flexible spending account (FSA), and more.
- Paid time off (PTO) and paid holidays.
- Join a mission-driven company and participate in our 401(k) plan with generous matching.
Duties and Responsibilities:
Data Engineering:
- Data Integration – Implement ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes to move and integrate data from disparate sources (a minimal pipeline sketch follows this list).
- Data Modeling – Develop and implement data models that ensure the data warehouse is properly structured and organized for analysis.
- Data Quality & Integrity – Ensure the accuracy, completeness, and reliability of data through validation, quality checks, and error handling.
- Performance Optimization – Monitor and tune the performance of databases, data loads, and queries to ensure efficiency and scalability.
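For context on the integration and quality duties above, here is a purely illustrative sketch of an extract-validate-load step in Python using pandas and SQLAlchemy (both named in the requirements below). The file name, staging table, and connection URL are hypothetical placeholders, not E.A. Buck systems, and a production pipeline would run through whatever orchestration and ELT tooling the team standardizes on.

```python
# Illustrative only: a minimal extract-validate-load step with pandas + SQLAlchemy.
# The file path, table name, and connection URL are placeholders.
import pandas as pd
from sqlalchemy import create_engine

WAREHOUSE_URL = "snowflake://<user>:<password>@<account>/<database>/<schema>"  # placeholder


def load_accounts(excel_path: str = "accounts_export.xlsx") -> None:
    # Extract: read a spreadsheet export into a DataFrame.
    df = pd.read_excel(excel_path)

    # Quality checks: enforce required columns, drop exact duplicates,
    # and fail fast on missing key fields.
    required = {"account_id", "account_name", "balance"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {missing}")
    df = df.drop_duplicates()
    if df["account_id"].isna().any():
        raise ValueError("Null account_id values found; rejecting load")

    # Load: append validated rows to a staging table in the warehouse.
    engine = create_engine(WAREHOUSE_URL)
    df.to_sql("stg_accounts", engine, if_exists="append", index=False)


if __name__ == "__main__":
    load_accounts()
```

In practice, checks like these would more likely live in a dedicated validation layer (for example, DBT tests) than inline in a load script; the sketch simply shows the shape of the work.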
Data Governance:
- Security Framework – Design and implement role-based access controls and manage role and object ownership to safeguard sensitive data (see the sketch after this list).
- Data Protection – Apply column-level and row-level security, and manage metadata to ensure proper protection and governance of data assets.
- Network Security – Enforce IP access restrictions and integrate single sign-on (SSO) for secure data access.
- Performance Optimization – Optimize data processing by managing batch loads, data partitioning, and cluster keys, and by monitoring resource usage to maintain system efficiency.
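Much of the governance work above is expressed directly in the warehouse. The following is a hedged sketch of what RBAC plus column- and row-level protection can look like on Snowflake, issued through the snowflake-connector-python client; every role, table, and policy name is invented for illustration, and masking and row access policies require an appropriate Snowflake edition.

```python
# Illustrative only: role-based access plus masking / row access policies in
# Snowflake via the Python connector. All object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",  # placeholder credentials
    user="<user>",
    password="<password>",
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
)

statements = [
    # Role-based access control: a read-only analyst role scoped to one schema.
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST_RO",
    # Column-level protection: mask SSNs for everyone except a compliance role.
    """CREATE MASKING POLICY IF NOT EXISTS MASK_SSN AS (val STRING) RETURNS STRING ->
         CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE') THEN val ELSE 'XXX-XX-XXXX' END""",
    "ALTER TABLE ANALYTICS.REPORTING.CLIENTS MODIFY COLUMN SSN SET MASKING POLICY MASK_SSN",
    # Row-level protection: advisors see only their own book of business.
    """CREATE ROW ACCESS POLICY IF NOT EXISTS ADVISOR_ROWS AS (advisor_id STRING) RETURNS BOOLEAN ->
         CURRENT_ROLE() = 'COMPLIANCE' OR advisor_id = CURRENT_USER()""",
    "ALTER TABLE ANALYTICS.REPORTING.ACCOUNTS ADD ROW ACCESS POLICY ADVISOR_ROWS ON (ADVISOR_ID)",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```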
Job Requirements:
- 3 years of data engineering experience
- Expert-level SQL skills including complex queries, performance optimization, and database design
- Hands-on Salesforce integration experience with REST/SOAP APIs, Bulk API, and data extraction patterns (a minimal extraction sketch follows this list)
- Cloud data warehouse expertise (Snowflake preferred) including security, optimization, and cost management
- Advanced Python proficiency with data libraries (Pandas, SQLAlchemy, FastAPI) and workflow orchestration tools
- Production experience with a modern data stack, including DBT, Fivetran, Git, and CI/CD pipelines
- CRM and sales operations experience with complex data modeling and business process automation
- Financial services industry experience preferred
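To illustrate the Salesforce and Python requirements, the sketch below uses the open-source simple_salesforce client with the Bulk API to pull CRM records into a DataFrame. This is an assumed tool choice, not a description of E.A. Buck's actual stack; the credentials, object, and field names are placeholders, and a managed connector such as Fivetran could cover the same extraction.

```python
# Illustrative only: extracting CRM records through the Salesforce Bulk API with
# simple_salesforce, then staging them as a DataFrame. Credentials and the SOQL
# query are placeholders.
import pandas as pd
from simple_salesforce import Salesforce

sf = Salesforce(
    username="<user>@example.com",      # placeholder credentials
    password="<password>",
    security_token="<security_token>",
)

# Bulk API query: suited to large extracts that would strain REST API limits.
records = sf.bulk.Account.query("SELECT Id, Name, AnnualRevenue FROM Account")

# Stage as a DataFrame; drop Salesforce's per-record metadata column if present.
df = pd.DataFrame(records).drop(columns=["attributes"], errors="ignore")
print(f"Extracted {len(df)} Account records")
```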