What are the responsibilities and job description for the Power BI Developer position at Sacrum Technologies LLC?
Introduction
We are seeking a versatile BI Developer to join our Operations Business Intelligence team. This role is centered on delivering high-impact Power BI reporting that turns raw data into actionable executive insights across our core operational areas: Collections, Fleet, Post Collections, and Dispatch.
Responsibilities
- Power BI Reporting & Development (Primary Focus):
- Visualization Excellence: Design executive-grade dashboards that prioritize UX/UI, ensuring that complex operational data across Collections, Fleet, Post Collections, and Dispatch is digestible and visually compelling.
- DAX & Power Query: Author advanced DAX measures and utilize Power Query (M) for sophisticated data shaping and business logic implementation.
- Report Lifecycle Ownership: Manage the full development cycle from prototype through UAT to production deployment, ensuring quality and accuracy at every stage.
- Security & Governance: Administer Power BI workspaces and implement Row-Level Security (RLS) to ensure data is accessible only to authorized stakeholders.
- Leadership & Strategic Partnership:
- Requirements Discovery: Lead collaborative sessions with senior leadership across Collections, Fleet, Post Collections, and Dispatch to define KPIs and translate business objectives into technical specifications.
- Data Storytelling: Present and demo reports to executive stakeholders, focusing on the "why" behind the data and providing recommendations based on findings.
- Agile Collaboration: Work within Azure DevOps to track stories, manage backlogs, and provide transparency into the development roadmap.
- Data Engineering & Architecture:
- Fabric & Snowflake Ecosystem: Build and optimize end-to-end data pipelines leveraging MS Fabric (Lakehouse/Warehouse) and Snowflake to support real-time and batch reporting across all operational areas.
- Advanced Modeling: Design, implement, and maintain robust data models (Star Schema) that ensure high performance and data integrity across the organization.
- Optimization: Perform root-cause analysis on slow-running reports; optimize SQL queries in Snowflake and DAX expressions in Power BI to ensure sub-second response times.
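As a purely illustrative sketch of the Snowflake-side modeling and tuning work described above (the table, column, and key names here are invented for the example, not part of any real schema):

```sql
-- Hypothetical fact table at the center of a star schema for Collections.
CREATE TABLE fct_collections (
    route_key       INT,            -- foreign key to a route dimension
    date_key        INT,            -- foreign key to a shared date dimension
    vehicle_key     INT,            -- foreign key to a fleet/vehicle dimension
    tons_collected  NUMBER(10,2),
    stops_completed INT
)
-- Cluster on the date key so period-filtered reports prune micro-partitions.
CLUSTER BY (date_key);

-- A report query that benefits from pruning: aggregate a recent window only.
SELECT d.fiscal_month,
       SUM(f.tons_collected)  AS total_tons,
       SUM(f.stops_completed) AS total_stops
FROM   fct_collections f
JOIN   dim_date d ON d.date_key = f.date_key
WHERE  f.date_key >= 20250101        -- predicate on the clustering key
GROUP  BY d.fiscal_month;
```

Filtering on the clustering key lets Snowflake skip micro-partitions entirely, which is often the difference between a multi-second and a sub-second report refresh.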
Required Technical Skills:
- Power BI: Expert-level proficiency (DAX, Power Query, Service).
- SQL: Expert-level SQL for complex data transformation and validation.
- Snowflake: Experience with SnowSQL, partitioning, and warehouse optimization.
- MS Fabric: Familiarity with OneLake, Data Factory, and the Fabric environment.
Soft Skills:
- Executive Presence: Confidence in presenting to and taking feedback from C-suite leadership.
- Curiosity: A proactive desire to understand the business operations behind the data — particularly the workflows driving Collections, Fleet, Post Collections, and Dispatch.
- Clarity: Ability to explain complex technical concepts to non-technical audiences.
- Azure DevOps: Experience tracking stories and managing backlogs for project work.
Preferred Qualifications:
- Certification in PL-300 (Microsoft Data Analyst) or SnowPro Core.
- Prior experience in a client-facing or leadership-facing internal consultancy role.
- Familiarity with logistics or field operations.
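For context on the kind of expert-level SQL validation work implied above, a hedged sketch of a reconciliation query (all table and column names are invented for this illustration):

```sql
-- Reconciliation check: row counts and totals in the reporting model
-- should match the raw source within each load date.
SELECT COALESCE(s.load_date, m.load_date) AS load_date,
       s.src_rows,  m.model_rows,
       s.src_tons,  m.model_tons
FROM (
    SELECT load_date, COUNT(*) AS src_rows, SUM(tons) AS src_tons
    FROM   raw_collections
    GROUP  BY load_date
) s
FULL OUTER JOIN (
    SELECT load_date, COUNT(*) AS model_rows, SUM(tons_collected) AS model_tons
    FROM   fct_collections
    GROUP  BY load_date
) m ON m.load_date = s.load_date
WHERE  COALESCE(s.src_rows, -1) <> COALESCE(m.model_rows, -1)
   OR  COALESCE(s.src_tons, -1) <> COALESCE(m.model_tons, -1);
```

Any rows returned flag a load date where the reporting model diverges from the source, which is the kind of check that catches pipeline defects before they reach an executive dashboard.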