What are the responsibilities and job description for the Senior BI Developer & Analytics Engineer position at Wecom Fiber?
Job Description
Senior BI Developer & Analytics Engineer
Reports to: VP of Corporate Finance & Data
Location: Hybrid — Phoenix, AZ
Type: Full-Time
Why This Role Exists
We've built a modern data stack — Snowflake, Tableau, Matillion, Sigma, and Secoda — and it's generating real value. Now we need someone to own it. This role exists to take day-to-day execution of our reporting layer and data governance program off the VP's plate, so the business gets a dedicated expert, and the platform gets the sustained attention it deserves. Success looks like a reporting layer stakeholders trust, a data catalog teams actually use, and a governance foundation that scales with us.
Who You Are
You're the kind of person who gets genuinely curious about why a number looks off — and doesn't stop until you know. You enjoy the intellectual challenge of translating messy business questions into clean, trustworthy analytical outputs, and you take real pride in the quality of what you ship. You work best when you own your domain, set your own standard, and aren't waiting for someone to tell you what needs fixing.
You're a strong communicator — not in a polished, corporate sense, but in the way that actually matters: you can sit with a Finance or Operations stakeholder, understand what they're really asking for, push back when the ask is wrong, and explain your work in plain language. You're comfortable with ambiguity and changing priorities, and you don't need a perfect brief to make progress. When something breaks or a requirement shifts, you adapt.
Accountability isn't something imposed on you — it's something you hold yourself to, often more than anyone else would. You'll fit right in here. We operate in an environment built on trust, humor, and the shared understanding that we drive ourselves the hardest.
Key Responsibilities:
- Own the Tableau environment end-to-end: development, performance tuning, version control, and quality standards for dashboards and data sources across operational, financial, and infrastructure reporting
- Maintain and evolve our KPI library — partnering with business stakeholders to ensure definitions are documented, agreed upon, and consistently reflected across reporting surfaces
- Drive data governance adoption using Secoda as our primary catalog tool — including metadata management, lineage documentation, data classification, and onboarding business teams to self-service discovery
- Enforce and refine Snowflake RBAC policies, working with data engineering to ensure access controls align with data sensitivity, compliance requirements, and least-privilege principles
- Serve as the first line of quality assurance for the reporting layer — catching data issues before stakeholders do, diagnosing root causes, and closing the loop with upstream fixes
- Partner directly with Finance, Operations, and Network/Infrastructure teams to translate business questions into scalable analytical solutions, not one-off reports
- Build and document BI development standards — reusable templates, naming conventions, and review processes that make the platform easier to maintain as the team grows
- Support ad hoc analytical requests with the judgment to distinguish what should be self-service, what warrants a new dashboard, and what's a deeper analytical problem worth solving properly
- Actively contribute to the evolution of our Snowflake gold layer — as KPI definitions change and reporting capabilities expand based on stakeholder feedback, you'll be responsible for updating and extending transformation logic, not just consuming it; this means being comfortable reading existing model structure, making targeted changes, and documenting what you've built
- Collaborate closely with Data Engineering on pipeline outputs and Snowflake layer structure so that reporting surfaces are built on a stable, well-understood foundation
- Contribute to early-stage data science initiatives as the platform matures — anomaly detection, forecasting, and operational ML use cases are on the roadmap
- Manage your work in agile scrum using Jira — maintaining a well-groomed backlog, participating in sprint planning and retrospectives, and keeping your work visible and prioritized without needing to be chased
Required Qualifications:
- 5–7 years of hands-on BI development experience; you've been the go-to person in a data or analytics function — not managing a team, but the senior individual contributor others relied on
- 3 years with Tableau as a primary tool, including ownership of a Tableau Server or Cloud environment — publishing to someone else's server doesn't count here
- Deep SQL proficiency in Snowflake specifically — writing complex queries is table stakes; we're looking for demonstrated experience with query optimization, virtual warehouse sizing, clustering keys, and materialization strategies
- Hands-on experience writing or maintaining transformation logic in a layered warehouse model (bronze/silver/gold or equivalent) — you understand how business logic gets encoded in a transformation layer and you're comfortable extending it, not just reading it
- Experience with data governance concepts in practice — not just theory: data cataloging, lineage, access controls, and documentation workflows
- Demonstrated ability to work with business stakeholders to gather requirements, push back on scope, and deliver analytical work that actually gets used
- Comfortable operating as a senior individual contributor in a lean environment — you set the standard, you don't wait for one
Preferred / Nice-to-Have Qualifications:
- Experience with Secoda or a comparable data catalog tool (Alation, Atlan, Collibra)
- Exposure to Sigma Computing as a self-service analytics layer
- Experience in telecom, ISP, or infrastructure-heavy industries — understanding subscriber metrics, network KPIs, or financial reporting in a capital-intensive business is a plus
- Foundational knowledge of Python for analytical work — not a requirement on day one, but a signal of growth trajectory toward data science
- Hands-on experience with generative AI tools and frontier models (OpenAI, Anthropic, Google, etc.) — we're not looking for a researcher, but someone who has experimented with embedding these capabilities into analytical workflows and is genuinely enthusiastic about where the technology is headed
- Familiarity with Git/GitLab for version-controlling BI assets
- Comfortable working in agile scrum — experience with Jira is preferred
Growth Opportunity
This role is designed to evolve with you and with the platform. In the near term, the priority is stabilizing and scaling our reporting layer and governance program — but that's the foundation, not the ceiling.
As the data platform matures, there is a deliberate and supported path into applied data science work. That means contributing to forecasting models, building anomaly detection for operational KPIs, and eventually helping shape an ML infrastructure layer as the business scales. You won't have to make a case to move in that direction — it's already on the roadmap, and the expectation is that this role grows into it.
Beyond the technical trajectory, there's real organizational scope here. The standards, governance frameworks, and BI development practices you establish will define how this data function operates for years. As the team expands, you'll have had a direct hand in building what it's built on — and the influence that comes with that won't quietly disappear when new people join.
NOTE:
Wecom (and its subsidiaries) complies with all Equal Employment Opportunity (EEO) laws. Wecom does not discriminate on the basis of age, race, sex, religion, color, national origin, marital status, genetic information, sexual orientation, gender identity and expression, disability, veteran status, pregnancy status, or any other status protected by law.
Company benefits subject to change. www.wecomfiber.com
Salary: $130,000 - $140,000