What are the responsibilities and job description for the PowerBI Developer position at Jobs via Dice?
The Senior Power BI Developer (with a data engineering skill set) will play a key role in designing, developing, and optimizing data models, metrics, and reports that drive data-informed decision-making across the organization. This role combines strong Power BI visualization expertise with advanced data engineering skills in Databricks, ensuring high-quality, scalable data products for both enterprise reporting and self-service analytics use cases.
LOGISTICS:
- Candidate must physically work at the Store Support Center, located in Coppell, TX, Monday through Thursday; Friday is WFH until further notice.
- If the candidate opts to relocate to the DFW area, relocation assistance will NOT be provided
RESPONSIBILITIES:
Power BI/Fabric Reporting & Visualization
- Design, develop, and maintain interactive Power BI dashboards and reports for various business domains (finance, operations, marketing, etc.).
- Leverage Microsoft Fabric for end-to-end data architecture, integrating Lakehouse, Dataflows, and Direct Lake models to power advanced Power BI analytics.
- Create semantic models and DAX measures that are intuitive and optimized for performance.
- Collaborate with business users to translate requirements into visual insights and define KPI frameworks.
- Implement row-level security (RLS) and governance standards within Power BI workspace environments.
- Support self-service BI adoption by developing certified datasets and user-friendly data models.
Databricks Data Engineering
- Design and build data pipelines and curated layers (Bronze, Silver, Gold) within Databricks using Unity Catalog, PySpark, SQL, and Delta Lake (a PySpark sketch follows this group of bullets).
- Develop metric layers and data marts in Databricks to support Power BI and other analytical tools.
- Ensure data quality, consistency, and accuracy through robust data validation and monitoring processes.
- Implement performance tuning and cost optimization best practices for Databricks jobs and clusters.
- Collaborate with the data governance and architecture teams to align data models with enterprise standards.
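A minimal PySpark sketch of the Bronze/Silver/Gold pattern referenced above. The table names, paths, and columns (raw order data under /mnt/raw/orders/, a bronze.orders table, and so on) are illustrative assumptions, not details from the posting.

```python
# Medallion-pipeline sketch; every name and path here is hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw files as-is, tagging each row with its ingestion time.
bronze = (spark.read.json("/mnt/raw/orders/")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: clean and conform -- deduplicate, enforce types, drop bad rows.
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("order_total") >= 0))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregates that Power BI models consume directly.
gold = (spark.table("silver.orders")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("order_total").alias("daily_revenue"),
             F.countDistinct("order_id").alias("order_count")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_sales")
```

Registering each layer as a catalog table rather than writing to bare paths is what keeps the Gold layer discoverable and governable for downstream Power BI datasets.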
Data Quality & Test Automation
- Develop and maintain automated testing frameworks for data validation using Python, PySpark, or Databricks notebooks (see the validation sketch after these bullets).
- Integrate automated tests into CI/CD pipelines (Azure DevOps, GitHub Actions, etc.).
- Build reusable data validation scripts leveraging Databricks and other data testing tools.
- Work closely with data engineers, data modelers, BI developers, and governance teams to define testing standards.
- Participate in code reviews and data quality reviews to ensure compliance with QA standards.
- Document test cases, results, and maintain traceability between business requirements and validation outcomes.
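A sketch of the kind of reusable validation script these bullets describe, written as plain PySpark so it could run in a Databricks notebook or a pytest step. The table, columns, and helper functions (check_no_nulls, check_unique_key) are assumptions for illustration.

```python
# Reusable data-validation sketch; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()


def check_no_nulls(table: str, cols: list[str]) -> list[str]:
    """Return one failure message per listed column that contains nulls."""
    df = spark.table(table)
    return [f"{table}.{c} has null values"
            for c in cols
            if df.filter(F.col(c).isNull()).limit(1).count() > 0]


def check_unique_key(table: str, key: str) -> list[str]:
    """Return a failure message if the key column has duplicate values."""
    dupes = (spark.table(table)
             .groupBy(key).count()
             .filter(F.col("count") > 1)
             .limit(1).count())
    return [f"{table}.{key} is not unique"] if dupes else []


# Collect every failure before failing, so one run reports all problems.
failures = (check_no_nulls("silver.orders", ["order_id", "order_ts"])
            + check_unique_key("silver.orders", "order_id"))
assert not failures, "; ".join(failures)
```

Wrapped in a notebook or pytest module, the final assertion is what lets an Azure DevOps or GitHub Actions pipeline fail the deployment when validation does not pass.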
Collaboration & Leadership
- Partner with business analysts, product owners, and stakeholders to identify new reporting opportunities.
- Work closely with data governance, QA, and DevOps teams to ensure compliant, secure, and automated delivery.
- Provide technical leadership and mentorship to junior engineers on best practices for BI and data engineering.
- Support the development of metadata-driven frameworks and data catalogs to enable discoverability and reusability (a brief sketch follows).
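One way a metadata-driven ingestion framework can look: a single generic loop driven by a config list instead of one notebook per source. The sources, formats, and target tables below are invented for illustration; in practice the metadata would typically live in a control table or YAML file.

```python
# Metadata-driven ingestion sketch; all sources and targets are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

SOURCES = [
    {"path": "/mnt/raw/orders/",    "format": "json", "target": "bronze.orders"},
    {"path": "/mnt/raw/customers/", "format": "csv",  "target": "bronze.customers"},
]

for src in SOURCES:
    (spark.read.format(src["format"])
          .option("header", "true")   # used by the CSV reader, ignored by JSON
          .load(src["path"])
          .write.format("delta")
          .mode("append")
          .saveAsTable(src["target"]))
```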
REQUIREMENTS:
- Strong expertise in Power BI development.
- Hands-on experience working with cloud platforms (AWS, Azure, or Google Cloud).
- Experience with CI/CD tools like Azure DevOps, GitHub Actions, or Terraform for infrastructure-as-code.
- Familiarity with data modeling, data warehousing, and database design.
- Strong understanding of data formats such as Parquet, ORC, JSON, and Avro.
- Ability to work in an agile environment and adapt to changing requirements.
- Strong analytical and problem-solving skills with a focus on optimizing data engineering solutions.
- Study, analyze, and understand business requirements in the context of business intelligence.
- Identify key performance indicators aligned with appropriate business objectives.
- Make essential technical and strategic changes to improve existing business intelligence systems.
- Write and tune SQL queries to deliver better results.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8 years of experience in data engineering and BI development.
- Strong hands-on experience with:
- Databricks (PySpark, Delta Lake, SQL)
- Power BI (DAX, Power Query, Data Modeling)
- Azure Data Services (Data Lake, Synapse, ADF preferred)
- Proven ability to develop and maintain metric frameworks for enterprise reporting.
- Experience implementing CI/CD pipelines for Power BI and Databricks deployments (nice to have).
- Strong understanding of data modeling, ETL design, and data architecture principles.
- Excellent analytical thinking, problem-solving, and stakeholder communication skills.
- Creation of a reusable, well-governed metric layer in Databricks for enterprise use.
- Improved data reliability, transparency, and self-service enablement across teams.
THE GOODS:
- Minimum of 8 years of experience working on data platforms.
- Strong SQL skills for data validation and test case development.
- Hands-on experience with Databricks, Spark, and Delta Lake environments.
- Experience testing data pipelines and transformations in Azure or cloud-based data platforms.
- Knowledge of database management, SQL querying, data modeling, and Online Analytical Processing (OLAP).
- Additional consideration given for experience with scripting and programming languages such as Python.
- Ability to mentor junior team members and contribute to technical leadership.