What are the responsibilities and job description for the Database Implementation Support Services position at Tech Army?
The Public Service Department ("Department") collects extensive energy data, but current processes are complex, manual, and inefficient, with no centralized system for management. Over the last several years, the Department has been working to develop and implement a solution to streamline and, where possible, automate processes supporting better internal management and external sharing of energy-related data. This has included working with an Agency of Digital Services ("ADS") business analyst to refine the business case for this project and propose the architecture for a solution. Identified business requirements include:
" Developing a robust data collection and management infrastructure, reducing Department staff hours on data processing and management
" Creating or using existing State of Vermont data portals to share energy data
" Reducing data reporting burden for key partners (i.e., utilities)
" Leveraging existing State of Vermont infrastructure, when it makes sense from a feature and cost perspective
The ADS-proposed solution includes one or more databases (e.g., an operational database and a reporting database) to support the storage and querying of a variety of datapoints. A well-designed SQL database integrated with an analytics platform (Power BI) would enable the Department to automate data collection, ETL, and routine analysis, allowing staff to focus on deeper analysis and regulatory compliance. Further data engineering support would allow the Department to automate manual processes and to integrate data feeds from external sources, replacing manual collection efforts where possible.
A separate SOW RFP for penetration testing services for the proposed database solution is currently in active procurement.
The existing technology environment includes the following:
1. Data Collection
a. Email: Most data collection is currently done by email. Obligated entities share spreadsheets by email in response to requests made by the Department.
b. Manual Download: Third-party data is often manually downloaded from the web by staff members and converted into Excel spreadsheets.
c. API: There are limited instances where data is collected via API integration with Power BI.
2. Data Storage
a. SharePoint: Most data assets are stored in Excel files in SharePoint.
b. Local Storage: Some data assets are stored in Excel files on local hard drives.
3. ETL/Analytics
a. Most analysis is conducted manually in Excel workbooks.
b. Some ETL and analysis is done in Power BI / Power Query.
4. Sharing
a. Most data is shared manually via email in response to a direct request or is manually uploaded to ePUC (the Public Utility Commission's case management platform).
b. Analysis and visualizations are often embedded in PowerPoint presentations or Word documents, which are then posted to the Department's website or uploaded to ePUC.
REQUIREMENTS:
Knowledge Requirements
1. Azure SQL DB
The vendor must demonstrate strong experience in designing, implementing, and maintaining Azure SQL Databases. This includes expertise in database architecture, performance tuning, indexing strategies, security configurations, backup and restore operations, and managing high availability and disaster recovery setups. Familiarity with scaling databases to meet varying workloads and implementing monitoring and alerting solutions is required.
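For illustration only, the following minimal Python sketch shows the kind of connectivity and monitoring work this requirement anticipates; the server, database, and authentication values are placeholders, not the Department's actual environment.

```python
# Minimal sketch: connect to an Azure SQL Database with pyodbc and pull a few
# health metrics. Server, database, and credential values are placeholders.
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:example-server.database.windows.net,1433;"  # placeholder server
    "Database=energy_data;"                                  # placeholder database
    "Authentication=ActiveDirectoryInteractive;"             # or SQL auth / managed identity
    "Encrypt=yes;TrustServerCertificate=no;"
)

def check_database_health() -> None:
    """Run simple monitoring queries an operator might schedule as an alert source."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Current service tier / service objective of the database.
        cur.execute("SELECT DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective')")
        print("Service objective:", cur.fetchone()[0])
        # Rough storage footprint by table, useful for capacity planning.
        cur.execute("""
            SELECT t.name, SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
            FROM sys.dm_db_partition_stats ps
            JOIN sys.tables t ON t.object_id = ps.object_id
            GROUP BY t.name
            ORDER BY reserved_mb DESC
        """)
        for name, mb in cur.fetchall():
            print(f"{name}: {mb} MB")

if __name__ == "__main__":
    check_database_health()
```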
2. ETL Processing with Python in Data Factory
The vendor must have hands-on experience in building, deploying, and managing ETL pipelines using Azure Data Factory with Python-based transformations. This includes extracting data from multiple sources, transforming it according to business rules, and loading it into target systems efficiently. Knowledge of orchestrating pipelines, scheduling, error handling, logging, and implementing reusable and maintainable Python scripts is essential.
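As a hedged illustration of the reusable, logged transformation scripts this requirement refers to, the sketch below shows one plausible shape for a Python step that a Data Factory pipeline could invoke; the file paths, column names, and business rules are assumptions for the example only.

```python
# Minimal sketch of a reusable, logged transformation step of the kind a Data
# Factory pipeline might invoke (e.g., via a custom/batch activity). Input and
# output paths, column names, and business rules here are illustrative only.
import logging
import sys

import pandas as pd

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("energy_etl")

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply illustrative business rules: normalize column names, drop blank rows,
    and coerce the (assumed) reporting-period column to a real date."""
    df = raw.rename(columns=lambda c: c.strip().lower().replace(" ", "_"))
    df = df.dropna(how="all")
    df["reporting_period"] = pd.to_datetime(df["reporting_period"], errors="coerce")
    return df

def run(input_path: str, output_path: str) -> int:
    try:
        raw = pd.read_csv(input_path)
        log.info("Extracted %d rows from %s", len(raw), input_path)
        cleaned = transform(raw)
        cleaned.to_csv(output_path, index=False)  # the real load step would target Azure SQL
        log.info("Wrote %d rows to %s", len(cleaned), output_path)
        return 0
    except Exception:
        log.exception("ETL step failed")          # surfaced to the pipeline as a non-zero exit
        return 1

if __name__ == "__main__":
    sys.exit(run(sys.argv[1], sys.argv[2]))
```

Scheduling and file-arrival triggers would normally be configured in Data Factory itself (schedule or storage-event triggers) rather than inside the script.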
3. Azure Blob Storage
The vendor must have practical knowledge of Azure Blob Storage for storing, managing, and retrieving large datasets. This includes configuring containers, managing access policies, implementing security measures, optimizing storage for performance and cost, and integrating Blob Storage with other Azure services. Experience with automated data movement, lifecycle management, and data archiving is preferred.
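The following minimal sketch, using the azure-storage-blob SDK, illustrates the basic container and blob operations involved; the storage account URL and container name are placeholders, and lifecycle management (tiering and archiving) would typically be configured as a storage-account policy rather than in code.

```python
# Minimal sketch using the azure-storage-blob SDK: create a container, upload a
# data file, and list what is stored. The account URL and container name are
# placeholders; real deployments would typically use managed identity credentials.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://exampleaccount.blob.core.windows.net"  # placeholder account
CONTAINER = "utility-submissions"                              # placeholder container

def upload_submission(local_path: str, blob_name: str) -> None:
    service = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
    container = service.get_container_client(CONTAINER)
    if not container.exists():
        container.create_container()
    with open(local_path, "rb") as fh:
        container.upload_blob(name=blob_name, data=fh, overwrite=True)

def list_submissions() -> None:
    service = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
    container = service.get_container_client(CONTAINER)
    for blob in container.list_blobs():
        print(blob.name, blob.size)

if __name__ == "__main__":
    upload_submission("2024_q4_report.xlsx", "raw/2024_q4_report.xlsx")
    list_submissions()
```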
Functional Requirements
1. Requirements Elaboration
o Working with PSD and ADS staff, refine functional (business) and non-functional (technical) requirements, culminating in a requirements repository.
o Working with PSD and ADS staff, establish minimum performance expectations and success criteria for the end product.
2. Documentation & Training
o Deliver schema design and database structure documentation.
o Provide security configurations and access policies.
o Train VT PSD personnel on database usage and Power BI integration.
Non-Functional Requirements
1. Data Management & Storage
o Support storage and retrieval of specified data types and sources.
o Handle expected data volumes efficiently.
o Ensure data integrity and accuracy during migration/import.
2. Database Schema & Design
o Implement an optimized, normalized relational database schema (an illustrative schema sketch follows this requirements list).
o Support relational integrity with primary and foreign keys.
o Include indexing for performance optimization.
o Allow for future scalability and extensibility.
3. Extract Transform Load (ETL) Processing
o Implement an optimized ETL solution that utilizes Python scripting to handle the transformation in Azure infrastructure.
o It must be possible to schedule or trigger ETL processing when new data files are received.
o ETL processing needs to be standardized across the data files and source systems.
o Provide ETL templates, based on the data sources, to support future expansion.
4. Integration Capabilities
o Enable integration with existing VT PSD systems.
o Ensure Azure SQL Database connectivity with Power BI.
o Provide necessary data pipelines or views for Power BI (the schema sketch following this list includes an example reporting view).
o Enable API connections with current and future utility data systems, including NEPOOL GIS, ISO-NE, and VELCO's VX platform.
5. Security & Compliance
o Implement role-based access control (RBAC); an illustrative configuration sketch appears after this list.
o Ensure data encryption at rest and in transit.
o Adhere to compliance requirements from VT PSD and VT ADS.
6. Testing & Validation
o Validate database integrity and security controls.
o Implement and validate a database maintenance plan.
7. Deployment & Support
o Deploy the database in the Azure SQL production environment.
o Provide post-implementation monitoring and issue resolution.
o Ensure database connectivity via VPN and from office locations.
8. Performance & Optimization
o Optimize database performance for querying and analytics.
o Ensure query performance under expected load conditions.
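For illustration of the schema and integration requirements above (items 2 and 4), the sketch below shows a hypothetical pair of normalized tables with primary and foreign keys, a supporting index, and a flattened view that Power BI could consume. All table, column, and view names are assumptions for the example, not the Department's actual schema.

```python
# Illustrative only: hypothetical tables showing the normalized PK/FK pattern,
# an index for common queries, and a view that Power BI could read directly.
# Table, column, and view names are assumptions, not the Department's schema.
import pyodbc

DDL = """
CREATE TABLE dbo.Utility (
    UtilityId      INT IDENTITY(1,1) PRIMARY KEY,
    UtilityName    NVARCHAR(200) NOT NULL UNIQUE
);

CREATE TABLE dbo.EnergySubmission (
    SubmissionId     BIGINT IDENTITY(1,1) PRIMARY KEY,
    UtilityId        INT NOT NULL
        REFERENCES dbo.Utility (UtilityId),          -- foreign key enforces integrity
    ReportingPeriod  DATE NOT NULL,
    MetricName       NVARCHAR(100) NOT NULL,
    MetricValue      DECIMAL(18, 4) NOT NULL,
    LoadedAtUtc      DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Index supporting the expected "metric by utility over time" queries.
CREATE INDEX IX_EnergySubmission_Utility_Period
    ON dbo.EnergySubmission (UtilityId, ReportingPeriod)
    INCLUDE (MetricName, MetricValue);

-- Flattened view intended for direct consumption by Power BI.
CREATE VIEW dbo.vw_EnergyByUtility AS
SELECT u.UtilityName, s.ReportingPeriod, s.MetricName, s.MetricValue
FROM dbo.EnergySubmission AS s
JOIN dbo.Utility AS u ON u.UtilityId = s.UtilityId;
"""

def apply_schema(conn_str: str) -> None:
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        for statement in DDL.split(";\n\n"):   # naive batch split; adequate for this sketch
            if statement.strip():
                conn.execute(statement)
```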
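Similarly, database-level roles and grants are one common way to realize the RBAC requirement (item 5). The role, group, and object names below are hypothetical, and the actual access model would follow VT PSD and ADS policy.

```python
# Hedged sketch of database-level RBAC: a read-only analyst role granted SELECT
# on the reporting view only. Role, group, and object names are hypothetical.
import pyodbc

RBAC_STATEMENTS = [
    "CREATE ROLE energy_analyst_reader;",
    "GRANT SELECT ON dbo.vw_EnergyByUtility TO energy_analyst_reader;",
    # In Azure SQL, members would typically be Entra ID (Azure AD) users or groups
    # created as contained database users, e.g.:
    # "CREATE USER [PSD-Analysts] FROM EXTERNAL PROVIDER;",
    # "ALTER ROLE energy_analyst_reader ADD MEMBER [PSD-Analysts];",
]

def apply_rbac(conn_str: str) -> None:
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        for stmt in RBAC_STATEMENTS:
            conn.execute(stmt)
```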
" Developing a robust data collection and management infrastructure, reducing Department staff hours on data processing and management
" Creating or using existing State of Vermont data portals to share energy data
" Reducing data reporting burden for key partners (i.e., utilities)
" Leveraging existing State of Vermont infrastructure, when it makes sense from a feature and cost perspective
The ADS proposed solution includes one or more database(s) (i.e., operational database, reporting database) to support the storage and querying of a variety of datapoints. A well-designed SQL database integrated with an analytics platform (PowerBI) would enable the Department to automate data collection, ETL, and routine analysis, allowing staff to focus on deeper analysis and regulatory compliance. Further data engineering support would allow us to automate manual processes and to integrate data feeds from external sources to replace manual collection efforts where possible.
A separate penetration testing services SOW RFP, for the proposed database solution, is currently in active procurement.
Existing technology environment includes the following:
1. Data Collection
a. Email: Most data collection is currently done by email. Obligated entities share spreadsheets by email in response to requests made by the department.
b. Manual Download: Third party data is often manually downloaded from the web by staff members and converted into Excel spreadsheets.
c. API: There are limited instances where data is collected via API integration with Power BI.
2. Data storage
a. SharePoint: Most data assets are stored in Excel files in SharePoint.
b. Local Storage: Some data assets are stored in Excel files on local hard drives.
3. ETL/Analytics
a. Most analysis is conducted manually in Excel workbooks.
b. Some ETL and analysis is done Power BI / Power Query
4. Sharing
a. Most data is shared manually via email in response to a direct request or is manually uploaded to ePUC (the Public Utility Commission's case management platform).
b. Analysis and visualizations are often imbedded in Power Point presentations or Word documents which are then posted to the Department's web site or are uploaded to ePUC
REQUIREMENTS:
Knowledge Requirements
1. Azure SQL DB
The vendor must demonstrate strong experience in designing, implementing, and maintaining Azure SQL Databases. This includes expertise in database architecture, performance tuning, indexing strategies, security configurations, backup and restore operations, and managing high availability and disaster recovery setups. Familiarity with scaling databases to meet varying workloads and implementing monitoring and alerting solutions is required.
2. ETL Processing with Python in Data Factory
The vendor must have hands-on experience in building, deploying, and managing ETL pipelines using Azure Data Factory, with Python-based transformations. This includes extracting data from multiple sources, transforming it according to business rules, and loading it into target systems efficiently. Knowledge of orchestrating
pipelines, scheduling, error handling, logging, and implementing reusable and maintainable Python scripts is essential.
3. Azure Blob Storage
The vendor must have practical knowledge of Azure Blob Storage for storing, managing, and retrieving large datasets. This includes configuring containers, managing access policies, implementing security measures, optimizing storage for performance and cost, and integrating Blob Storage with other Azure services. Experience with automated data movement, lifecycle management, and data archiving is preferred.
Functional Requirements
1. Requirements Elaboration
o Working with PSD and ADS staff, refine functional (business) and non-functional (technical) requirements, culminating in a requirements repository.
o Working with PSD and ADS staff, establish minimum performance expectations and success criteria for the end product.
2. Documentation & Training
o Deliver schema design and database structure documentation.
o Provide security configurations and access policies.
o Train VT PSD personnel on database usage and Power BI integration.
Non-Functional Requirements
1. Data Management & Storage
o Support storage and retrieval of specified data types and sources.
o Handle expected data volumes efficiently.
o Ensure data integrity and accuracy during migration/import.
2. Database Schema & Design
o Implement an optimized, normalized relational database schema.
o Support relational integrity with primary and foreign keys.
o Include indexing for performance optimization.
o Allow for future scalability and extensibility.
3. Extract Transform Load (ETL) Processing
o Implement an optimized ETL solution that utilizes Python scripting to handle the transformation in Azure infrastructure.
o ETL processing should be able to be scheduled when new data files are received.
o ETL processing needs be standardized between the data files and source systems.
o ETL Templates based on the data sources for future expansion.
4. Integration Capabilities
o Enable integration with existing VT PSD systems.
o Ensure Azure SQL Database connectivity with Power BI.
o Provide necessary data pipelines or views for Power BI.
o Enable API connections with current and future utility data systems, including, NEPOOL GIS, ISO-NE, and VELCO's VX platform.
5. Security & Compliance
o Implement role-based access control (RBAC).
o Ensure data encryption at rest and in transit.
o Adhere to compliance requirements from VT PSD and VT ADS.
6. Testing & Validation
o Validate database integrity and security controls.
o Implement and validate a database maintenance plan.
7. Deployment & Support
o Deploy the database in the Azure SQL production environment.
o Provide post-implementation monitoring and issue resolution.
o Ensure database connectivity via VPN and office locations.
8. Performance & Optimization
o Optimize database performance for querying and analytics.
o Ensure query performance under expected load conditions.