What are the responsibilities and job description for the Cloud Data Architect with AI/ML position at Sligo Software Solutions Inc.?
Greetings from Sligo Software Solutions, Inc. !!
This is Dasharatham from SLIGO. We have an urgent requirement for a Cloud Data Architect with AI/ML position with our client.
Title : Cloud Data Architect with AI/ML
Location : Albany, New York
Duration : 30 Months
Client : NY State - Department of Labor
Job ID: SLG - 19 - 11370
Respond by: 11/20/25 5:00 PM
Description
This position will assist in supporting critical DOL operational functions, including the development, support, and optimization of data-related functions within the new NYS Unemployment Insurance System.
Duties
- Review overnight alerts and monitoring dashboards for any critical issues across all systems (DB2, Oracle, Cognos, Datastage).
- Check for any new service requests or incidents related to data management.
- Prioritize tasks based on urgency and impact.
- Respond to emails and messages related to data management tasks.
- Participate in daily stand-up meetings (if applicable) to provide updates on progress and any roadblocks.
- Document all work performed, including changes made, issues encountered, and resolutions.
- Update knowledge base articles as needed.
- Monitor DB2 database performance metrics (CPU usage, memory usage, disk I/O, buffer pool hit ratios).
- Check for any database errors or warnings in the DB2 diagnostic logs.
- Monitor database backup and recovery processes.
- Perform routine database maintenance tasks (e.g., running RUNSTATS, REORGCHK, REORG).
- Apply database patches and upgrades as needed (following change management procedures).
- Review DB2 security settings and user access privileges.
- Monitor for any unauthorized access attempts.
- Update data models based on changes to business requirements or system implementations.
- Ensure data models are consistent and accurate.
- Maintain version control of data models.
- Work with business analysts and developers to understand data requirements.
- Review data models with stakeholders to ensure they meet business needs.
- Monitor the performance of Cognos reports and dashboards.
- Investigate and resolve any report errors or performance issues.
- Manage Cognos content (reports, dashboards, data modules).
- Ensure content is organized and accessible to users.
- Provide support to Cognos users, answering questions and troubleshooting issues.
- Monitor the execution of ETL (Extract, Transform, Load) processes that load data into the Oracle data warehouse.
- Investigate and resolve any ETL errors or performance issues.
- Monitor data quality in the data warehouse.
- Identify and correct data quality issues.
- Tune Oracle database performance to optimize query response times.
- Monitor the execution of DataStage jobs.
- Investigate and resolve any job failures or performance issues.
- Modify and update DataStage jobs as needed to meet changing data integration requirements.
- Review and update metadata in the Knowledge Catalog.
- Monitor data lineage and impact analysis.
- Enforce data governance policies.
- Address any data quality or compliance issues identified.
- Monitor data pipelines loading data into Snowflake.
- Troubleshoot any pipeline failures or performance issues.
- Optimize Snowflake queries and data structures for performance.
- Monitor the performance of deployed AI models.
- Work with data scientists to address any model degradation or accuracy issues.
- Prepare data for AI model training and deployment.
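The DB2 monitoring duties above can be sketched in a short script. The buffer pool hit ratio formula is the standard one ((logical − physical) / logical × 100); the pool names, counter values, and the 95% alert threshold are illustrative assumptions — in a live system the counters would come from DB2's monitoring table functions rather than hard-coded values:

```python
# Sketch: flagging DB2 buffer pools with a low hit ratio.
# Counter values here are hypothetical; in practice they would be
# read from DB2 monitoring views (e.g., via the MON_GET_BUFFERPOOL
# table function), not hard-coded.

def buffer_pool_hit_ratio(logical_reads: int, physical_reads: int) -> float:
    """Percentage of page requests served from the buffer pool.

    Standard formula: (logical - physical) / logical * 100.
    """
    if logical_reads == 0:
        return 100.0  # no activity; treat as healthy
    return (logical_reads - physical_reads) / logical_reads * 100.0

def flag_low_ratios(pools: dict, threshold: float = 95.0) -> list:
    """Return names of buffer pools whose hit ratio falls below the threshold.

    `pools` maps pool name -> (logical_reads, physical_reads).
    The 95% threshold is an assumed alerting policy, not a DB2 default.
    """
    return [
        name
        for name, (logical, physical) in pools.items()
        if buffer_pool_hit_ratio(logical, physical) < threshold
    ]

if __name__ == "__main__":
    # Hypothetical counter snapshot, one entry per buffer pool.
    snapshot = {
        "IBMDEFAULTBP": (1_000_000, 20_000),  # 98.0% -> healthy
        "BP_REPORTING": (500_000, 60_000),    # 88.0% -> flagged
    }
    print(flag_low_ratios(snapshot))  # ['BP_REPORTING']
```

A daily check like this would feed the "review overnight alerts" step, surfacing only the pools that need attention.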
Requested Qualifications (15 Years of Experience)
- Determines database structural requirements by analyzing client operations, applications, and programming; reviews objectives with clients; evaluates current systems; provides database support by coding utilities, responding to user questions, and resolving problems.
- This is a 100% onsite position.
- 72 Months DB2, Oracle, Google Cloud Database Administration: experience administering databases on bare-metal Linux environments, including installation, configuration, performance tuning, backup/recovery, and security management.
- 84 Months Red Hat Linux Proficiency: experience in Red Hat Linux operating systems, including command-line administration, scripting, and system troubleshooting.
- 84 Months Data Modeling Experience: experience with data modeling techniques and tools, specifically Erwin Data Modeler, for designing and maintaining OLTP systems.
- 84 Months IBM Cognos Analytics Skills: Experience in developing, deploying, and maintaining reports and dashboards using IBM Cognos Analytics. This includes knowledge of Cognos Framework Manager, Data Modules, Report Studio, and related components.
- 60 Months Oracle Data Warehousing Knowledge: experience in data warehousing concepts on Oracle, DB2, or BigQuery platforms.
- 60 Months Data Integration Expertise: experience with data integration tools and techniques, particularly IBM DataStage, for designing, developing, and executing ETL processes.
- 60 Months IBM Cloud Pak for Data / Knowledge Catalog Experience: experience implementing, managing, and utilizing IBM Cloud Pak for Data, specifically the Knowledge Catalog, for metadata management, data governance, and data lineage.
- 84 Months Snowflake Data Warehouse Experience: Experience with Snowflake Data Warehouse, including data loading, performance optimization, and query development.
- 84 Months AI/ML: experience with Artificial Intelligence (AI) and Machine Learning (ML) concepts, including data preparation, model deployment, and model monitoring.
- 84 Months Communication and Collaboration: experience collaborating with business analysts, developers, data scientists, and other stakeholders.
Required Documents
- Résumé
- Copy of Candidate Identification (i.e., Driver’s License/Green Card/Visa and Passport if applicable)
- Any documents referenced in the above Requested Qualifications (i.e., professional certifications, degrees, etc.)
Katta Dasharatham
Recruiting Team Lead
Sligo Software Solutions Inc.
dasharatham.katta@sligosoft.com
www.sligosoft.com
Direct: (518) 818-6239
Office: (518) 444 0459 x 116
Fax : (518) 444 0457
https://www.linkedin.com/in/dasharatham/
**Due to my back-to-back calls, there are times when I might miss your calls. Email is the best way to reach me**
Salary : $78