What are the responsibilities and job description for the QA Automation+Performance testing Engineer-12+ Years-Chicago, IL (Day 1 Onsite) position at iMedhas Consulting Services?
Title: QA Automation & Performance Testing Engineer
Location: Chicago, IL (Day 1 Onsite)
Duration: 6 Months
Note: Candidates must have 12+ years of experience.
Job Description:
- Hands-on experience in software test automation and performance engineering (or equivalent), covering AWS and on-prem platforms.
- Strong coding skills in one or more: Java, Python, JavaScript, Groovy, TypeScript
- Hands-on experience with testing tools: Cypress, Selenium, JMeter, Blazemeter or equivalent tools
- Working proficiency with development toolsets and CI/CD: Jenkins, GitHub, Bitbucket, IntelliJ IDEA, VS Code
- Familiarity with distributed services, microservices architecture in AWS; Kubernetes; databases such as Oracle and MongoDB
- Experience with observability tooling and metrics: Dynatrace, Geneos, Splunk, Grafana; ability to interpret system/resource metrics and application telemetry.
- Familiar with Agile Methodology; ability to work independently with strong analytical, communication, and documentation skills; ability to influence cross-functional teams
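As a concrete illustration of the metric-interpretation skill listed above, here is a minimal Python sketch that derives two headline signals (p95 latency and error rate) from raw request telemetry. The sample data and the nearest-rank percentile method are illustrative assumptions, not requirements from this posting; in practice these figures would come from tools such as Splunk, Dynatrace, or Grafana.

```python
# Minimal sketch: deriving p95 latency and error rate from raw request
# telemetry. Sample data is hypothetical; real inputs would come from
# observability tooling (Splunk, Dynatrace, Grafana, etc.).

def percentile(values, pct):
    """Nearest-rank percentile over a sorted copy of `values`."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Each sample: (latency in milliseconds, HTTP status code)
samples = [(120, 200), (95, 200), (310, 500), (150, 200),
           (88, 200), (400, 200), (102, 200), (275, 503)]

latencies = [ms for ms, _ in samples]
errors = sum(1 for _, status in samples if status >= 500)

p95_ms = percentile(latencies, 95)
error_rate = errors / len(samples)

print(f"p95 latency: {p95_ms} ms")      # -> p95 latency: 400 ms
print(f"error rate: {error_rate:.1%}")  # -> error rate: 25.0%
```

In a real engagement these numbers would be compared against agreed SLOs rather than printed, and the percentile would typically come from the monitoring backend itself.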
Preferred qualifications, capabilities, and skills
- Experience in financial services; domain knowledge in Receivables or Lockbox
- Performance tuning across JVM/.NET, databases, and front-end assets
- Experience leveraging GenAI to accelerate test creation, analysis, or documentation.
Responsibilities:
- Define and own the automation strategy and roadmap; design, build, and maintain robust test automation frameworks; implement reliable automated tests for UI, API, microservices, and data workflows.
- Integrate automated tests into CI/CD; establish quality gates, automated reporting, and release readiness criteria.
- Lead test strategy and coverage for functional and non-functional requirements; set and enforce standards for test data management and environment readiness to ensure deterministic, reproducible runs.
- Instrument systems and build observability dashboards to capture SLIs/SLOs, latency, throughput, error rates, and resource utilization; integrate these signals with test reports.
- Own the performance testing lifecycle end-to-end: requirements, workload modeling, environment setup, and execution (load and stress).
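The workload-modeling step in the performance testing lifecycle above can be sketched with Little's Law (concurrency N = throughput X × time in system R). The target figures below are hypothetical examples, not numbers from this posting:

```python
# Minimal workload-modeling sketch using Little's Law: the number of
# concurrent virtual users (N) equals target throughput (X) times
# time in system (R = average response time + user think time).
# All target figures are hypothetical, not requirements from this role.

def required_virtual_users(target_tps, avg_response_s, think_time_s):
    """Virtual users needed to sustain `target_tps` transactions/second."""
    time_in_system = avg_response_s + think_time_s
    return target_tps * time_in_system

# Example: 50 tx/s target, 0.8 s average response, 5 s think time
vusers = required_virtual_users(target_tps=50, avg_response_s=0.8, think_time_s=5.0)
print(f"Configure the load tool (e.g. a JMeter thread group) with ~{vusers:.0f} virtual users")
```

A result like this would seed the thread-group sizing in JMeter or BlazeMeter, then be refined empirically as response times shift under load.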