Capio Group
is looking for an experienced
Software Engineer/Tester!
Full‑time employee - Sacramento
Salary: $125,000 - $135,000
About Us Capio Group is a California‑based Information Technology Consulting firm serving the public sector since 2010. We assist the Government in delivering large, complex systems and solutions. Capio Group is a small, but quickly growing firm that mixes good ideas with great people to achieve extraordinary results for our clients. We offer the salary and benefits of the bigger companies, with the added bonus of a flexible workplace and a great work‑life balance.
Scope of Work Capio Group is looking for an experienced and qualified Software Engineer/Tester to provide testing expertise in usability, security (vulnerability and penetration), regression, user acceptance, role‑based, defect, functional, and Selenium‑based automation testing. The Software Engineer/Tester will be required to research, create, and execute test plans, test case scenarios, and test scripts; ensure the quality of the test artifacts created in support of the client’s quality assurance efforts; and comply with the client’s project validation and testing metrics, methods, and best practices. Additionally, the Software Engineer/Tester will contribute to the overall quality of the project by providing clear documentation and effective knowledge transfer to client personnel.
Task Management
Monthly Status Report (MSR): Submit written monthly status reports using the MSR template provided. Include an overview of staff tasks, activities, schedule adherence, issues, and planned work. Due the 15th calendar day of each month following the first full month of service.
Weekly Status Reports (Written): At the client’s request, submit weekly written status reports that include updates to project schedule tasks.
Final Status Report: Develop and submit a Final Status Report, due thirty (30) calendar days prior to the end of the Agreement. Include a summary of SOW tasks and activities, deliverables, milestone accomplishments, lessons learned, and actual versus planned agreement expenditures.
Communications
Communicate, both in writing and orally, with client staff, management, other consultant staff, sponsors, System Integrator staff, interface partners, and county staff.
Communicate to the client’s management on critical issues that arise during testing and escalate issues and risks as required.
Monitoring and Control
Work collaboratively with client staff and System Integrator testing teams to maintain product and solution quality by adhering to and improving quality frameworks, methods, and practices.
Ensure the client’s User Acceptance Testing (UAT) is consistent with the client’s Test Plans.
Participate in and attend all meetings as required.
Support all activities associated with quality assurance testing.
Assist in providing oversight to System Integrator testing activities.
Perform test validation and support Sponsor testing activities.
Testing Operation and Maintenance (O&M) Phase
Review the System Integrator’s test plans, scenarios, test cases, and test scripts to ensure adherence to requirements and design artifacts.
Review and execute user acceptance test cases and test scripts aligned to functional and non‑functional requirements.
Conduct testing efforts including disaster recovery testing, advocate testing, and release cut‑over validation.
Identify and document defects from user acceptance testing.
Knowledge Transfer and Lessons Learned
Develop a Knowledge Transfer Plan and Lessons Learned Report. Capture, document, track, and report lessons learned and parking lot items. Due sixty (60) days prior to end of Agreement.
Engage in and complete knowledge transfer training of testing processes and activities to test staff. Provide documentation of all required test processes and activities. Deadline: sixty (60) days prior to end of Agreement term.
Mandatory Experience
A minimum of three (3) years of experience with all phases of the Software Development Life Cycle (SDLC), including requirements, systems design, development, and testing, using both waterfall and agile methodologies for complex enterprise systems.
A minimum of three (3) years of experience with System test/User Acceptance Testing (UAT) for large‑scale Eligibility and Enrollment government IT projects.
A minimum of five (5) years of experience in developing scenario roadmaps; validating, reviewing, and writing scripts from requirements; and implementing functionality for large‑scale government IT projects.
A minimum of three (3) years of experience with querying data from Aurora Postgres.
A minimum of three (3) years of experience testing high‑volume (more than 10,000 transactions per day), scalable web applications across multiple SDLCs and contemporary QA processes and automated tools.
A minimum of three (3) years of experience using testing tools such as SoapUI, React, Jenkins, Gitlab, Selenium, and an Application Lifecycle Management (ALM) or comparable system.
A minimum of five (5) years of total experience testing conversion and Java web‑based applications, including integration with platforms such as ForgeRock, AWS, and Aurora Postgres.
A minimum of three (3) years of experience identifying and documenting test results and discrepancies between approved design documents and application performance.
A minimum of three (3) years of experience testing role‑based security, batch jobs, applications on various Internet browsers (IE, Firefox, Chrome, etc.), operating systems (Windows, Mac, etc.) and mobile devices.
A minimum of three (3) years of experience testing applications with multiple stakeholders and customers with varied business priorities and levels of experience.
A minimum of five (5) years of experience developing and testing with an automation framework using data‑driven or keyword‑driven methodologies.
A minimum of five (5) years of experience using Selenium for automation/regression testing.
A minimum of two (2) years of experience testing with Salesforce.
Provide client/end‑user reference contact information, upon request, for each applicable project meeting the requirements.
Equal Opportunity Employer At Capio Group, our employees are our greatest asset and diversity, equity, and inclusion are at the core of who we are. Our commitment to these values is unyielding and is central to our mission and to our impact. We know that having diverse perspectives helps to generate better ideas to solve the complex problems of our diverse clients and the communities they serve.