References

Our clients' satisfaction is very important to us.
Below you will find a list of references for solutions we have provided to some of our clients.

AfroCentric Helios IT


To whom it may concern: This letter serves as a recommendation of the Testing Services provided by Ubungcweti Management Services (Pty) Ltd to AfroCentric Helios IT (Pty) Ltd.


Customer Name:
AfroCentric Helios IT
Team Size:
12-15 resources
Project Duration:
01/02 to Current
Application:
Oracle - Medscheme's (Nexus System)
Type of Testing:
Requirements Testing, Integration Testing, Regression testing and Performance testing.
UMS Role in Testing:
Life Cycle Phases: Initiation, Planning, Analysis and Design, Execution Readiness, Closeout, Handover
High Level Responsibilities:
Assess if the system can support day-to-day business and user scenarios and ensure the system is sufficient and correct for business usage. Verify the system’s behaviour is consistent with the requirements.
Metrics per release:
Test cases executed, effort distribution, defect find rate, defect fix rate, etc.
Metrics daily:
The number of defects logged, the number of defects reviewed per day, Test Case Effectiveness (the effectiveness of the test cases and the stability of the software), the number of test cases executed per person per day, and the number of test cases derived per person per day.
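As an illustration only (not UMS's actual tooling), the daily metrics above can be sketched as a small helper. All names, and the Test Case Effectiveness formula used here (defects found per 100 executed cases), are assumptions:

```python
def daily_metrics(executed, derived, defects_logged, defects_reviewed, testers):
    """Summarise one day's testing activity for a team of `testers` people."""
    return {
        "defects_logged": defects_logged,
        "defects_reviewed": defects_reviewed,
        # Test Case Effectiveness: defects found per 100 executed cases --
        # a rough proxy for test-case quality and software stability.
        "test_case_effectiveness": round(100 * defects_logged / executed, 1) if executed else 0.0,
        "executed_per_person": round(executed / testers, 1),
        "derived_per_person": round(derived / testers, 1),
    }

m = daily_metrics(executed=90, derived=35, defects_logged=12,
                  defects_reviewed=10, testers=12)
print(m["test_case_effectiveness"])  # 13.3
print(m["derived_per_person"])       # 2.9
```

Tracking these numbers per day, as described above, makes trends (rising defect counts, falling execution rates) visible release over release.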
Acceptance Testing:
UMS is AfroCentric's (Helios IT) preferred Supplier for Development and Testing. They also provide IT Executive placements. As a result of this contract, UMS has placed Test Managers, Test Leads as well as Test Analysts to execute the following:
  • Analysis of Business Requirements
  • Creation of UAT test plan
  • Identify Test Scenarios
  • Create UAT Test Cases
  • Preparation of Test Data (production-like data)
  • Run the Test cases
  • Record the Results
  • Confirm business objectives
Lead Times:
UMS has proven that they are able to provide quality resources to AfroCentric with the shortest lead times possible. They have a number of resources on the bench who are able to start within a week, so turnaround times are fast and efficient. If specialised skills are required, this can take longer, but no more than a two-month period for permanent placements.
Quality of Resources:
UMS provided resources with the necessary skills applicable to the specific project and functional subject areas being contracted, including software testing experience and proficiency, and knowledge and experience of the specified test tools and the software applications under test.
UMS Account Model:
The UMS Sales and Services Model enables UMS to develop a strategic partnership with Helios IT. The Account Manager liaises once a month with the Line Manager to ensure that the client is receiving quality deliverables by conducting monthly KPI sessions. UMS Service Delivery Managers are on the floor weekly, ensuring that UMS consultants are delivering; they monitor staff performance and rectify issues.
Commitment of Resources:
UMS has always been committed to delivering quality resources with the shortest turnaround times. Resources are bound in using the UMS MOU document, ensuring that they are committed and available once the client is happy with the interviews.
UMS Flexibility of notice periods, etc:
UMS client and contract notice periods are aligned to the BCEA: one week for resources who have worked for less than six months, two weeks for resources who have worked longer than six months but under a year, and 30 days for resources who have worked over a year. UMS is flexible in replacing resources as long as sufficient notice is provided or adequate planning is carried out between the client and UMS.
Business Continuity:
Backup resources are readily available. Handover is managed by the UMS Sales Team with minimal impact to the business.
Projects:
The work breakdown structure and schedule are produced in conjunction with the client, at Test Analyst, Test Lead or Test Manager level.
Resource and Capacity Planning:
The Service Delivery Manager works with the client's team leads to ensure that resource capacity is measured adequately for projects. UMS is flexible in sharing resources based at the same client across projects to ensure maximum delivery.
Involvement:
Since UMS offers Analysis, Project Management and Testing resources to Helios IT, a number of the resources are involved in the planning and design phases of a number of project deliverables. Since the inception of Helios IT as the IT company of AfroCentric, UMS resources have been involved in a number of initiatives to define the Testing Areas PPPs.
Areas of Testing:
UMS resources have conducted the following types of testing for AfroCentric: Acceptance Testing, Functional Testing, Integration Testing and Post-Implementation Support.
Data Clean-up:
Quality Center is the tool that testers are using to manage test plans and test cases as well as test execution.
Process Improvements:
UMS Resources are actively involved in test process improvements and using different testing methodologies.
Reporting:
Monthly KPI reviews are conducted with staff, and performance is discussed on a monthly basis with clients. Resources provide weekly status reports to Service Delivery Managers.
Testing Services:
There have been no breaches to date regarding the testing services provided by UMS to AfroCentric.
Testing Information:
Quality Center is the toolset used for test planning and test cases.
Acceptance Testing:
Before User Acceptance Testing can begin, the application must be fully developed. Various levels of testing (Unit, Integration and System) are completed before User Acceptance Testing, so most of the technical bugs have already been fixed by the time UAT starts. Effective UAT test cases are then created. The following steps are carried out by the testing team:
  1. User Acceptance Test (UAT) Planning
  2. Designing UAT Test Cases
  3. Selecting a team to execute the UAT Test Cases
  4. Executing Test Cases
  5. Documenting the Defects found during UAT
  6. Resolving the issues/Bug Fixing
  7. Sign Off
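The numbered steps above can be sketched as a simple ordered checklist. The tracking structure below is purely illustrative, not a UMS tool:

```python
# The seven UAT steps from the text, in order; the tracker itself is a sketch.
UAT_STEPS = [
    "UAT Planning",
    "Designing UAT Test Cases",
    "Selecting the UAT Team",
    "Executing Test Cases",
    "Documenting Defects",
    "Resolving Issues / Bug Fixing",
    "Sign Off",
]

def next_step(completed):
    """Return the first step not yet completed, or None once UAT is signed off."""
    for step in UAT_STEPS:
        if step not in completed:
            return step
    return None

print(next_step({"UAT Planning"}))  # Designing UAT Test Cases
print(next_step(set(UAT_STEPS)))    # None
```

The ordering matters: defects are documented and resolved before sign-off, which gates the move to production.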
Testing Tools
Quality Center is used for all test planning.
Services
UMS resources are able to:
  • Perform complex functional, application, regression and performance tests
  • Design and maintain UAT test cases to be re-used for recurring and common implementations
  • Map business requirements into test cases and/or test scripts
  • Execute objectives and targets for the internal UAT team, establishing a baseline of results and milestones of achievement
  • Review implementation requirements and specifications, identifying all points of testing and expected outcomes
  • Document and report test results, and identify and track issues and defects
  • Assist in the parallel running of the current system to ensure the new product's results are validated against the existing application
  • Help analyse and improve test plans and results for each product
  • Work with implementation teams to ensure the quality of the product and timely bug resolution
  • Provide after-hours support during maintenance windows as needed
Core Functions
One of the most important activities in the UAT is to identify and develop test scenarios. These test scenarios are derived from the following documents:
  • Project Charter
  • Business Use Cases
  • Process Flow Diagrams
  • Business Requirements Document (BRD)
  • System Requirements Specification (SRS)
The UAT test plan outlines the strategy that will be used to verify that an application meets its business requirements. It documents the entry and exit criteria for UAT, the test scenario and test case approach, and the timelines of testing. Test scenarios are identified with respect to the high-level business processes, and test cases are created with clear test steps. Test cases should sufficiently cover most of the UAT scenarios; business use cases are an input for creating the test cases.
  • Analysis of Business Requirements
  • Creation of UAT test plan
  • Identify Test Scenarios
  • Create UAT Test Cases
  • Preparation of Test Data (production-like data)
  • Run the Test cases
  • Record the Results
  • Confirm business objectives
Business Analysts or UAT testers need to send a sign-off mail after UAT. After sign-off, the product is good to go for production. The deliverables for UAT are the Test Plan, the UAT Scenarios and Test Cases, the Test Results and the Defect Log. Before moving into production, the following needs to be confirmed:
  • No critical defects open
  • Business process works satisfactorily
  • UAT Sign off meeting with all stakeholders
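Those three exit criteria can be expressed as a small readiness check. This is a hedged sketch under assumed field names (`severity`, `status`), not an actual UMS gate:

```python
def ready_for_production(defects, business_process_ok, signed_off, stakeholders):
    """Apply the UAT exit criteria listed above: no critical defects open,
    the business process works satisfactorily, and every stakeholder
    has signed off."""
    no_critical_open = not any(
        d["severity"] == "critical" and d["status"] == "open" for d in defects
    )
    all_signed = set(stakeholders) <= set(signed_off)
    return no_critical_open and business_process_ok and all_signed

defects = [
    {"severity": "minor", "status": "open"},
    {"severity": "critical", "status": "closed"},
]
print(ready_for_production(defects, True,
                           signed_off=["BA", "PM", "Business"],
                           stakeholders=["BA", "PM", "Business"]))  # True
```

A minor open defect does not block release under these criteria; a single open critical defect, or one missing stakeholder sign-off, does.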

MTN


To whom it may concern: This letter serves as a recommendation of the Testing Services provided by Ubungcweti Management Services (Pty) Ltd to MTN Business Optimisation on the ME 2 U Project.


Customer Name:
MTN Business Optimisation
Team Size:
12-15 resources
Project Duration:
2010-11-03 - 2012-06-30
Application:
HP Quality Center / Rational Quality Manager
Type of Testing:
Functional Testing, Regression Testing, Business Process Testing, User Acceptance Testing, Post-Implementation Testing, Change Requests, Business Acceptance Testing, Exploratory Testing, Boundary Value Analysis, Systems Integration Testing, Business Requirement Acceptance Testing.
UMS Role in Testing:
Full support across the SDLC and STLC lifecycle phases (Initiation, Planning, Analysis and Design, Execution Readiness, Closeout, Handover); resources to cover additional testing demands for projects; hardware supplied to resources; backup of resources as and when required; testing support and training.
High Level Responsibilities:
Ensuring all Test Deliverables were managed according to the Project
requirements as well as ensuring quality.
Metrics per release:
Cumulative pre- and post-release changes, improvements or issues; Test Effectiveness improvements; staff retention; production defects.
Metrics daily:
30-40 test cases designed, 80-100 test cases executed, and 10-15 defects logged and retested per day.
Acceptance Testing:
Performing and managing Business Requirements Acceptance Testing in terms of planning, analysis, design and execution. Ensuring test data is set up for UAT, managing the execution of the UAT, managing the progress reporting on the UAT, and managing KPIs and 360 evaluations.
Lead Times:
Resources were on-boarded from interview to active work within a two-week period, and were highly skilled. Where a replacement was required due to a resource going on leave, a stand-in was allocated within 24 hours.
Quality of Resources:
The testing resources provided were of very good quality. Where there were issues relating to a single resource, this was escalated and managed, and the replacement of the tester was done very professionally and quickly. We never had any issues relating to any resources supplied by UMS.
UMS Account Model:
The UMS model of having regular meetings with the Test Manager as well as
with the resources that are on site worked very well. Any performance issues
are quickly identified and turned around. UMS was also very flexible in
accommodating my needs and went beyond the call of duty in having some
of these meetings on weekends because of heavy project schedule and
workload.
Commitment of Resources:
There was never any hesitation from any of the resources interviewed to join the organisation. For each position advertised, UMS provided at least 3-5 suitable candidates.
UMS Flexibility of notice periods, etc:
No issues were encountered with UMS resources. Should a stand-in be required, the lead time was met as per the SLA. Where resources had to give notice due to personal or family issues, they worked the complete notice period, including weekends.
Business Continuity:
When resources go on leave, UMS is able to provide backup resources and ensure effective handover, with the least impact to productivity, as the resources are fully skilled for handover. Backup resources and handover plans were effectively implemented.
Projects:
Some resources taken on board for a specific project in junior roles proved themselves very capable and were put in charge of running/leading the next project. The sizes of the projects varied across several systems. Where resources continually complete assignments within time and are always looking for work, managing becomes easier, and trust can be placed in assigning more responsibilities to these resources.
Resource and Capacity Planning:
Not for this engagement
Involvement:
Not for this engagement
Areas of Testing:
Functional Testing, Regression Testing, Business Process Testing, User Acceptance Testing, Post-Implementation Testing, Change Requests, Business Acceptance Testing, Exploratory Testing, Boundary Value Analysis, Systems Integration Testing, Business Requirement Acceptance Testing.
Data Clean-up:
Resources had to analyse existing test assets for relevance before export from Quality Centre to Rational Quality Manager. UMS provided support by sending the resources for a certification in RQM, and the resources performed well in migrating from QC to RQM.
Process Improvements:
Not for this engagement
Reporting:
Monthly KPI meetings to discuss resource performance were held; as stated above, some of these had to take place out of working hours and during weekends.
Testing Services:
The service expected from UMS is the provision of skilled local and offshore resources for software testing. There have been no breaches to date. All testing was performed according to the testing processes and governance, and resources kept up with the work schedule.
Testing Information:
HP Quality Center was the toolset used. For most of the analysis and design
phases a single workshop per specification was sufficient for the resources to
continue their work and deliver the required outputs.
Acceptance Testing:
UMS resources work collectively and actively with all stakeholders across MTN. Understanding and product knowledge are acquired quickly, and deliverables are met within the agreed timelines.
Testing Tools
HP Quality Center, Rational Quality Manager, LoadRunner, WinRunner
 
Core Functions
Measure compliance with business and system requirements
Validate end-to-end business processes to confirm use of the system
Confirm integrity of converted and additional data (e.g. values that appear in a look-up table)
Identify areas where user needs are not included in the system or are incorrectly specified or interpreted in the system
Co-ordinate, monitor and manage end-user acceptance testing
Evaluate and sign-off go-live readiness

FNB Shared Services


To whom it may concern: This letter serves as a recommendation of the Testing Services provided by Ubungcweti Management Services (Pty) Ltd to FNB (Pty) Ltd for the Shared Services Imaging Project in 2011.


Customer Name:
FNB Shared Services
Team Size:
7-25
Project Duration:
2010-07-01 - 2012-07-31
Application:
HP Quality Center
Type of Testing:
Functional Testing , Regression Testing, Business Process Testing, User Acceptance Testing, Post Implementation Testing, Change Requests, Test Automation, Performance Testing, DR Testing, HA Testing, BCP Testing, System and System Integration Testing, Workflow Testing, Equipment Testing (Scanners), Compliance Testing, SANS Compliance.
UMS Role in Testing:
Full testing lifecycle support (Initiation, Planning, Analysis and Design, Execution Readiness, Closeout, Handover), plus resources to cover additional testing demands for projects.
High Level Responsibilities:
Managed the team to ensure testing deliverables and quality were achieved on the projects.
Metrics per release:
Test Effectiveness Improvements, Staff retention, Production Defects
Metrics daily:
50-75 test cases executed daily.
Acceptance Testing:
Performing and managing UAT: managing the business users' acceptance testing efforts, ensuring test lab equipment is positioned for the UAT, co-ordinating the execution of the UAT, and managing the progress reporting on the UAT.
Lead Times:
Resources were on-boarded from interview to active work within a two-week period (from shortlist); a problem testing resource was replaced within less than a week after escalation to UMS.
Quality of Resources:
The testing resources provided were of very good quality. Where there were issues relating to a single resource, this was escalated and managed, and the replacement of the tester was done very professionally and quickly. Another testing resource, required to work seven days a week for a few months, had transport issues, and UMS came to the party by providing transport.
UMS Account Model:
The UMS model of having regular meetings with the Test Manager as well as with the resources that are on site worked very well. Any performance issues are quickly identified and turned around. UMS was also very flexible in accommodating my needs and went beyond the call of duty in having some of these meetings on weekends because of heavy project schedule and workload.
Commitment of Resources:
There was never any hesitation from any of the resources interviewed to join the organisation. For each position advertised UMS provided at least 3-5 suitable candidates.
UMS Flexibility of notice periods, etc:
UMS was very accommodating with the single instance of having a poorly performing resource. There was no enforcement of any notice period. Where resources had to give notice due to personal and family issues, they worked the complete notice period including weekends.
Business Continuity:
As stated above - the single instance of a poorly performing resource was addressed within a week. Where the resource had to return to India for family matters, a suitable replacement was put forward before the end of the notice period and on-boarded with sufficient time for handover.
Projects:
Some resources taken on board for a specific project in junior roles proved themselves very capable and were put in charge of running/leading the next project. The projects were huge, involving the re-write of the banking systems. Where resources continually complete assignments within time and are always looking for work, managing becomes easier, and trust can be placed in assigning more responsibilities to these resources.
Resource and Capacity Planning:
Not for this engagement
Involvement:
Not for this engagement
Areas of Testing:
Functional Testing , Regression Testing, Business Process Testing, User Acceptance Testing, Post Implementation Testing, Change Requests, Test Automation, Performance Testing, DR Testing, HA Testing, BCP Testing, System and System Integration Testing, Workflow Testing, Equipment Testing (Scanners), Compliance Testing, SANS Compliance.
Data Clean-up:
Resources had to analyse existing test assets for relevance and priority. Some of these had to be re-written before they could be executed. Poor legacy data was identified and communicated to the development and operations teams for attention. Once the clean-up was done, the results were verified. The testing approach for test data was to create the data needed for the scenarios and baseline it; once baselined, the data could be used for repeated cycles.
Process Improvements:
Not for this engagement.
Reporting:
Monthly KPI meetings to discuss resource performance were held; as stated above, some of these had to take place out of working hours and during weekends.
Testing Services:
All testing was performed according to the testing processes and governance. There were no deviations from the process. Even the one poorly performing resource was keeping up with the work schedule.
Testing Information:
HP Quality Center was the toolset used. For most of the analysis and design phases a single workshop per specification was sufficient for the resources to continue their work and deliver the required outputs.
Acceptance Testing:
UAT testers are extremely busy, and for the systems being tested, separate plans were drawn up to perform the acceptance phase. The UAT team had a separate lab dedicated to performing the UAT, and part of the resources' job was to ensure that all the machines in the lab had the correct configuration, deployment, access and required software. This also included separate scanners for the UAT department.
Core Functions:
Measure compliance with business and system requirements
Validate end-to-end business processes to confirm use of the system
Confirm integrity of converted and additional data (e.g. values that appear in a look-up table)
Identify areas where user needs are not included in the system or are incorrectly specified or interpreted in the system
Co-ordinate, monitor and manage end-user acceptance testing
Evaluate and sign-off go-live readiness
Testing Tools
HP Quality Center (Requirements, Test Plan, Defects and Reporting modules), QTP, Performance Center

Multichoice


To whom it may concern: This letter serves as a recommendation of the Testing Services provided by Ubungcweti Management Services (Pty) Ltd to Multichoice BTD.


Customer Name:
Multichoice BTD Decoder Development
Team Size:
45
Project Duration:
June 2011 to Aug 2013
Application:
Decoder software testing
Type of Testing:
Requirements, Functional, Regression, Performance, Stress, Reliability and
Acceptance Testing.
UMS Role in Testing:
Test planning and execution for Unit Testing, Pre-Integration, Integration
Testing.
High Level Responsibilities:
Test Team Lead, Test Execution, Defect logging and Reporting
responsibilities.
Metrics per release:
Volume of Defects detected, resolved, open, closed.
Metrics daily:
Defects Opened, Closed by Severity
Acceptance Testing:
Involved, but not responsible.
Lead Times:
On average, the UMS lead time was about one month for generic skills and about two months for specialists.
Quality of Resources:
The skill sets provided were mostly from the mobile telecommunications, broadcast and banking industries, with proven experience in software testing and development. All resources came skilled in the usage of defect tracking tools, testing techniques and general computing.
UMS Account Model:
The Key Resources assigned to an Account include the following Key
Personnel: Account Executive and Delivery Manager. This model has proved
very effective as all contractual, commercial and delivery requirements were
effectively managed.
Commitment of Resources:
They have usually had a pool of 3-5 resources available with generic testing
and admin skills. Specialist skills have proven to be more scarce.
UMS Flexibility of notice periods, etc:
We usually try to abide by our contractual agreements in this area. When the situation demanded flexibility, UMS has always accommodated our needs.
Business Continuity:
Backup resources and handover plans were effectively implemented.
Projects:
This cannot be listed, as it is the intellectual property of the company.
Resource and Capacity Planning:
Project, resource planning and allocation schedules are done with the Client and the resource, and are managed and overseen by the Test Lead / Manager.
Areas of Testing:
UMS staff are part of teams responsible for Functional Testing, Integration
Testing, User Acceptance Testing, Performance Testing and Automation
Testing of all Multichoice decoder products.
Data Clean-up:
All UMS testing staff are trained by Multichoice in the use of the SpiraTeam and Jira tools. Data clean-up of test cases and effective defect logging techniques are actively driven by the Test Leads.
Process Improvements:
A UMS Team Lead was responsible for co-ordinating our Defect Tracking
Workgroup which implemented improvements in the way we used our
tools.
Reporting:
We have not enforced detailed and formalised client reporting, as we have implemented a flexible and collaborative process with staff and suppliers. We rely on regular face-to-face meetings and KPI discussions with Account Managers for reporting.
Testing Services:
UMS staff are responsible for the accurate logging of testing data on Jira and SpiraTeam.
Testing Information:
UMS staff log the following data on Jira and SpiraTeam:

  • Test cases with execution steps
  • Defect reports
  • Defect severity
  • Steps to reproduce defect
  • Software version data
  • Number of occurrences of defect

Acceptance Testing:
UMS staff actively work with decoder test teams that are responsible for end-to-end acceptance testing.
Testing Tools
  • SpiraTeam
  • Jira