Evaluation of Acme Audio's Online Workflow and Software Feature Packages
Background
Acme Audio (a pseudonym) provides CAD/CAM (computer-aided design / computer-aided manufacturing) solutions to companies that create custom in-ear products. These companies range from hearing aid manufacturers to in-ear monitor and hearing protection providers.
To support Acme Audio customers, Acme created videos and PDF descriptions for new features and workflows that customers might find relevant. These resources are referred to collectively as "packages." Each package includes a video outlining the workflow or software feature and a PDF document with step-by-step instructions on how to implement that workflow or feature in the customer's own production environment.
There are three types of stakeholders for the Online Workflow and Software Feature Packages.
Upstream Stakeholders: Several stakeholders played a role in selecting, designing, developing, and making the packages available to customers via the Partner Portal. The key upstream stakeholders are the following:
- Audio Director
- Technical Account Manager
- Training and Application Specialist
- Two Product Managers
- Partner Portal Content Team
Direct Impactees: The people expected to use the packages in their production environments:
- Around 150 Production Managers
- Around 80 3D Specialists
- Possibly upwards of 80 Software Trainers
- Anywhere between 400 and 500 actual users of the software in a production environment
Indirect Impactees: Those affected downstream by how the packages are used:
- People purchasing custom in-ear products
- Dispensers of custom in-ear products
- Governments
- Companies with users of custom hearing protection
- Acme Dental Academy Manager
Acme Audio engaged in an evaluation to provide conclusions on the overall quality and worth of the packages to end customers. The client for the evaluation is the Audio Director of Acme Audio.
Evaluation Methods
To determine the evaluation methods, the evaluation team (Crane et al., 2019) worked with the following stakeholders:
- Product Management
- Technical Support Manager
- Director of Acme Audio
Together, these stakeholders identified the following purposes for the evaluation:
- Determine the impact on customers, both positive and negative, intended and unintended, of the current packages available to customers on the Partner Portal.
- Make better decisions on what material to develop.
- Identify changes required to the material development process.
Through initial communication with the client (John Smith) and based on the Program Logic Model (PLM) for the Online Workflow and Software Feature Packages, the evaluation team began to develop a list of specific program dimensions to investigate.
Based on input from key stakeholders, the evaluation team lead proposed several dimensions and met with the Director of Acme Audio and members of Product Management to finalize the key dimensions to investigate. During the meeting, the stakeholders and the evaluation team consulted the PLM. They initially agreed to investigate five dimensions related to the packages, two of which were not specifically outlined in the PLM. The list was later whittled down to four dimensions, with two dimensions merged to ease data collection; the key stakeholders agreed to this update.
Once the evaluation team established the dimensions, they discussed with the stakeholders how they intended to make use of the evaluation findings. Based on their input, the evaluation team helped them identify the relative degrees of importance weighting (IW) among the dimensions:
- Package Design and Development (Important)
- Customers Accessing Packages (Very Important)
- Customer Knowledge of Package Usefulness and Customer Utilization of Packages (Extremely Important)
- Other Outcomes (Extremely Important)
Data Collection Methods
As the overall approach to this evaluation, the evaluation team followed Chyung’s (2018) 10-step evaluation procedure, which helps an evaluation team design an evaluation around the stakeholders’ needs and their intended use of the evaluation findings.
The evaluation team used the ARCS model (Keller, 2010) to develop a checklist for determining how well the packages were designed: specifically, which elements of attention, relevance, confidence, and satisfaction (ARCS) were present in the packages and which might be worth adding. Because of time limitations and the relatively low priority stakeholders placed on this dimension (ranked least important), the team evaluated only the ten most downloaded packages.
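As an illustrative aside, the sketch below shows one way such checklist ratings could be aggregated. It is a minimal Python example, not the team’s actual tooling: the package names, the individual ARCS ratings, and the 1-10 rating scale are assumptions, and only the below-8 update threshold used later in the recommendations comes from this report.

```python
# Minimal sketch of aggregating ARCS checklist ratings per package.
# Package names, individual ratings, and the 1-10 scale are illustrative
# assumptions; only the below-8 update threshold comes from the report.
from statistics import mean

arcs_ratings = {
    "Shell Modeling Workflow": {
        "attention": 9, "relevance": 9, "confidence": 8, "satisfaction": 8},
    "Vent Design Feature": {
        "attention": 6, "relevance": 8, "confidence": 7, "satisfaction": 6},
}

THRESHOLD = 8  # packages averaging below this score are flagged for updating

def summarize(ratings, threshold=THRESHOLD):
    """Return (package, average score, needs_update), lowest averages first."""
    rows = [(pkg, round(mean(scores.values()), 2),
             mean(scores.values()) < threshold)
            for pkg, scores in ratings.items()]
    return sorted(rows, key=lambda row: row[1])

for package, avg, needs_update in summarize(arcs_ratings):
    status = "update recommended" if needs_update else "OK"
    print(f"{package}: average ARCS score {avg} ({status})")
```

Sorting by lowest average first puts the packages most in need of updating at the top of the list.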
As a primary focus of the evaluation, the evaluation team used Brinkerhoff’s (2005) Success Case Method, which provided structure and direction for investigating Dimension 3 and allowed the team to collect information to identify success cases and non-success cases.
While incorporating these frameworks, the evaluation team used multiple sources of data that were collected from stakeholders, including Product Management, the Technical Support Manager, the Director of Acme Audio, resellers, and customers. Additionally, the evaluation team used multiple types of data collection methods:
- Survey: a web-based survey sent to customers.
- Email: correspondence with unidentified Acme staff and with resellers of Acme Audio products who had downloaded or previewed packages.
- Interview: scripted phone interviews; one-on-one interviews were also planned to capture success cases and non-success cases.
- Extant data review: the number of times each package was previewed and downloaded (a brief sketch of this tally follows the list).
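As a brief illustration of the extant data review, the following sketch tallies preview and download counts per package and lists the ten most downloaded, the kind of subset used for the ARCS review above. The file name portal_activity.csv and its column layout (package, action) are hypothetical and do not reflect the actual Partner Portal export.

```python
# Minimal sketch of the extant data review: tally previews and downloads
# per package from a hypothetical Partner Portal activity export.
# The file name "portal_activity.csv" and its columns ("package", "action")
# are assumptions for illustration; they are not the actual export format.
import csv
from collections import Counter

previews = Counter()
downloads = Counter()

with open("portal_activity.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["action"] == "preview":
            previews[row["package"]] += 1
        elif row["action"] == "download":
            downloads[row["package"]] += 1

print("Top ten downloaded packages:")
for package, count in downloads.most_common(10):
    print(f"{package}: {count} downloads, {previews[package]} previews")
```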
Results
| Dimension | Importance Weighting | Results |
| --- | --- | --- |
| 1. Package Design and Development | Important | Very Good |
| 2. Customers Accessing Packages | Very Important | Good |
| 3. Customer Knowledge of Package Usefulness and Customer Utilization of Packages | Extremely Important | Good |
| 4. Other Outcomes | Extremely Important | Very Good |

Table 1. Dimensional Results
Conclusions
| If both Extremely Important dimensions are rated | and if both Very Important and Important dimensions are rated | then the overall quality is |
| --- | --- | --- |
| Excellent | Both Excellent or Very Good | Very High Quality |
| Very Good or Excellent | Very Good (one or both) or Excellent | High Quality |
| Good or Very Good | Good (one or both) or Very Good or Excellent | Good, but improvement needed |
| Good or Barely Adequate | Barely Adequate (up to only one), Good, Very Good, or Excellent | Much Improvement Needed |
| Barely Adequate or Poor | Poor (one or more), and Barely Adequate, Good, or Excellent | Significant Improvement Needed |

Table 2. Final Rubric
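To make the rubric’s decision logic explicit, the sketch below encodes Table 2 as a first-match lookup and applies it to the dimensional results from Table 1. The ratings and their grouping by importance weighting come from the tables above; the function name, the subset-based matching, and the simplification of the rubric’s side conditions (for example, "one or both" and "up to only one") are assumptions made for illustration.

```python
# Minimal sketch encoding the Table 2 rubric as a first-match lookup and
# applying it to the Table 1 results. Each row lists the ratings allowed
# for the Extremely Important dimensions, the ratings allowed for the
# Very Important / Important dimensions, and the resulting overall quality.
# Side conditions such as "one or both" or "up to only one" are simplified
# to "all ratings fall within the allowed set".

RUBRIC = [
    ({"Excellent"}, {"Excellent", "Very Good"}, "Very High Quality"),
    ({"Very Good", "Excellent"}, {"Very Good", "Excellent"}, "High Quality"),
    ({"Good", "Very Good"}, {"Good", "Very Good", "Excellent"},
     "Good, but improvement needed"),
    ({"Good", "Barely Adequate"},
     {"Barely Adequate", "Good", "Very Good", "Excellent"},
     "Much Improvement Needed"),
    ({"Barely Adequate", "Poor"},
     {"Poor", "Barely Adequate", "Good", "Excellent"},
     "Significant Improvement Needed"),
]

def overall_quality(extremely_important, other):
    """Return the verdict of the first rubric row that covers all ratings."""
    for ei_allowed, other_allowed, verdict in RUBRIC:
        if set(extremely_important) <= ei_allowed and set(other) <= other_allowed:
            return verdict
    return "No rubric row matched"

# Table 1 results: Dimensions 3 and 4 are Extremely Important; 1 and 2 are not.
print(overall_quality(["Good", "Very Good"], ["Very Good", "Good"]))
# -> Good, but improvement needed
```

Read this way, the Table 1 results (Good and Very Good on the two Extremely Important dimensions, Very Good and Good on the remaining two) fall in the third row of the rubric, placing the overall quality of the packages at Good, but improvement needed.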
Based on these results, the evaluation team recommends that Acme Audio:
- Use the ARCS checklist to review the remaining Online Packages and identify those that require updating (overall scores below 8).
- Update the materials already reviewed during the evaluation that scored an average below 8.
- Engage with customers to find out what would make the packages more useful for them.
If Acme Audio wishes to continue using the Partner Portal as the touchpoint for customers accessing the Online Workflow and Software Feature Packages, then it is strongly recommended that they make coordinated efforts to communicate this to customers, along with the value and benefits of having this material available there.
References
Brinkerhoff, R. O. (2005). The success case method: A strategic evaluation approach to increasing the value and effect of training. Advances in Developing Human Resources, 7(1), 86-101. https://doi.org/10.1177/1523422304272172
Chyung, S. Y. (2018). 10-step evaluation for training and performance improvement. Sage.
Chyung, S. Y., Wisniewski, A., Inderbitzen, B., & Campbell, D. (2013). An improvement- and accountability-oriented program evaluation: An evaluation of the Adventure Scouts program. Performance Improvement Quarterly, 26(3), 87-115. https://doi.org/10.1002/piq.21155
Crane, E., Shires, J., & Stevens, J. (2019). An evaluation of the online workflow and software feature packages for customers [MS Word document]. OPWL 530 course site. https://Blackboard.boisestate.edu
Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model approach. Springer.