
Stage 4

Deliverables 

A link to our full Stage 4 submission on GitHub (click here).

 

A) Heuristic Evaluation Report 

       i) Cover Page

      ii) Project Description 

     iii) Updated User Tasks

     iv) Heuristic Process and Findings 

      v) Reflection 

     vi) Appendix

B) Presentation 

       i) Hi-Fi Prototype Demo

      ii) 8-12 minute Presentation 

 

Heuristic Evaluation Report

A link to our full Heuristic Evaluation Report (click here).

Project Description 

Our project is a Drone Fleet Management System (DFMS): a desktop program that lets a user manage a fleet of delivery drones, overseeing them as they pick up and deliver packages.

Updated User Tasks

Tasks are labeled by prototyping depth: vertical tasks are implemented in full depth, while horizontal tasks cover the interface broadly without full functionality. (A sketch of how these tasks map onto a fleet data model follows the list.)

Vertical - List of Drones 

Vertical - Drone History 

Vertical - Status of Drones  

Vertical - Map

Horizontal - Fleet Statistics

Horizontal - Issue Advisory 

Horizontal - Register Drone

Horizontal - Login

Horizontal - Manual Control

Horizontal - Drone Recovery Request 

Horizontal - Recall
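
To make the task list more concrete, here is a minimal sketch of the kind of fleet data model these tasks operate on. The class, field, and method names are our own illustrative assumptions, not the actual DFMS implementation.

```python
from dataclasses import dataclass, field
from enum import Enum

class DroneStatus(Enum):
    IDLE = "idle"
    PICKING_UP = "pickup"
    DELIVERING = "delivering"
    RECALLED = "recalled"

@dataclass
class Drone:
    drone_id: str
    status: DroneStatus = DroneStatus.IDLE
    location: tuple = (0.0, 0.0)                  # (latitude, longitude), shown on the Map
    history: list = field(default_factory=list)   # past events, for Drone History

class Fleet:
    """Holds every registered drone (List of Drones, Status of Drones)."""
    def __init__(self):
        self.drones = {}

    def register(self, drone: Drone) -> None:
        # "Register Drone" task
        self.drones[drone.drone_id] = drone

    def recall(self, drone_id: str) -> None:
        # "Recall" task: order a drone back to base
        self.drones[drone_id].status = DroneStatus.RECALLED
```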

Heuristic Process and Findings 

Our heuristic evaluation process consisted of three steps:

  1. Andy, Kathryn, and Macks went through the prototype separately, looking for violations of Jakob Nielsen’s 10 usability heuristics, and each evaluator filled out the provided evaluation template.

  2. Nicholas and Stéphane independently went through all three evaluation reports and rated the severity of each problem. They then met, discussed their ratings, and produced a single list ranking every problem by severity.

  3. Once the list was created, we briefly discussed which items, if any, should be changed. One group member (Andy) then reviewed the severity-rating document and went through the prototype fixing the issues we had decided to address, starting with the most severe ones. (A sketch of this triage step follows the list.)
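
As a rough illustration of step 2's triage, the sketch below merges two raters' severity scores and ranks problems most severe first. It assumes Nielsen's usual 0-4 severity scale; the names and sample entries are our own, not copied from our actual evaluation documents.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str     # which of Nielsen's 10 heuristics is violated
    description: str   # what the evaluator observed
    ratings: list      # one 0-4 severity score per rater

def triage(findings):
    """Rank findings by average severity, most severe first."""
    return sorted(findings,
                  key=lambda f: sum(f.ratings) / len(f.ratings),
                  reverse=True)

# Hypothetical entries, loosely based on the findings described below:
report = [
    Finding("Consistency and standards", "'pick up' vs 'pickup' wording", [2, 1]),
    Finding("Error prevention", "no confirmation before recalling a drone", [4, 3]),
]
for f in triage(report):
    print(f"{sum(f.ratings) / len(f.ratings):.1f}  {f.heuristic}: {f.description}")
```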

 

Findings:

The most severe issues the three evaluators found (based on Jakob Nielsen’s 10 usability heuristics) related to Error prevention and Consistency and standards.

For Error prevention, the prototype had no confirmations for actions that seriously affect the drones. Missing confirmations included a popup asking the operator to confirm that they truly wish to recall a drone, take manual control of a drone, or send maintenance out to recover a drone.
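
One fix for this class of violation is a yes/no confirmation dialog before any serious command is sent. Below is a minimal sketch using Python's Tkinter, purely illustrative: the function names and drone ID are our assumptions, not the prototype's actual code.

```python
import tkinter as tk
from tkinter import messagebox

def recall_drone(drone_id):
    # Placeholder for the real recall command.
    print(f"Recalling drone {drone_id}...")

def on_recall_clicked(drone_id):
    # Error prevention: ask the operator to confirm before acting.
    if messagebox.askyesno("Confirm recall",
                           f"Are you sure you want to recall drone {drone_id}?"):
        recall_drone(drone_id)

root = tk.Tk()
root.withdraw()                 # the dialog does not need a main window
on_recall_clicked("D-042")
```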

For Consistency and standards, since each of us worked on a separate part of the system prototype, much of the wording did not match, for example “pick up” versus “pickup” and “hr” versus “hrs”.
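
A common way to prevent this kind of drift is to define all user-facing wording once, in a shared module, so every screen pulls the same strings. A minimal sketch (the module and constant names are our own):

```python
# labels.py - single source of truth for user-facing wording,
# so "pickup" and "hr" are spelled the same way on every screen.
PICKUP = "pickup"
HOURS = "hr"

def flight_time(hours):
    """Format flight time consistently, e.g. '3 hr'."""
    return f"{hours} {HOURS}"
```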

Reflection 

What worked well for us was having three independent evaluators. Because we did not meet and review the prototype together, we surfaced a much wider variety of problems and perspectives on it.

What went poorly was, in a way, the flip side of what went well: because we did not discuss the prototype before the heuristic evaluation, we found a large number of problems spread throughout it.

What we would do differently: after the reviewers finished rating severity together, we should have met and discussed every single problem to decide which changes to implement, rather than discussing only a couple.

Appendix

Our appendix consists of our evaluation and review documents, which can be found via the Heuristic Evaluation Report link above.

 

Presentation 

Watch our full presentation and high-fidelity prototype video below, or follow the link (click here).

A link to our interactive high-fidelity prototype (click here). 

 
