• BMC Med Inform Decis Mak · Jan 2019

    A usability design checklist for Mobile Electronic Data Capturing Forms: the validation process.

    • Alice Mugisha, Victoria Nankabirwa, Thorkild Tylleskär, and Ankica Babic.
    • Centre for International Health, University of Bergen, Bergen, Norway. mugishaalice@gmail.com.
    • BMC Med Inform Decis Mak. 2019 Jan 9; 19(1): 4.

    Background: New Specific Application Domain (SAD) heuristics, or design principles, are being developed to guide the design and evaluation of mobile applications in a bid to improve their usability. The existing heuristics are rather generic and often fail to reveal many of the usability issues tied to mobile-specific interfaces and characteristics. Mobile Electronic Data Capturing Forms (MEDCFs) are one such class of application: they are used to collect health data, particularly in hard-to-reach areas, yet they pose a number of usability challenges, especially when used in rural areas by semi-literate users. Existing SAD design principles are rarely used to evaluate mobile forms because they pay little attention to features specific to data capture, and some of these lists are so long that they are difficult to apply during the design and development of mobile forms. The main aim of this study was therefore to generate a usability evaluation checklist that can be used to design and evaluate MEDCFs, in a bid to improve their usability. We also sought to compare novice and expert developers' views regarding the usability criteria.

    Methods: We conducted a literature review in August 2016 using keyword searches of articles and grey literature; publications focusing on heuristics for mobile applications, user interface design for mobile devices, and web forms were eligible for review. The databases included the ACM Digital Library, IEEE Xplore and Google Scholar. After removing duplicates we had 242 papers, of which 10 articles met the criteria and were reviewed. The review yielded an initial usability evaluation checklist of 125 questions that could be adopted for designing MEDCFs, covering the five main categories in data capture: form content, form layout, input type, error handling and form submission. A validation study was then conducted with both novice and expert developers using a validation tool based on five criteria: utility, clarity, question naming, categorization and measurability, with utility and measurability carrying the greatest weight, in that order. For each question and criterion, we determined the proportion of expert and of novice developers who agreed (scored 4 or 5), disagreed (scored 1 or 2) or were neutral (scored 3). Finally, we retained the questions that reached an average of 85% agreement (scores of 4 or 5) across all five criteria from both novice and expert developers. 'Agreement' here stands for capturing the same views or sentiments about the perceived likeability of an evaluation question.

    Results: The validation study reduced the initial 125 usability evaluation questions to 30, with the form layout category contributing the most questions. The expert developers responded more affirmatively than the novice developers across the different criteria, but the general trend of agreement on the relevance of the usability questions was similar for both groups across all criteria. The evaluation questions under validation were found to be useful, clear, and properly named and categorized; however, both sets of developers found the measurability of the questions unsatisfactory. The developers attached great importance to the use of appropriate language and to the visibility of the help function; in addition, the expert developers felt that indicating mandatory and optional fields, together with the use of device information such as the Global Positioning System (GPS), was equally important. For both sets of developers, utility scored highest while measurability scored lowest.

    Conclusion: The generated checklist indicates the design features that the software developers found necessary to improve the usability of mobile electronic data collection tools. In future work we propose to test the checklist's effectiveness in terms of suitability and performance, and to test it with the end users (data collectors) in order to capture their design requirements. Continuous testing with end users will help refine the checklist so that it includes only what matters most for improving the data collectors' experience.
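
The snippet below is a minimal sketch, not taken from the paper, of how the selection rule described in the Methods could be computed: a question is retained when, for both novice and expert developers, the share of ratings of 4 or 5 averages at least 85% across the five validation criteria. All function names and the example ratings are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch of the 85% agreement rule described in the Methods.
# The paper does not publish code; names and data here are illustrative.

CRITERIA = ["utility", "clarity", "naming", "categorization", "measurability"]
THRESHOLD = 0.85  # average agreement required across the five criteria

def agreement(scores):
    """Proportion of raters who agreed, i.e. scored a criterion 4 or 5."""
    return sum(1 for s in scores if s >= 4) / len(scores)

def keep_question(ratings_by_group):
    """ratings_by_group maps 'novice'/'expert' to {criterion: [1-5 scores]}.
    Returns True if both groups average >= 85% agreement across all criteria."""
    for group_ratings in ratings_by_group.values():
        mean_agreement = sum(agreement(group_ratings[c]) for c in CRITERIA) / len(CRITERIA)
        if mean_agreement < THRESHOLD:
            return False
    return True

# Hypothetical ratings for one checklist question (five raters per group):
question_ratings = {
    "novice": {c: [5, 4, 4, 5, 4] for c in CRITERIA},
    "expert": {c: [5, 5, 4, 4, 5] for c in CRITERIA},
}
print(keep_question(question_ratings))  # True: both groups exceed 85% agreement
```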
