Tamires Santos de Melo DDS1, Giselle de Albuquerque Pacheco DDS1, Maria Isabel de Castro de Souza DDS, MSc, PhD2, Karla Figueiredo MSc, PhD3

1 Postgraduate Programme in Telemedicine and Telehealth, Department of Medical Sciences, State University of Rio de Janeiro, Brazil
2 Department of Dentistry, School of Community and Preventive Dentistry, State University of Rio de Janeiro, Brazil
3 Department of Informatics and Computer Science Institute of Mathematics and Statistics, State University of Rio de Janeiro, Brazil


Abstract
Background: Accessing dental treatment is still a challenge in several countries. In Brazil, there is a deficit in the provision of dental services by the public health system, with an emphasis on specialised treatments. The poorest are most affected and usually have to travel long distances for treatment in other regions, with the associated costs of travel, food, and often the loss of a day's work. Aiming to improve care for this population, the Dental Pre-Screening System (in Portuguese, Sistema de Pré-Triagem Odontológica - STO) was developed; it is responsive (mobile) and enables remote pre-screening of these patients by an oral health professional. Objective: This pilot study was developed to analyse the system's usability by developers and patients, and to explore the subjects' perceptions during use of the STO through semi-structured interviews. Methods: A convenience sample (35 patients and 10 developers) and a mixed methodology were used to evaluate usability (System Usability Scale) and to measure users' perception (semi-structured interview) of the components of the Dental Pre-Screening System. Results: The system's overall System Usability Scale (SUS) score was 76.9. Among the patients, 24 found the system easy to use, 83% agreed or strongly agreed that they would like to use the system frequently, and 83% agreed or strongly agreed that people would learn to use it quickly. Qualitative data suggest that all of the patients would recommend the system, and that their perception of it was positive and optimistic. However, it was also possible to identify points for improvement, such as changing the calendar format. Conclusions: The results of the SUS and the semi-structured interviews showed that the Dental Pre-Screening System was well accepted and that it can optimise the face-to-face screening process, reducing unnecessary trips and saving money.

Keywords: user-centred design; dentistry; teledentistry; usability test; system usability scale; Brazil

de Melo TS, et al. J Int Soc Telemed eHealth 2023;11:e2(1-7)
DOI: https://doi.org/10.29086/JISfTeH.11.2
Copyright: © The Authors 2023
Open access, published under a Creative Commons Attribution 4.0 International Licence (CC BY 4.0)


Introduction
Accessing dental treatment is still a challenge in several countries. In Brazil, there is a deficit in the provision of dental services by the public health system, with an emphasis on specialised treatments.1,2 Those most affected are the population with lower purchasing power, who do not have the option of seeking private assistance. For this reason, they seek treatment in other regions. Dental schools, despite their academic character, have become an option for the needy population, as they offer specialised services at an affordable price.3

Aiming to improve care for this sector, the Dental Pre-Screening System (in Portuguese, Sistema de Pré-Triagem Odontológica - STO) was developed; it is responsive (mobile) and enables remote pre-screening of these patients by an oral health professional. The development of the tool, unprecedented in Brazilian public assistance, was guided by requirements and functions identified by dentists and patients. Once registered, patients can download it free of charge from the app store and undergo guided pre-screening before scheduling a first face-to-face appointment.

The development of digital tools, particularly for use in the health area, is not restricted to design and coding: tests and adjustments are essential for the tool to fulfil its purpose. Jokela et al.4 showed that usability is one of the most important attributes of software quality, being defined, in accordance with ISO 9126, as "The ability of the software product to be understood, learned, used and attractive to the user when used under specified conditions".5

Design or usability problems can cause users to lose interest in using software. The literature describes different instruments for testing systems, which can even improve their experience and performance. For this study, a mixed methodology was used: the System Usability Scale (SUS),6 a quantitative instrument, and the semi-structured interview,7,8 a qualitative method, since combining data from both methods balances their strengths and compensates for the possible limitations of each.9 The aim of this study was to evaluate the usability of the STO and to measure patients' perception of it, so that it can be implemented as a tool for pre-screening patients at public university dental clinics.


Methods
Study design

The research started after the development and registration (National Institute of Industrial Property - 512022001449-1) of web-type software (Figure 1).

Figure 1
Figure 1. The STO home screen on the monitor and cell phone screen.

The project evaluated usability and measured participants' perception of the system in an observational cross-sectional study using a mixed methodology (quantitative and qualitative). The evaluation consisted of two independent sequential parts: in the first, quantitative data were obtained through an online questionnaire after the participants' interaction with the system; in the second, qualitative data were collected through a semi-structured interview following a script. Usability testing was entirely remote for all users, whether IT professionals or prospective patients; consequently, if users had questions while completing the questionnaire, no one was nearby to answer them. A visual summary of the process is shown in Figure 2.

Figure 2
Figure 2. Graphic representation of the steps performed during the usability test.

Participants and Recruitment

Information technology (IT) specialists: 10 specialists in computing, or with programming knowledge, participated in the usability test phase. This sample was selected for convenience in order to identify the system's technological failures. The SUS was applied to both samples to complement the analyses.
Patients: 35 individuals over 18 years of age, literate, enrolled at the Faculdade de Odontologia da Universidade do Estado do Rio de Janeiro (FOUERJ), owning a tablet or smartphone (Android or iOS) with Internet access and a browser (Google Chrome, Firefox, Opera, Safari, etc.), and without special needs. This group was selected for convenience to evaluate usability and measure the perception of potential end users. It is worth noting that the group was quite heterogeneous, with different age groups and levels of education.

Usability test

The SUS instrument was selected because it is valid and widely used for quantitative analysis, owing to its simplicity of application and ease of understanding. It contains ten items, each with five response alternatives structured as a Likert scale (1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly agree), providing quantitative data on the overall user satisfaction during use of the system. Following the SUS methodology, the responses are converted into a single number representing the system's general usability: a satisfaction index varying between 0 (negative usability) and 100 (positive usability). An average of 70 points is considered satisfactory usability. SUS scores have also been correlated with seven adjective ratings ranging from Worst imaginable to Best imaginable; the average scores fall between Good and Excellent.10 (Figure 3)

Figure 3
Figure 3. Relationship between the seven adjective classifications and mean SUS scores. (Adapted from Bangor et al. (2009).10)
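The conversion from the ten Likert responses to the 0-100 SUS score follows the standard arithmetic: odd-numbered (positively worded) items contribute (response − 1) points, even-numbered (negatively worded) items contribute (5 − response) points, and the sum of contributions is multiplied by 2.5. The following is a minimal sketch of this calculation, for illustration only; it is not the analysis script used in the study.

```python
def sus_score(responses):
    """Convert ten SUS Likert responses (each 1-5) into a 0-100 score.

    Odd-numbered items (1, 3, 5, 7, 9) are positively worded and score
    (response - 1); even-numbered items score (5 - response).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# A respondent answering "Agree" (4) to every positive item and
# "Disagree" (2) to every negative item scores 75.0:
sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2])  # 75.0
```

Averaging such per-respondent scores then yields group means of the kind reported in Tables 1 and 2.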

This phase was initially carried out with information technology specialists (n=10) to calibrate the STO and evaluate its usability by more technically demanding people. After obtaining satisfactory results (Table 1), we continued with the tests on the patients.

Semi-structured interviews

For the qualitative analysis, a semi-structured interview was carried out with the patients in order to gain a deeper understanding of their perceptions when using the STO, complementing the data obtained by the quantitative method. The interview was conducted remotely, individually and recorded, lasting approximately 30 minutes. A pre-established script containing 15 questions categorised into thematic blocks was used, and all questions were asked of every patient in the same way. This instrument was adapted from the study by Xiao et al. (2021).11

For the qualitative data analysis, the audio recordings were transcribed, tabulated in Excel, and later verified by the same researcher. After exhaustive reading, the transcribed data were categorised and analysed in the same way. During the analysis, sentences, phrases or paragraphs were used as units, with contextualisation taken into account to understand the factors. From this, the users' perceptions, likelihood of recommendation, strengthening of the service, and problems and resolutions were identified.

The project was approved by the Research Ethics Committee of UERJ (CAAE nº45809221.0.0000.5282). All participants were required to complete and sign a Free and Informed Consent Form (in Portuguese, Termo de Consentimento Livre e Esclarecido - TCLE). The study was undertaken in June and July 2022.



Results
A total of 35 participants were involved in the study, of whom 30 (86%) finished the tests (Figure 4). Subjects 17 and 21 were removed from the survey because they did not perform all the usability testing tasks. Three subjects (18, 20 and 31) submitted their responses twice, and their second responses were discarded. It is worth noting that the group was quite heterogeneous, with different age groups.

Figure 4
Figure 4. Graph of distribution by gender and age of patients.

System usability score
Calculations of the SUS for IT staff and patients are shown in Tables 1 and 2.

Table 1. Calculation and arithmetic mean of the SUS General Score of the IT staff respondents.
Table 1

Table 2. Calculation and arithmetic mean of the SUS General Score of the participants.
Table 2

The average SUS score was 78.3 for the IT participants and 76.9 for the patients. An average of 70 points is considered satisfactory usability.

Based on question 1 of the SUS, “I think I would like to use this system often”, 83% of the patients agreed or strongly agreed. Twenty-four of the patients found the system easy to use. Regarding learning speed (question 7), 83% of the participants agreed or strongly agreed with the statement. Twenty-two of the patients indicated that no technical help is needed to use the STO.

There was no significant difference between the results of the IT staff and the patients (Mann-Whitney test, p=0.6818). In this case, the system does not seem to present end users with any more difficulty than system analysts might have. For the qualitative analysis of the thematic blocks, an exhaustive reading of the data was carried out, with sentences, phrases or paragraphs used as units of analysis to identify the users' perceptions, likelihood of recommendation, service strengthening, and problems and resolutions (Table 3).
Table 3. Qualitative analyses over perception and likelihood to recommend.
Table 3
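The between-group comparison above relies on the Mann-Whitney U test, which compares the ranks of the pooled scores rather than assuming normality. The U statistic can be obtained directly by counting cross-group pairs, as in the pure-Python sketch below; the score lists are hypothetical examples, not the study data, and in practice a library routine such as scipy.stats.mannwhitneyu would also supply the p-value.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples.

    Counts, over all cross-group pairs, how often a value from `a`
    exceeds one from `b` (ties count 0.5); the smaller of the two
    group statistics is conventionally reported.
    """
    u_a = sum((x > y) + 0.5 * (x == y) for x in a for y in b)
    u_b = len(a) * len(b) - u_a
    return min(u_a, u_b)

# Hypothetical SUS scores for two small groups (not the study data):
it_scores = [77.5, 80.0, 72.5, 85.0]
patient_scores = [75.0, 70.0, 82.5, 77.5]
u = mann_whitney_u(it_scores, patient_scores)
```

A large U (close to half the number of pairs) corresponds to heavily overlapping score distributions and hence a non-significant difference, consistent with the p=0.6818 reported here.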

The perception of difficulty when using the STO by a lay patient was a point we sought to understand, so the participants were asked if there was any challenge when using the STO.
Feedback on this topic was:

"No, very easy... I didn't find it difficult at all." - P3
"I didn't have any difficulties...everything was fine" - P9
"I found everything very clear, very simple and even fast" - P19

In order to analyse the item “likelihood to recommend”, users were asked if they would recommend the application to a family member or friend. All users stated that they would refer the STO to someone else.

"Yes definitely." - P3
"Yes, yes, it is a very important service." - P11
"Yes! I even have people to recommend already." - P28

In block 3 on Strengthening the service, the question was asked, "What changes do you suggest to improve the virtual dental service?" The responses provide some feedback on possible barriers when using the STO.

"... if you have more illustrative images..." – P1
"I think there should be an option to edit the photo" - P8, "a little more zoom" - P19
“the specialty of rx (dental radiography)" - P6
"telling how college works" - P20.

In the block on "Use of phone applications", two questions were asked. The first was: "Do you use any smartphone application related to oral health? If so, which one?" Most patients said no. The patients were then asked whether they would use an app to take pictures of their teeth and send them to the dentist; they said:

"I would, I think it would be really cool. It would be easier." - P3
"I would use it, because it facilitates the diagnosis and would not waste time" - P8
"Yes, I think this interaction between the patient and the dentist would be more dynamic." - P20

The patients’ declarations for the thematic block problems and resolutions are described in Table 4.

Table 4. Tabulation of the identified improvements and possible adjustments.
Table 4


Discussion
The WHO predicted an exponential increase in applications and software aimed at the health sector, and this growth continues. According to Allied Market Research (Feb 2021), the global eHealth market is expected to reach $230.6 billion by 2027, growing at a CAGR (Compound Annual Growth Rate) of 14.5% from 2020 to 2027.12 Developing successful digital solutions that address health problems is a complex task. When implemented, many applications fall short of expectations or end up as unsuccessful pilot studies. One of the reasons is the failure to evaluate the application.13,14

Usability assessments are a key step in developing digital applications and are considered essential by several organisations, such as the WHO, which require rigorous eHealth assessments.15-18 The importance of usability is therefore clear: health technologies must be appropriately planned and focused on the needs of end users before being deployed as technological tools for the health area.19

Developing a user-centred system for dental pre-screening in the public service faces a significant challenge: ensuring that the technology is accessible to all users. The software was therefore developed to run on any smartphone, tablet or computer. For the institution, a systematic triage process may help in managing the service by creating a database that makes it possible to identify users and their needs. Those data enable a more assertive distribution of patients to the appropriate clinic.

Among the usability testing methods investigated in the literature, questionnaires are the most widely used, with the SUS the most prevalent in this category. However, questionnaires based on a Likert scale are not always able to identify or capture all the problems of an app, and in this sense qualitative research can be more useful.22,23 Transparency in a study is essential; for this reason, we point out three limitations of this study. First, the final sample size (n=30) may be seen as too small for meaningful assessment, given that larger samples are often recommended for quantitative research. Nonetheless, Stetson and Tullis (2004)20 and Nielsen (2000)21 demonstrated that the SUS questionnaire and usability tests yield significant results even with samples of 12 and 15 participants, respectively.

Another limitation is the users' location: the usability tests were conducted remotely, so users were in different geographic locations, which may have influenced the degree of attention they gave during the tests. Finally, participation in the study was voluntary; those who agreed to collaborate already had experience with mobile devices and were interested in participating, which may bias the sample towards respondents who are comfortable with mobile devices and inclined to collaborate in surveys.

Acceptance of the technology was observed in the sample studied, despite the groups being heterogeneous in their experience of handling different mobile devices. Taking all of this into consideration, the online use of a dental pre-screening system appears to be a viable solution for collecting initial data from patients attended by the clinics of the FOUERJ.


Conclusion
The results of the usability evaluation and of the patient perception analysis demonstrate that the Dental Pre-Screening System was well accepted. Through the qualitative data, it was possible to identify improvements, which were implemented. The tool could be useful in optimising the face-to-face screening process, reducing unnecessary trips and saving money. The next steps include installing the system for homologation and publicising the application for use with new patients.

Corresponding author:
Tamires Melo
Dentistry School
State University of Rio de Janeiro (UERJ)
Boulevard 28 de Setembro 157
Rio de Janeiro, Brazil
Email: tamiressantosdemelo@gmail.com

Conflict of interest. The authors declare no conflicts of interest.

Authors’ contributions: Author 1: conceptualisation (equal); formal analysis (equal); investigation (lead); methodology (equal); writing – original draft (lead); writing – review and editing (lead). Author 2: investigation (equal). Author 3: conceptualisation (lead); supervision (lead); project administration (lead); methodology (lead); writing – review and editing (equal). Author 4: project administration (equal); supervision (equal); formal analysis (lead).

Funding: This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Ethics approval and consent to participate
The project was presented to the Research Ethics Committee of UERJ (CAAE nº45809221.0.0000.5282).

Consent for publication: All participants completed and signed a Written Informed Consent Form (WICF).


References
  1. Piotrowska DE, Jankowska D, Huzarska D, Szpak AS, Pędziński B. Socioeconomic inequalities in use and non-use of dental services in Poland. Int J Public Health 2020;65:637-647. DOI: https://doi.org/10.1007/s00038-020-01379-2
  2. Ministério da Saúde (BR). Secretaria de Vigilância em Saúde. SB Brasil 2010: Pesquisa Nacional de Saúde Bucal: resultados principais. [Internet]. [Brasília, DF]: Editora MS. Ministério da Saúde (BR). 2012. Available at: https://bvsms.saude.gov.br/bvs/publicacoes/pesquisa_nacional_saude_bucal.pdf accessed 3 January 2023.
  3. Nalliah RP. Could dental school teaching clinics provide better care than regular private practices? J Investig Clin Dent 2019;10(2):e12329. DOI: https://doi.org/10.1111/jicd.12329
  4. Jokela T, Abrahamsson P. Modelling usability capability–introducing the dimensions. In: International Conference on Product Focused Software Process Improvement. Springer, Berlin, Heidelberg, 2000:73-87. DOI: https://doi.org/10.1007/978-3-540-45051-1_10
  5. International Organization for Standardization. ISO 9126: Software product evaluation – Quality characteristics and guidelines for their use. Geneva: ISO; 1991.
  6. Brooke J. SUS: a “quick and dirty” usability scale. In: Usability Evaluation in Industry. London: Taylor & Francis; 1996:189-194.
  7. Minayo MCS. O desafio do conhecimento: pesquisa qualitativa em saúde. 14ª ed. São Paulo: Hucitec; 2014.
  8. Greenhalgh T. Como ler artigos científicos: fundamentos da medicina baseada em evidências. 5ª ed. Porto Alegre: Artmed; 2015.
  9. Reichold M, Heß M, Kolominsky-Rabas P, Gräßel E, Prokosch H. Usability evaluation of an offline electronic data capture app in a prospective multicenter dementia registry (digidem bayern): mixed method study. JMIR Form Res 2021;5(11):e31649. DOI: https://doi.org/10.2196/31649
  10. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. J Usability Stud 2009;4(3):114-123.
  11. Xiao J, Meyerowitz C, Ragusa P, et al. Assessment of an innovative mobile dentistry eHygiene model amid the COVID-19 pandemic in the National Dental Practice-Based Research Network: protocol for design, implementation, and usability testing. JMIR Res Protoc 2021;10(10):e32345. DOI: https://doi.org/10.2196/32345
  12. Allied Market Research. E-health Market by Type and End User: Global Opportunity Analysis and Industry Forecast. 2021. Available at: https://www.alliedmarketresearch.com/ehealth-market-A10166 accessed 3 January 2023.
  13. van Limburg M, van Gemert-Pijnen JE, Nijland N, et al. Why business modeling is crucial in the development of eHealth technologies. J Med Internet Res 2011;13(4):e124. DOI: https://doi.org/10.2196/jmir.1674
  14. Jones RBJ, Ashurst EJ, Trappes-Lomax T. Searching for a sustainable process of service user and health professional online discussions to facilitate the implementation of e-health. Health Inf J 2016;22(4):948–961; DOI: https://doi.org/10.1177/1460458215599024
  15. Zapata BC, Fernández-Alemán JL, Idri A, Toval A. Empirical studies on usability of mHealth apps: a systematic literature review. J Med Syst. 2015;39:1-19. DOI: https://doi.org/10.1007/s10916-014-0182-2.
  16. World Health Assembly, 71. Seventy-first World Health Assembly: Geneva, 21-26 May 2018: summary records of committees, reports of committees. Geneva: World Health Organization; 2018. Available at: https://apps.who.int/iris/handle/10665/325993 accessed 11 January 2022.
  17. Broderick J, Devine T, Langhans E, et al. Designing health literate mobile apps. NAM Perspectives. Discussion Paper, National Academy of Medicine, Washington, DC. 2014. DOI: https://doi.org/10.31478/201401a
  18. Brown III W, Yen PY, Rojas M, Schnall R. Assessment of the Health IT Usability Evaluation Model (Health-ITUEM) for evaluating mobile health (mHealth) technology. J Biomed Inform 2013;46(6):1080–1087. DOI: https://doi.org/10.1016/j.jbi.2013.08.001
  19. Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: A scoping review. Int J Med Inform 2019;126:95-104; DOI: https://doi.org/10.1016/j.ijmedinf.2019.03.018
  20. Stetson JN, Tullis TS. A comparison of questionnaires for assessing website usability. UPA Presentation. 2004. Available at: https://www.researchgate.net/publication/228609327_A_Comparison_of_Questionnaires_for_Assessing_Website_Usability accessed 5 February 2023.
  21. Nielsen J. Why you only need to test with 5 users. Nielsen Norman Group. 2000. Available at: https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ accessed 5 February 2023.
