Abstract
Empirical evaluations can be conducted with students or professionals as subjects. Students are much more accessible and less expensive than professionals, allowing a greater number of empirical studies to be conducted. Nevertheless, professionals are generally preferred over students owing to concerns about the external validity of student-based experiments: professionals are believed to perform differently, and most likely better, than students. But with respect to evaluating the cognitive effectiveness of software engineering notations, are professionals really better? The literature has suggested that the presentation of information is just as critical as the content it conveys, hence necessitating this type of empirical study. If professionals are not much better than students, then such an important finding can be a springboard to many much-needed empirical evaluations in this field. In this paper, we report on an experiment that compares the performance of professionals and students with respect to syntactical model comprehension, which is a core factor in evaluating the cognitive effectiveness of notations. The experiment involved two groups of professionals and two groups of students, totaling 74 professionals and 75 students. The results of the experiment indicate that students can be used as an adequate replacement for professionals in this type of empirical study.
Notes
International Software Engineering Research Network http://isern.iese.de/.
References
Amoroso, E.G.: Fundamentals of Computer Security Technology. Prentice-Hall Inc, Upper Saddle River, NJ (1994)
Anda, B., Dreiem, H., Sjøberg, D., Jørgensen, M.: Estimating software development effort based on use cases—experiences from industry. Submitted to UML’2001 (Fourth International Conference on the Unified Modeling Language)
Berander, P.: Using students as subjects in requirements prioritization. ISESE pp. 167–176 (2004)
Booch, G., Rumbaugh, J., Jacobson, I.: The Unified Modeling Language User Guide, 2nd edn. Addison-Wesley, Boston (2005)
OMG: Business Process Model and Notation (BPMN): Specification 2.0 V0.9.15. Object Management Group Inc., Needham (2009)
Buhr, R.J.A., Casselman, R.S.: Use Case Maps for Object-Oriented Systems. Prentice Hall, Englewood Cliffs (1996)
Carver, J., Shull, F., Basili, V.: Observational studies to accelerate process experience in classroom studies: an evaluation. In: Proceedings of the 2003 International Symposium on Empirical Software Engineering (ISESE ‘03), pp. 72–79
Ciolkowski, M.: What do we know about perspective-based reading? An approach for quantitative aggregation in software engineering. In: 2009 3rd International Symposium on Empirical Software Engineering and Measurement, ESEM 2009, pp. 133–144 (2009)
Cliff, N.: Dominance statistics: ordinal analyses to answer ordinal questions. Psychol. Bull. 114(3), 494–509 (1993)
Cliff, N.: Answering ordinal questions with ordinal data using ordinal statistics. Multivar. Behav. Res. 31(3), 331–350 (1996)
Cliff, N.: Ordinal Methods for Behavioral Data Analysis. Lawrence Erlbaum Associates, New Jersey (1996)
Dubois, E., Wu, S.: A framework for dealing with and specifying security requirements in information systems. Inf. Syst. Secur. pp. 88–99 (1996)
El-Attar, M.: Companion Website to the Analysis of the New Use Case and Statecharts Notations. https://faculty.alfaisal.edu/melattar/physicsofnotations.html. Last accessed December 2018
Falessi, D., Juristo, N., Wohlin, C., Turhan, B, Münch, J., Jedlitschka, A., Oivo, M.: Empirical software engineering experts on the use of students and professionals in experiments. Empir. Softw. Eng. 23(1), 452–489 (2018)
Genon, N., Heymans, P., Amyot, D.: Analysing the Cognitive Effectiveness of the BPMN 2.0 Visual Notation. Software Language Engineering, pp. 377–396. Springer, Berlin Heidelberg (2011)
Genon, N., Amyot, D., Heymans, P.: Analysing the Cognitive Effectiveness of the UCM Visual Notation. System Analysis and Modeling: About Models, pp. 221–240. Springer, Berlin (2011)
Gopalakrishnan, S., Krogstie, J., Sindre, G. (2010) Adapting UML activity diagrams for mobile work process modelling: experimental comparison of two notation alternatives. In: PoEM. pp. 145–161
Hassan, R., Bohner, S., El-Kassas, M., Hinchey, M.: Integrating formal analysis and design to preserve security properties. In: 42nd Hawaii International Conference on System Sciences, HICSS ‘09, pp. 1–10 (2009)
Hess, M.R., Kromrey, J.D., Ferron, J.M., Hogarty, K.Y., Hines, C.V.: Robust inference in meta-analysis: an empirical comparison of point and interval estimates using the standardized mean difference and cliff’s delta. In: Annual meeting of the American Educational Research Association (2005). www.coedu.usf.edu/main/departments/me/documents/RobustMeta-AnalysisAERA2005.pdf. Last accessed July 2018
Höfer, A., Tichy, W.F.: Status of empirical research in software engineering. In: Empirical Software Engineering Issues: Critical Assessment and Future Directions, pp. 10–19. Springer (2007)
Höst, M., Regnell, B., Wohlin, C.: Using students as subjects–a comparative study of students and professionals in lead-time impact assessment. Empir. Softw. Eng. 5, 210–214 (2000)
Humphrey, W.S.: A Discipline for Software Engineering. Addison Wesley, Boston (1995)
Jacobson, I., Ericsson, M., Jacobson, A.: The Object Advantage. ACM Press, New York (1995)
Jürjens, J.: Secure Systems Development with UML. Springer, Berlin (2004)
Jürjens J.: UMLsec: extending UML for secure systems development. In: 5th International Conference UML 2002—The Unified Modelling Language. Model Engineering, Languages, Concepts, and Tools. Dresden, Germany, September/October 2002, Proceedings, volume 2460 of LNCS, pp. 412–425. Springer (2002)
Kaner, C., Bach, J., Pettichord, B.: Lessons Learned in Software Testing. Wiley, New York (2003)
Kárpáti, P., Sindre, G., Opdahl, A.L.: Visualizing Cyber Attacks with Misuse Case Maps. In: REFSQ pp. 262–275 (2010)
Katta, V., Karpati, P., Opdahl, A.L., Sindre, G., Raspotnig, C.: Comparing two misuse modeling techniques for intrusion visualization. In: Lecture Notes in Business Information Processing, vol. 68, pp. 1–15 (2010)
Kromrey, J., Hogarty, K., Ferron, J., Hines, C., Hess, M.: Robustness in meta-analysis: an empirical comparison of point and interval estimates of standardized mean differences and Cliff’s delta. In: American Statistical Association 2005 Joint Statistical Meetings, pp. 7 (2005). http://luna.cas.usf.edu/~mbrannic/files/meta/Robust%20Estimates.pdf. Last accessed July 2018
Kromrey, J., Hogarty, K.: Analysis options for testing group differences on ordered categorical variables: an empirical investigation of Type 1 error control and statistical power. Mult. Linear Regres. Viewp. 25, 70–82 (1998)
Lehmann, E.L.: Nonparametrics: Statistical Methods Based on Ranks, Revised edn. Pearson (1998)
Lin, L., Nuseibeh, B., Ince, D., Jackson, M., Moffett, J.: Introducing abuse frames for analysing security requirements. In: Proceedings of the 11th IEEE International Requirements Engineering Conference, pp. 371–37 (2003)
Lodderstedt T.: SecureUML: a UML-based modelling language for model-driven security. In: UML 2002—The Unified Modelling Language. Model Engineering, Languages, Concepts, and Tools. 5th International Conference, Dresden, Germany, September/October 2002, Proceedings, volume 2460 of LNCS, pp. 426–441. Springer, Berlin (2002)
Mader, P., Cleland-Huang, J.: A Visual Traceability Modeling Language. Model Driven Engineering Languages and Systems. Springer, Berlin (2010)
McMeekin, D.A., Von Konsky, B.R., Robey, M., Cooper, D.J. A.: The significance of participant experience when evaluating software inspection techniques. In: Proceedings of the Australian Software Engineering Conference, ASWEC, pp. 200–209 (2009)
Moody, D.L.: The ‘physics’ of notations: toward a scientific basis for constructing visual notations in software engineering. IEEE Trans. Softw. Eng. 35(6), 756–779 (2009)
Moody, D., van Hillegersberg, J.: Evaluating the Visual Syntax of UML: An Analysis of the Cognitive Effectiveness of the UML Family of Diagrams. Software Language Engineering. Springer, Berlin (2009)
Moody, D.L., Heymans, P., Matulevicius, R.: Visual syntax does matter: improving the cognitive effectiveness of the i* visual notation. Requir. Eng. 15(2), 141–175 (2010)
Mouratidis, H., Giorgini, P.: Secure tropos: a security-oriented extension of the tropos methodology. Int. J. Softw. Eng. Knowledge Eng. 17(2), 285–309 (2007)
OMG.: Unified Modeling Language, Version 2.4.1. Object Management Group, Inc, Needham (2012). http://www.uml.org
Pickard, L.M., Kitchenham, B.A., Jones, P.W.: Combining empirical results in software engineering. Inf. Softw. Technol. 40(11), 811–821 (1998)
Porter, A.A., Votta, L.G., Basili, V.R.: Comparing detection methods for software requirements inspections: a replicated experiment. IEEE Trans. Softw. Eng. 21, 563–575 (1995)
Porter, A., Votta, L.: Comparing detection methods for software requirements inspections: a replication using professional subjects. Empir. Softw. Eng. 3, 355–379 (1998)
Purchase, H.C., Carrington, D., Allder, J.-A.: Empirical evaluation of aesthetics-based graph layout. Empir. Softw. Eng. 7(3), 233–255 (2002)
Purchase, H.C., Welland, R., McGill, M., Colpoys, L.: Comprehension of diagram syntax: an empirical study of entity relationship notations. Int. J. Hum. Comput. Stud. 61, 187–203 (2004)
Reijers, H.A., Mendling, J.: A study into the factors that influence the understandability of business process models. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 41(3), 449–462 (2011)
Remus, W.: Using students as subjects in experiments on decision support systems. In: Proceeding of the Twenty-Second Annu. Hawaii Int. Conf. Syst. Sci. Vol. III Decision Support and Knowledge Based Systems Track, vol. 3 (1989)
Ricca, F., Di Penta, M., Torchiano, M., Tonella, P., Ceccato, M.: How developers' experience and ability influence web application comprehension tasks supported by UML stereotypes: a series of four experiments. IEEE Trans. Softw. Eng. 36, 96–118 (2010)
Røstad, L.: An extended misuse case notation, including vulnerabilities and the insider threat. In: Proceedings of the 12th International Working Conference on Requirements Engineering: Foundation for Software Quality, pp. 33–43 (2006)
Runeson, P.: Using students as experiment subjects–an analysis on graduate and freshmen student data. In: Proceedings 7th International Conference on Empirical Assessment & Evaluation in Software Engineering, pp. 95–102 (2003)
Salviulo, F., Scanniello, G.: Dealing with identifiers and comments in source code comprehension and maintenance: results from an ethnographically-informed study with students and professionals. In: Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering, pp. 48:1–48:10 (2014)
Salman, I., Misirli, A.T., Juristo, N.: Are students representatives of professionals in software engineering experiments? In: International Conference on Software Engineering (ICSE), pp. 666–676 (2015)
Schneier, B.: Attack trees. Dr Dobb’s J. 24(12), 21–29 (1999)
Shapiro, S.S., Wilk, M.B.: An analysis of variance test for the exponential distribution. Technometrics 14, 355–370 (1972)
Siegel, S., Castellan, Jr. N.J.: Non-Parametric Statistics for the Behavioral Sciences, 2nd edn. McGraw-Hill (1988)
Sindre, G., Opdahl, A.: Eliciting security requirements with misuse cases. Requir. Eng. J. 10, 34–44 (2005)
Sindre, G.: Mal-activity diagrams for capturing attacks on business processes. In: Proceedings of the 13th International Working Conference on Requirements Engineering: Foundation for Software Quality, pp. 355–366 (2007)
Sindre, G., Opdahl, A.L., Breivik, G.F.: Generalization/specialization as a structuring mechanism for misuse cases. In: Proceedings of the 2nd Symposium on Requirements Engineering for Information Security (SREIS’02), pp. 1–16 (2002)
Sjoberg, D.I., Anda, B., Arisholm, E., Dyba, T., Jorgensen, M., Karahasanovic, A., Vokác, M.: Conducting realistic experiments in software engineering. In: Proceedings of the 2002 International Symposium on Empirical Software Engineering (ISESE’02), Washington DC, 2002. IEEE Computer Society
Svahnberg, M., Aurum, A., Wohlin, C.: Using students as subjects–an empirical evaluation. In: ESEM, pp. 288–290 (2008)
van Lamsweerde, A.: Elaborating security requirements by construction of intentional anti-models. In: Proceedings of the 26th International Conference on Software Engineering, IEEE Computer Society ICSE ‘04, Washington, DC, USA, pp. 148–157 (2004)
Wohlin, C., Runeson, P., Host, M., Ohlsson, M.C., Regnell, B., Wesslen, A.: Experimentation in Software Engineering–An Introduction. Kluwer, Alphen aan den Rijn (2000)
Acknowledgements
We would like to thank all the software engineering professionals and students who took part in this experiment.
Additional information
Communicated by: Dr Timothy Lethbridge.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices
Appendix A: Experimental artifacts
Appendix B: Use case and statechart diagrams notations
Figure 9 above provides an example use case diagram for illustrative purposes. Each notational element presented in Fig. 9 is explained in Table 18. Figure 10 above provides an example statechart diagram, representing the state-dependent behavior of a light bulb, for illustrative purposes. Each notational element presented in Fig. 10 is explained in Table 19.
Appendix C: Questionnaires used
Appendix D: Shapiro–Wilk tests results and normality plots
The data sets were tested for normality using the Shapiro–Wilk test [54]. The results of applying the Shapiro–Wilk test to all 32 data sets are presented in Table 20. They indicate that more than half of the data sets do not conform to a normal distribution. Normal probability plots, which provide a more reliable interpretation of the results, are displayed below.
Below are the normality plots, rendered as histograms, for all 32 data sets used in this experiment (see Table 21). Source files for the normality plots, both as histograms and as scatter diagrams, are available for download at [13].
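The normality-checking procedure described above can be sketched in a few lines with SciPy's implementation of the Shapiro–Wilk test. This is an illustrative sketch only: the data sets below are synthetic stand-ins (the names and values are hypothetical, not the experiment's actual 32 data sets), and the 0.05 significance level is an assumed convention.

```python
# Hedged sketch: apply the Shapiro-Wilk test to each data set and flag
# those for which normality is rejected. The data here are synthetic
# placeholders, not the experiment's real measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
datasets = {
    "students_correctness": rng.normal(70, 10, size=40),  # roughly normal
    "professionals_time": rng.exponential(5.0, size=40),  # clearly skewed
}

alpha = 0.05  # assumed significance level
results = {}
for name, sample in datasets.items():
    w, p = stats.shapiro(sample)  # returns the W statistic and p-value
    results[name] = (w, p)
    verdict = "normal (fail to reject)" if p > alpha else "non-normal (reject)"
    print(f"{name}: W={w:.3f}, p={p:.3f} -> {verdict}")
```

Because the Shapiro–Wilk test loses power on small samples and can over-reject on large ones, pairing the p-values with normal probability plots, as done in this appendix, gives a more reliable overall judgment.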
About this article
Cite this article
El-Attar, M. A comparative study of students and professionals in syntactical model comprehension experiments. Softw Syst Model 18, 3283–3329 (2019). https://doi.org/10.1007/s10270-019-00720-5