A comprehensive usability study of 3D printing slicer software: Integrating SUS, USE questionnaire, and key UX dimensions

Authors

  • Astrid Wahyu Adventri Wibowo, Institute of Informatics, University of Szeged, Szeged, Hungary; Department of Industrial Engineering, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Ismianti Ismianti, Department of Industrial Engineering, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Rochmat Husaini, Department of Informatics, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Hasan Mastrisiswadi, Department of Industrial Engineering, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Puji Handayani Kasih, Department of Industrial Engineering, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Keny Rahmawati, Department of Business Administration, Universitas Pembangunan Nasional Veteran Yogyakarta, Yogyakarta, Indonesia
  • Sarah Iftin Atsani, Mechanical Engineering Department, King Fahd University of Petroleum & Minerals, Dhahran, Saudi Arabia

DOI:

https://doi.org/10.31315/opsi.v18i2.15537

Keywords:

3D printing slicer, Usability evaluation, User experience, UX dimensions

Abstract

This study evaluates the usability of three widely used 3D printing slicer applications, Ultimaker Cura, IdeaMaker, and PrusaSlicer, at the Engineering Drawing Laboratory of UPN “Veteran” Yogyakarta. A mixed-methods approach was applied, combining the System Usability Scale (SUS) and the USE Questionnaire (which assesses Usefulness, Ease of Use, Ease of Learning, and Satisfaction) with direct observation of three UX dimensions: learnability, effectiveness, and efficiency. Nine respondents completed seven task scenarios, each repeated six times. The three applications were compared statistically using the Friedman test followed by Wilcoxon post-hoc comparisons. The results showed that Ultimaker Cura consistently achieved the highest SUS and USE scores and demonstrated significantly faster task completion times. The strong alignment between observed performance and user perception supports the validity of the blended evaluation method. The study concludes that Ultimaker Cura is the most user-friendly option for beginners and is well suited to educational environments. These findings offer guidance for software selection and teaching practices in educational laboratories, while also contributing to usability research.
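
The analysis pipeline described above (standard SUS scoring followed by a Friedman test and Wilcoxon post-hoc comparisons) can be illustrated with a short sketch. The code below is not the authors' implementation: the respondent scores are hypothetical placeholder values, and it only shows how the conventional SUS scoring formula and the two nonparametric tests fit together using Python's scipy library.

    # Minimal sketch (hypothetical data, not the study's dataset): score SUS
    # responses and run a Friedman test with Wilcoxon signed-rank post-hoc
    # comparisons across three slicer applications rated by the same respondents.
    import numpy as np
    from scipy import stats

    def sus_score(responses):
        """Convert one respondent's 10 SUS answers (1-5 Likert) to a 0-100 score."""
        responses = np.asarray(responses, dtype=float)
        odd = responses[0::2] - 1      # items 1,3,5,7,9 contribute (answer - 1)
        even = 5 - responses[1::2]     # items 2,4,6,8,10 contribute (5 - answer)
        return (odd.sum() + even.sum()) * 2.5

    # Hypothetical per-respondent SUS scores for the three slicers (n = 9).
    cura      = np.array([82.5, 90.0, 77.5, 85.0, 80.0, 87.5, 92.5, 75.0, 85.0])
    ideamaker = np.array([70.0, 72.5, 65.0, 77.5, 67.5, 75.0, 70.0, 62.5, 72.5])
    prusa     = np.array([72.5, 75.0, 70.0, 80.0, 72.5, 77.5, 75.0, 67.5, 70.0])

    # Friedman test: do the related (within-subject) ratings differ overall?
    chi2, p = stats.friedmanchisquare(cura, ideamaker, prusa)
    print(f"Friedman chi2={chi2:.2f}, p={p:.4f}")

    # Wilcoxon signed-rank post-hoc comparisons for each software pair.
    # A Bonferroni correction (alpha / 3) is one common adjustment for the
    # three pairwise tests; the paper may use a different procedure.
    pairs = {"Cura vs IdeaMaker": (cura, ideamaker),
             "Cura vs PrusaSlicer": (cura, prusa),
             "IdeaMaker vs PrusaSlicer": (ideamaker, prusa)}
    for name, (a, b) in pairs.items():
        stat, p_pair = stats.wilcoxon(a, b)
        print(f"{name}: W={stat:.1f}, p={p_pair:.4f}")

The same structure applies to the USE subscales and the observed task-completion times: compute one value per respondent per application, test the three related samples with the Friedman test, and follow up with pairwise Wilcoxon comparisons.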


Published

2025-12-30
