On the Combination of Static Analysis for Software Security Assessment – A Case Study of an Open-Source e-Government Project
Volume 6, Issue 2, Page No 921-932, 2021
Authors: Anh Nguyen-Duc1,a), Manh-Viet Do2, Quan Luong-Hong2, Kiem Nguyen-Khac3, Hoang Truong-Anh4
1Department of IT and Business, Business School, University of South-Eastern Norway, Notodden, 3679, Norway
2MQ ICT SOLUTIONS, Vietnam
3School of Electronics and Telecommunication, Hanoi University of Science and Technology, Hanoi, 100000, Vietnam
4VNU University of Engineering and Technology, Vietnam
a)Author to whom correspondence should be addressed. E-mail: anhnd85@gmail.com
Adv. Sci. Technol. Eng. Syst. J. 6(2), 921-932 (2021); DOI: 10.25046/aj0602105
Keywords: Software security, Vulnerability, SAST, Case study, Secured e-government
Static Application Security Testing (SAST) is a popular quality assurance technique in software engineering. However, integrating SAST tools into industry-level product development and security assessment poses various technical and managerial challenges. In this work, we report a longitudinal case study of adopting SAST as part of a human-driven security assessment for an open-source e-government project. We describe how SAST tools were selected, evaluated, and combined into a novel approach for software security assessment. The approach was preliminarily evaluated through semi-structured interviews. Our results show that (1) while some SAST tools outperform others, better performance can be achieved by combining more than one SAST tool, and (2) SAST tools should be used with realistic performance expectations and in combination with triangulated, human-driven vulnerability assessment approaches in real-world projects.
Received: 08 February 2021, Accepted: 23 March 2021, Published Online: 10 April 2021
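To make the first finding of the abstract concrete, the minimal sketch below illustrates one way findings from several SAST tools could be combined: each tool's warnings are first normalized into a common (file, line, CWE) form, the union of all warnings is taken, and warnings corroborated by more than one tool are surfaced first for human review. This is an illustration only, not the authors' implementation; the normalized finding format, tool names, and file paths are hypothetical.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# A normalized finding: (file path, line number, CWE identifier).
# This shape is an assumption for illustration; each real SAST tool
# emits its own report format, which would first have to be parsed
# into this common form.
Finding = Tuple[str, int, str]

def combine_findings(reports: Dict[str, List[Finding]]) -> List[dict]:
    """Union the findings of several SAST tools, recording which tools
    agree on each one. Findings reported by more tools are ranked
    first, as the strongest candidates for human-driven assessment."""
    agreement: Dict[Finding, set] = defaultdict(set)
    for tool, findings in reports.items():
        for finding in findings:
            agreement[finding].add(tool)

    combined = [
        {"file": path, "line": line, "cwe": cwe, "tools": sorted(tools)}
        for (path, line, cwe), tools in agreement.items()
    ]
    # Most-corroborated findings first.
    combined.sort(key=lambda w: len(w["tools"]), reverse=True)
    return combined

if __name__ == "__main__":
    # Hypothetical reports from two tools run on the same code base.
    reports = {
        "tool_a": [("auth/login.java", 42, "CWE-89"),
                   ("web/form.java", 10, "CWE-79")],
        "tool_b": [("auth/login.java", 42, "CWE-89"),
                   ("io/upload.java", 7, "CWE-434")],
    }
    for warning in combine_findings(reports):
        print(warning)
```

Taking the union maximizes recall, since a vulnerability missed by one tool may still be caught by another, at the cost of a larger triage workload; this trade-off is one reason the study pairs tool combination with a human-driven assessment step.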
- L. Yang, N. Elisa, N. Eliot, “Chapter 7—Privacy and Security Aspects of E-Government in Smart Cities”, In D. B. Rawat & K. Z. Ghafoor (Eds.), Smart Cities Cybersecurity and Privacy, 89–102, 2019
- V. Vakkuri, K. Kemell, J. Kultanen, P. Abrahamsson, “The Current State of Industrial Practice in Artificial Intelligence Ethics”, IEEE Software, 37(4), 50–57, 2020, doi: 10.1109/MS.2020.2985621
- A. N. Duc, R. Jabangwe, P. Paul, P. Abrahamsson, “Security Challenges in IoT Development: A Software Engineering Perspective”, in Proceedings of the XP2017 Scientific Workshops, XP ’17, 1–5, 2017, doi:10.1145/3120459.3120471
- M. Ammar, G. Russello, B. Crispo, “Internet of Things: A survey on the security of IoT frameworks”, Journal of Information Security and Applications, 38, 8–27, 2018, doi:10.1016/j.jisa.2017.11.002
- “ISACA’s State of Cybersecurity 2019 Survey: Retaining Qualified Cybersecurity Professionals” [Online], <https://www.isaca.org/cyber/Documents/state-of-cybersecurity_res_eng_0316.pdf>, 2016
- “New McAfee Report Estimates Global Cybercrime Losses to Exceed $1 Trillion” [Online], <https://www.businesswire.com/news/home/20201206005011/en/New-McAfee-Report-Estimates-Global-Cybercrime-Losses-to-Exceed-1-Trillion>, 2020
- “Cost of Data Breach Study: Global Analysis” [Online], <https://securityintelligence.com/cost-of-a-data-breach-2017>, 2017
- P. E. Black, M. J. Kass, H.-M. M. Koo, “Source Code Security Analysis Tool Functional Specification Version 1.0”, Technical report, 2020
- J. Zheng, L. Williams, N. Nagappan, W. Snipes, J. P. Hudepohl, M. A. Vouk, “On the value of static analysis for fault detection in software”, IEEE Transactions on Software Engineering, 32(4), 240–253, 2006, doi:10.1109/TSE.2006.38
- B. Chess, G. McGraw, “Static analysis for security”, IEEE Security & Privacy, 2(6), 76–79, 2004, doi:10.1109/MSP.2004.111
- B. Aloraini, M. Nagappan, D. M. German, S. Hayashi, Y. Higo, “An empirical study of security warnings from static application security testing tools”, Journal of Systems and Software, 158, 110427, 2019, doi: 10.1016/j.jss.2019.110427
- I. Pashchenko, “FOSS version differentiation as a benchmark for static analysis security testing tools”, Proceedings of the 2017 11th Joint Meeting on Foundations of Software Engineering, 1056–1058, 2017
- V. Okun, A. M. Delaitre, P. E. Black, “Report on the Static Analysis Tool Exposition (SATE) IV”, Technical Report, NIST, 2013
- D. Baca, B. Carlsson, K. Petersen, L. Lundberg, “Improving software security with static automated code analysis in an industry setting”, Software: Practice and Experience, 43(3), 259–279, 2013
- T. Hofer, “Evaluating static source code analysis tools”, Technical report, 2010
- V. Okun, A. M. Delaitre, P. E. Black, “NIST SAMATE: Static Analysis Tool Exposition (SATE) IV”, Technical Report, NIST, Mar 2012
- G. McGraw, B. Potter, “Software security testing”, IEEE Security & Privacy, 2(5), 81–85, 2004
- M. Felderer, M. Büchler, M. Johns, A. D. Brucker, R. Breu, A. Pretschner, “Chapter One – Security Testing: A Survey”, in A. Memon (Ed.), Advances in Computers, 101, 1–51, 2016
- M. Dowd, J. McDonald, J. Schuh, “The Art of Software Security Assessment”, Addison-Wesley, 2007
- T. D. Oyetoyan, B. Milosheska, M. Grini, D. Soares Cruzes, “Myths and Facts About Static Application Security Testing Tools: An Action Research at Telenor Digital”, In J. Garbajosa, X. Wang, & A. Aguiar (Eds.), Agile Processes in Software Engineering and Extreme Programming, 86–103, 2018
- G. Díaz, J. R. Bermejo, “Static analysis of source code security: Assessment of tools against SAMATE tests”, Information and Software Technology, 55(8), 1462–1476, 2013
- T. Charest, N. Rodgers, Y. Wu, “Comparison of static analysis tools for Java using the Juliet test suite”, in 11th International Conference on Cyber Warfare and Security, 431–438, 2016
- S. M. Ghaffarian, H. R. Shahriari, “Software Vulnerability Analysis and Discovery Using Machine-Learning and Data-Mining Techniques: A Survey,” ACM Comput. Surv., 50(4), 1-36, 2017
- “Vulnerability Database Catalog” [Online]. <https://www.first.org/global/sigs/vrdx/vdb-catalog>, 2021
- T. Field, E. Muller, E. Lau, H. Gadriot-Renard, C. Vergez, “The case for e-government – Excerpts from the OECD report: The E-Government Imperative”, OECD Journal on Budgeting, 3(1), 61–96, 2003
- Å. Grönlund, “Electronic Government: Design, Applications and Management”, IGI Global, 2002
- J. D. Twizeyimana, “User-centeredness and usability in e-government: A reflection on a case study in Rwanda”, The international conference on electronic, 2017
- J. D. Twizeyimana, A. Andersson, “The public value of E-Government – A literature review”, Government Information Quarterly, 36(2), 167–178, 2019
- Q. N. Nkohkwo, M. S. Islam, “Challenges to the successful implementation of e-government initiatives in Sub-Saharan Africa: A literature review”, Electronic Journal of e-Government, 11(2), 253–267, 2013
- F. Bélanger, J. Carter, “Trust and risk in e-government adoption”, The Journal of Strategic Information Systems, 17(2), 165–176, 2008, doi:10.1016/j.jsis.2007.12.002
- M. Alshehri, S. Drew, “E-government fundamentals”, IADIS International Conference ICT, Society and Human Beings, 2010.
- C.E. Jiménez, F. Falcone, J. Feng, H. Puyosa, A. Solanas, F. González, “e-government: Security threats”, e-Government, 11:21, 2012
- A. Nguyen Duc, A. Chirumamilla, “Identifying Security Risks of Digital Transformation – An Engineering Perspective,” in Digital Transformation for a Sustainable Society in the 21st Century, Cham, 677–688, 2019
- D. Churchill, D. Distefano, M. Luca, R. Rhee, J. Villard, “AL: A new declarative language for detecting bugs with Infer”, Facebook Code Blog Post, 2020
- S. De Simone, “Facebook’s New AL Language Aims to Simplify Static Program Analysis”, InfoQ, 2020
- C. Calcagno, D. Distefano, J. Dubreil, D. Gabi, P. Hooimeijer, M. Luca, P. O’Hearn, I. Papakonstantinou, J. Purbrick, D. Rodriguez, “Moving Fast with Software Verification”, NASA Formal Methods, Lecture Notes in Computer Science, 9058, Springer, Cham, 3–11, 2015
- A. N. Duc, A. Mockus, R. Hackbarth, and J. Palframan, “Forking and coordination in multi-platform development: a case study,” The 8th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, 1–10, 2014, doi: 10.1145/2652524.2652546
- N. Duc Anh, D. S. Cruzes, R. Conradi, and C. Ayala, “Empirical validation of human factors in predicting issue lead time in open source projects”, The 7th International Conference on Predictive Models in Software Engineering, 1-10, 2011
- “CAS static analysis tool study – methodology” [Online], <https://samate.nist.gov/docs/CAS%202012%20Static%20Analysis%20Tool%20Study%20Methodology.pdf>, 2020