Abstract
This study examines how human experts detect junk code, a well-known tool for evasion attacks. Through an online experiment with industry and academic practitioners, we analyze how humans detect evasive samples, ranging from simple to complex tactics. We show that experience in software development affects participants' ability to identify junk code correctly. Surprisingly, experience in cybersecurity contributes little to this task, even though detecting malicious code is generally viewed as a cybersecurity skill. We also show that extended time slots do not aid in detecting junk code: most of this malicious code is identified early, and additional time yields little improvement. We hope this study highlights the need for improved training of future experts to enhance malware detection in human-computer defense systems. Moreover, since we show that skilled experts detect junk code relatively quickly, we envision effective future detection systems based on human-computer interaction.
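To make the object of study concrete, a minimal sketch of what junk code can look like is given below. This toy example (not taken from the study's materials; the function names and junk patterns are illustrative assumptions) shows a simple checksum alongside a semantically identical variant padded with dead code and an opaque predicate, the kind of no-op logic an evasion attack inserts to obscure a program's real behavior.

```python
def checksum(data: bytes) -> int:
    """Compute a simple additive checksum over a byte string."""
    total = 0
    for b in data:
        total = (total + b) % 256
    return total


def checksum_with_junk(data: bytes) -> int:
    """Same checksum, padded with junk code that never affects the result."""
    total = 0
    decoy = 0x5A                 # junk: dead variable, never read for the result
    for b in data:
        decoy ^= b               # junk: this XOR result is ultimately discarded
        decoy = (decoy << 1) & 0xFF
        total = (total + b) % 256
    if decoy == -1:              # junk: opaque predicate, can never be true
        total = 0                # junk: unreachable branch
    return total
```

Both functions return identical values for every input; a reviewer must recognize that the decoy computations and the unreachable branch contribute nothing, which is exactly the detection task the experiment probes.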