publications
publications by category in reverse chronological order, generated by jekyll-scholar.
An up-to-date list is available on Google Scholar.
conference & journal articles
2023
- Lessons in VCR Repair: Compliance of Android App Developers with the California Consumer Privacy Act (CCPA). Nikita Samarin, Shayna Kothari, Zaina Siyed, and 7 more authors. Proceedings on Privacy Enhancing Technologies (PoPETS), 2023.
The California Consumer Privacy Act (CCPA) provides California residents with a range of enhanced privacy protections and rights. Our research investigated the extent to which Android app developers comply with the provisions of the CCPA that require them to provide consumers with accurate privacy notices and respond to "verifiable consumer requests" (VCRs) by disclosing personal information that they have collected, used, or shared about consumers for a business or commercial purpose. We compared the actual network traffic of 109 apps that we believe must comply with the CCPA to the data that apps state they collect in their privacy policies and the data contained in responses to "right to know" requests that we submitted to the apps’ developers. Of the 69 app developers who substantively replied to our requests, all but one provided specific pieces of personal data (as opposed to only categorical information). However, a significant percentage of apps collected information that was not disclosed, including identifiers (55 apps, 80%), geolocation data (21 apps, 30%), and sensory data (18 apps, 26%), among other categories. We discuss improvements to the CCPA that could help app developers comply with "right to know" requests and other related regulations.
@article{samarin2023lessons,
  title = {Lessons in VCR Repair: Compliance of Android App Developers with the California Consumer Privacy Act (CCPA)},
  author = {Samarin, Nikita and Kothari, Shayna and Siyed, Zaina and Bjorkman, Oscar and Yuan, Reena and Wijesekera, Primal and Alomar, Noura and Fischer, Jordan and Hoofnagle, Chris and Egelman, Serge},
  journal = {Proceedings on Privacy Enhancing Technologies (PoPETS)},
  volume = {3},
  pages = {103--121},
  year = {2023},
  doi = {10.56553/popets-2023-0072},
}
2020
- Empirical Measurement of Systemic 2FA Usability. Joshua Reynolds, Nikita Samarin, Joseph Barnes, and 4 more authors. In USENIX Security Symposium, 2020.
Two-Factor Authentication (2FA) hardens an organization against user account compromise, but adds an extra step to organizations’ mission-critical tasks. We investigate to what extent quantitative analysis of operational logs of 2FA systems both supports and challenges recent results from user studies and surveys identifying usability challenges in 2FA systems. Using tens of millions of logs and records kept at two public universities, we quantify the at-scale impact on organizations and their employees during a mandatory 2FA implementation. We show the multiplicative effects of device remembrance, fragmented login services, and authentication timeouts on user burden. We find that user burden does not deviate far from other compliance and risk management time requirements already common to large organizations. We investigate the cause of more than one in twenty 2FA ceremonies being aborted or failing, and the variance in user experience across users. We hope our analysis will empower more organizations to protect themselves with 2FA.
@inproceedings{reynolds2020empirical,
  title = {Empirical Measurement of Systemic 2FA Usability},
  author = {Reynolds, Joshua and Samarin, Nikita and Barnes, Joseph and Judd, Taylor and Mason, Joshua and Bailey, Michael and Egelman, Serge},
  booktitle = {USENIX Security Symposium},
  pages = {127--143},
  year = {2020},
  doi = {10.5555/3489212.3489220},
}
2019
- Pilot: Password and PIN Information Leakage from Obfuscated Typing Videos. Kiran Balagani, Matteo Cardaioli, Mauro Conti, and 8 more authors. Journal of Computer Security, 2019.
This paper studies leakage of user passwords and PINs based on observations of typing feedback on screens or from projectors in the form of masked characters (∗ or ∙) that indicate keystrokes. To this end, we developed an attack called Password and PIN Information Leakage from Obfuscated Typing Videos (PILOT). Our attack extracts inter-keystroke timing information from videos of password masking characters displayed when users type their password on a computer, or their PIN at an ATM. We conducted several experiments in various attack scenarios. Results indicate that, while in some cases leakage is minor, it is quite substantial in others. By leveraging inter-keystroke timings, PILOT recovers 8-character alphanumeric passwords in as little as 19 attempts. When guessing PINs, PILOT significantly improved on both random guessing and the attack strategy adopted in our prior work (In European Symposium on Research in Computer Security (2018) 263–280, Springer). In particular, we were able to guess about 3% of the PINs within 10 attempts. This corresponds to a 26-fold improvement compared to random guessing. Our results strongly indicate that secure password masking GUIs must consider the information leakage identified in this paper.
@article{balagani2019pilot,
  title = {Pilot: Password and PIN Information Leakage from Obfuscated Typing Videos},
  author = {Balagani, Kiran and Cardaioli, Matteo and Conti, Mauro and Gasti, Paolo and Georgiev, Martin and Gurtler, Tristan and Lain, Daniele and Miller, Charissa and Molas, Kendall and Samarin, Nikita and others},
  journal = {Journal of Computer Security},
  volume = {27},
  number = {4},
  pages = {405--425},
  year = {2019},
  publisher = {IOS Press},
  doi = {10.3233/JCS-191289},
}
2018
- SILK-TV: Secret Information Leakage from Keystroke Timing Videos. Kiran S Balagani, Mauro Conti, Paolo Gasti, and 8 more authors. In European Symposium on Research in Computer Security (ESORICS), 2018.
Shoulder surfing attacks are an unfortunate consequence of entering passwords or PINs into computers, smartphones, PoS terminals, and ATMs. Such attacks generally involve observing the victim’s input device. This paper studies leakage of user secrets (passwords and PINs) based on observations of output devices (screens or projectors) that provide “helpful” feedback to users in the form of masking characters, each corresponding to a keystroke. To this end, we developed a new attack called Secret Information Leakage from Keystroke Timing Videos (SILK-TV). Our attack extracts inter-keystroke timing information from videos of password masking characters displayed when users type their password on a computer, or their PIN at an ATM or PoS. We conducted several studies in various envisaged attack scenarios. Results indicate that, while in some cases leakage is minor, it is quite substantial in others. By leveraging inter-keystroke timings, SILK-TV recovers 8-character alphanumeric passwords in as little as 19 attempts. However, when guessing PINs, SILK-TV yields no substantial speedup compared to brute force. Our results strongly indicate that secure password masking GUIs must consider the information leakage identified in this paper.
@inproceedings{balagani2018silk,
  title = {SILK-TV: Secret Information Leakage from Keystroke Timing Videos},
  author = {Balagani, Kiran S and Conti, Mauro and Gasti, Paolo and Georgiev, Martin and Gurtler, Tristan and Lain, Daniele and Miller, Charissa and Molas, Kendall and Samarin, Nikita and Saraci, Eugen and others},
  booktitle = {European Symposium on Research in Computer Security (ESORICS)},
  pages = {263--280},
  year = {2018},
  organization = {Springer},
  doi = {10.1007/978-3-319-99073-6_13},
}
refereed preprints
2023
- Understanding How Third-Party Libraries in Mobile Apps Affect Responses to Subject Access Requests. Nikita Samarin and Primal Wijesekera. Workshop on Technology and Consumer Protection (ConPro), 2023.
@article{samarinunderstanding,
  title = {Understanding How Third-Party Libraries in Mobile Apps Affect Responses to Subject Access Requests},
  author = {Samarin, Nikita and Wijesekera, Primal},
  journal = {Workshop on Technology and Consumer Protection (ConPro)},
  year = {2023},
}
2021
- Examining the Landscape of Digital Safety and Privacy Assistance for Black Communities. Nikita Samarin, Aparna Krishnan, Moses Namara, and 2 more authors. Workshop on Inclusive Privacy and Security (WIPS), 2021.
Recent events have placed a renewed focus on the issue of racial justice in the United States and other countries. One dimension of this issue that has received considerable attention is the security and privacy threats and vulnerabilities faced by communities of color. Our study focuses on community-level advocates who organize workshops, clinics, and other initiatives that inform Black communities about existing digital safety and privacy threats and ways to mitigate them. Additionally, we aim to understand the online security and privacy needs and attitudes of participants who partake in these initiatives. We hope that by understanding how advocates work in different contexts and what teaching methods are effective, we can help other digital safety experts and activists become advocates within their communities.
@article{samarin2022examining,
  title = {Examining the Landscape of Digital Safety and Privacy Assistance for Black Communities},
  author = {Samarin, Nikita and Krishnan, Aparna and Namara, Moses and Ma, Joanne and Redmiles, Elissa M},
  journal = {Workshop on Inclusive Privacy and Security (WIPS)},
  year = {2021},
}
2020
- Surveying Vulnerable Populations: A Case Study of Civil Society Organizations. Nikita Samarin, Alisa Frik, Sean Brooks, and 2 more authors. Workshop on Inclusive Privacy and Security (WIPS), 2020.
Compared to organizations in other sectors, civil society organizations (CSOs) are particularly vulnerable to security and privacy threats, as they lack adequate resources and expertise to defend themselves. At the same time, their security needs and practices have not gained much attention among researchers, and existing solutions designed for average users do not consider the contexts in which CSO employees operate. As part of our preliminary work, we conducted an anonymous online survey with 102 CSO employees to collect information about their perceived risks of different security and privacy threats and their self-reported mitigation strategies. The design of our preliminary survey accounted for the unique requirements of our target population by establishing trust with respondents, using anonymity-preserving incentive strategies, and distributing the survey with the help of a trusted intermediary. However, by carefully examining our methods and the feedback received from respondents, we uncovered several issues with our methodology, including the length of the survey, the framing of the questions, and the design of the recruitment email. We hope that the discussion presented in this paper will inform and assist researchers and practitioners working on understanding and improving the security and privacy of CSOs.
@article{samarin2020surveying,
  title = {Surveying Vulnerable Populations: A Case Study of Civil Society Organizations},
  author = {Samarin, Nikita and Frik, Alisa and Brooks, Sean and Cheshire, Coye and Egelman, Serge},
  journal = {Workshop on Inclusive Privacy and Security (WIPS)},
  year = {2020},
}
2019
- A Key to Your Heart: Biometric Authentication Based on ECG Signals. Nikita Samarin and Donald Sannella. Who Are You?! Adventures in Authentication Workshop (WAY), 2019.
In recent years, there has been a shift of interest towards the field of biometric authentication, which proves the identity of the user using their biological characteristics. We explore a novel biometric based on the electrical activity of the human heart in the form of electrocardiogram (ECG) signals. In order to explore the stability of ECG as a biometric, we collect data from 55 participants over two sessions with a period of 4 months in between. We also use a consumer-grade ECG monitor that is more affordable and usable than a medical-grade counterpart. Using a standard approach to evaluate our classifier, we obtain error rates of 2.4% for data collected within one session and 9.7% for data collected across two sessions. The experimental results suggest that ECG signals collected using a consumer-grade monitor can be successfully used for user authentication.
@article{samarin2019key,
  title = {A Key to Your Heart: Biometric Authentication Based on ECG Signals},
  author = {Samarin, Nikita and Sannella, Donald},
  journal = {Who Are You?! Adventures in Authentication Workshop (WAY)},
  year = {2019},
}
- On the Ridiculousness of Notice and Consent: Contradictions in App Privacy Policies. Ehimare Okoyomon, Nikita Samarin, Primal Wijesekera, and 6 more authors. Workshop on Technology and Consumer Protection (ConPro), 2019.
The dominant privacy framework of the information age relies on notions of “notice and consent.” That is, service providers will disclose, often through privacy policies, their data collection practices, and users can then consent to their terms. However, it is unlikely that most users comprehend these disclosures, which is due in no small part to ambiguous, deceptive, and misleading statements. By comparing actual collection and sharing practices to disclosures in privacy policies, we demonstrate the scope of the problem. Through analysis of 68,051 apps from the Google Play Store, their corresponding privacy policies, and observed data transmissions, we investigated the potential misrepresentations of apps in the Designed For Families (DFF) program, inconsistencies in disclosures regarding third-party data sharing, as well as contradictory disclosures about secure data transmissions. We find that of the 8,030 DFF apps (i.e., apps directed at children), 9.1% claim that their apps are not directed at children, while 30.6% claim to have no knowledge that the received data comes from children. In addition, we observe that 10.5% of 68,051 apps share personal identifiers with third-party service providers, yet do not declare any in their privacy policies, and only 22.2% of the apps explicitly name third parties. This ultimately makes it not only difficult, but in most cases impossible, for users to establish where their personal data is being processed. Furthermore, we find that 9,424 apps do not use TLS when transmitting personal identifiers, yet 28.4% of these apps claim to take measures to secure data transfer. Ultimately, these divergences between disclosures and actual app behaviors illustrate the ridiculousness of the notice and consent framework.
@article{okoyomon2019ridiculousness,
  title = {On the Ridiculousness of Notice and Consent: Contradictions in App Privacy Policies},
  author = {Okoyomon, Ehimare and Samarin, Nikita and Wijesekera, Primal and Elazari Bar On, Amit and Vallina-Rodriguez, Narseo and Reyes, Irwin and Feal, {\'A}lvaro and Egelman, Serge and others},
  journal = {Workshop on Technology and Consumer Protection (ConPro)},
  year = {2019},
}
2018
- Evading Classifiers in Discrete Domains with Provable Optimality Guarantees. Bogdan Kulynych, Jamie Hayes, Nikita Samarin, and 1 more author. Workshop on Security in Machine Learning at NeurIPS, 2018.
Machine-learning models for security-critical applications, such as bot, malware, or spam detection, operate in constrained discrete domains. These applications would benefit from having provable guarantees against adversarial examples. The existing literature on provable adversarial robustness of models, however, exclusively focuses on robustness to gradient-based attacks in domains such as images. These attacks model the adversarial cost, e.g., the amount of distortion applied to an image, as a p-norm. We argue that this approach is not well-suited to model adversarial costs in constrained domains where not all examples are feasible. We introduce a graphical framework that (1) generalizes existing attacks in discrete domains, (2) can accommodate complex cost functions beyond p-norms, including financial cost incurred when attacking a classifier, and (3) efficiently produces valid adversarial examples with guarantees of minimal adversarial cost. These guarantees directly translate into a notion of adversarial robustness that takes into account domain constraints and the adversary’s capabilities. We show how our framework can be used to evaluate security by crafting adversarial examples that evade a Twitter-bot detection classifier with a provably minimal number of changes, and to build privacy defenses by crafting adversarial examples that evade a privacy-invasive website-fingerprinting classifier.
@article{kulynych2018evading,
  title = {Evading Classifiers in Discrete Domains with Provable Optimality Guarantees},
  author = {Kulynych, Bogdan and Hayes, Jamie and Samarin, Nikita and Troncoso, Carmela},
  journal = {Workshop on Security in Machine Learning at NeurIPS},
  year = {2018},
}