Student Spotlight: Emily Tseng
February 20, 2023
Emily Tseng is a doctoral student in information science (IS) from Singapore and Nashville, Tennessee. She earned a bachelor’s degree from Princeton University, with a focus on global health and mathematical modeling for infectious disease epidemics, and now studies human-computer interaction, machine learning, and data privacy at Cornell.
What is your area of research and why is it important?
I work broadly in human-computer interaction (HCI) and social computing, borrowing approaches from machine learning, computer security and privacy, and global health. I’m interested in how we build computational tools for new systems of caregiving. In fields like medicine, health, and social work, our efforts to improve people’s lives involve gathering large amounts of data on people’s trauma or pain, analyzing it with machine learning methods that are subject to less and less human oversight, and then using that analysis to make design and policy decisions. I want us to be able to do this rigorously and with attention to core human values like privacy, agency, and equity. To me, this requires evolution in our data analysis techniques (e.g., privacy-preserving machine learning), our approaches to gathering and curating datasets (e.g., informed consent), and our technology design frameworks (e.g., participatory design). So far, I’ve worked specifically with text-based psychotherapy platforms, mobile data collection systems in home health care, and computer security and privacy for survivors of intimate partner violence.
What are the larger implications of this research?
Care is core to society, and care systems are increasingly being remade as systems for data collection and analysis—consider, for example, how much time your doctor spends checking boxes on your electronic health record. This means we can do amazing things with that data, like developing new therapeutics, forecasting epidemics, and improving care for underrepresented people—just look at the explosion in machine learning for health. But it also means we’re collecting more data about more people than ever before, and the history of research shows that this can often be extractive and harmful. My goal is to ensure we build the data-driven future of care systems responsibly, so that this future is uplifting for data subjects and for broader society.
What does it mean to you to have been awarded a Microsoft Research Ph.D. Fellowship?
I’m thrilled to have been selected—it’s an honor and a massive statement of confidence in my work. More importantly, these awards always reflect a community’s worth of effort, and in my Ph.D. I have had the privilege of extraordinary support from my advisors, my peers, and the field of IS. I’m immensely grateful to them. I also take seriously my responsibility to pay it forward—I wrote up a blog post on my approach to fellowship applications that I hope can help other junior scholars.
You received a best paper award at CHI 2022. Can you tell us about the paper?
It was a tremendous honor to be recognized by the CHI community! “Care Infrastructures for Digital Security and Privacy in Intimate Partner Violence” was a project I led out of the IPV Tech Research Group, which is spearheaded here at Cornell Tech by co-PIs Nicki Dell and Tom Ristenpart. Our research group is interested in characterizing the many ways digital technology exacerbates intimate partner violence (IPV): domestic violence, harassment, stalking, and similar abuse by a current or former spouse or significant partner. We pull that knowledge through to interventions that help victims at the Clinic to End Tech Abuse (CETA), a volunteer organization where survivors get 1:1 computer security and privacy support from consultants trained in the specific threat model of IPV. We call this approach “clinical computer security.”
Services like CETA can be a huge benefit to survivors facing targeted and persistent digital threats, but they’re often delivered as one-off tech support. In this paper, we presented an eight-month study of an approach to computer security and privacy inspired by the feminist ethic of care: think less Geek Squad and more primary care physician. We built various technical and social systems to make this work and used the approach in CETA to support 72 survivors from December 2020 to August 2021. Then, through a reflexive qualitative study, we examined how well the model worked in practice. Our paper shows we were able to help more survivors and meet increasing demand for this type of care—but to grow the service, we need to reckon with tensions like ensuring safe connections to survivors, adapting to their changing needs, establishing boundaries on consultants’ time, and assessing risks in the face of uncertainty.
Since this paper was published, we’ve been using the protocol in CETA, and as of February 2023 we’ve helped over 300 survivors. In ongoing work, we’re building on this infrastructure to continue research with survivors on the many ways IPV affects their lives and to explore how the research process can be made both data-driven and participatory—stay tuned for that.
What are your hobbies or interests outside of your research or scholarship?
In 2020 I started playing soccer in various pickup groups around NYC as a pandemic coping mechanism. Now it’s a full-blown hobby: I play several times a week and follow all sorts of leagues. Talk to me about the women’s game especially!
Why did you choose Cornell to pursue your degree?
I was attracted to the interdisciplinarity baked into the IS Ph.D. Cornell offers top-flight training in computer science—especially machine learning—and exposure to cutting-edge ML across both industry and academia. Cornell also offers top-flight training in how to think about technology’s social contexts and consequences—and will have you critically examining what the words ‘data’ and ‘technology’ even mean. It can sometimes be frustrating to operate between traditions, but I personally find it freeing to pursue a research question from any and all angles and to learn from peers seeking to do the same.