In an era where digital threats are ever-evolving, the need for advanced education and research in cybersecurity, trust, and safety is paramount. Cornell Tech’s new Security, Trust, and Safety (SETS) Initiative, a cutting-edge program, aims to address these challenges head-on. The initiative’s director, Google alum Alexios Mantzarlis, brings a wealth of experience and a clear vision to this critical endeavor.

We spoke with Mantzarlis to hear directly about what drew him to this field, why Cornell Tech is the right institution for this type of initiative, and what he hopes SETS will achieve.

Can you tell us a bit about what led you to the fields of cybersecurity, safety, and trust?

I fell into this field by accident. After I launched a fact-checking startup in Italy, I became passionate about access to information and a fact-based public discourse built on transparency and mutual understanding.

I then moved to the United States to start the International Fact-Checking Network, an industry coalition of journalists combating misinformation. It was in that position that I realized the fundamental role our digital spaces play in setting the stage for a healthy information ecosystem. I saw what governments could do as one of the experts in the European Union’s High Level Group on online disinformation, advising the European Commission on scoping the phenomenon of fake news, and what platforms could do by helping Facebook launch its Third Party Fact-Checking Program.

Both of these experiences ultimately led me to join Google, where as Principal of Trust & Safety Intelligence I was the lead analyst for misinformation and generative AI, working on information quality policies across Google products like Search and Gemini.

What makes Cornell Tech uniquely positioned to lead in the security, trust, and safety fields?

Its people. Cornell Tech has an extraordinary set of experts across the privacy, cybersecurity, and digital safety fields – and they care deeply about the impact of technology on society. This shows in their research and teaching, and also in some of the projects that have emerged from this campus, such as the Clinic to End Tech Abuse, the Digital Life Initiative, and the Public Interest Tech Initiative.

And if all that weren’t enough, Cornell Tech’s faculty and students are an integral part of a larger ecosystem of researchers and educators – including equally stellar colleagues in Ithaca, N.Y., and Haifa, Israel – that allows for collaboration and knowledge-sharing across disciplines, geographies, and lived experiences.

How do you envision the SETS Initiative impacting the broader field of cybersecurity and technology safety and trust?

A central goal for SETS is to leverage the collective expertise at Cornell Tech to build a new approach that addresses problems across systems security, data privacy, and trust and safety more holistically. The latter field has boomed over the past decade and now employs approximately 100,000 people worldwide, yet it has received less systematic academic attention. Such inquiry is often disconnected from security and privacy research, and SETS will be a place where we can tackle problems across all three areas in a closely integrated way.

What types of research projects or areas of study will SETS focus on initially?

One of the first things we’ll do is use the convening power of Cornell Tech to bring practitioners and researchers together on specific risk areas. To start, we’ll focus on the scourge of synthetic non-consensual intimate imagery, which is easier than ever to generate and proliferate, and on data center security.

Building on those initial efforts, topics that SETS might address include stalkerware, doxxing, and other forms of harassment that exploit cybersecurity or privacy vulnerabilities; anti-abuse features for end-to-end encrypted messaging; the privacy consequences of new adtech developments; and LLM safety.

How important is interdisciplinary collaboration in the work you plan to do at SETS, and which industry or civic partners do you see Cornell Tech working with?

SETS aims to be interdisciplinary by default – leveraging experts in computer security, digital safety, policy, ethics, law and beyond – to assess the harms it studies. This is one of the things Cornell Tech is uniquely qualified to do as an institution that is inherently interdisciplinary and blends fields in its teaching, research, and programmatic offerings.

We aim to advance work on these issues, which are strongly aligned with the public and societal interest, alongside partners who seek to inform and spur positive change. On something as complicated as synthetic non-consensual intimate imagery, for example, that might mean a group as diverse as public school educators, platform representatives, and victims. Ultimately, SETS will have succeeded if we have activated communities of practitioners fighting common harms together.

What do you see as the biggest challenges currently facing the technology industry when it comes to online safety and trust, especially with developments like artificial intelligence?

New technologies can end up harming vulnerable populations first and foremost. We’ve seen that with “nudifiers” – AI tools specifically designed to remove clothing from photos and create fake nude images – making their way insidiously onto school grounds. Generative AI has been used as a kind of productivity tool by bad actors seeking to conduct influence operations and scam individuals and companies out of valuable information and money. Beyond that, we have yet to see the full impact of prompt injections, data poisoning, and interactive one-to-one social engineering. But those are coming.

Why is it important for academia and industry to work together to improve security and trust in technology and artificial intelligence?

Academia has the expertise, frameworks, and independence to study security and trust challenges with the public interest front and center. Industry has the infrastructure, resources, and moral obligation to engender systemic impact on our online infrastructures. Interaction between these two sectors is essential. Collaborative efforts can accelerate the development of robust security measures, foster innovation, and ensure that advancements in AI are both ethical and beneficial to society.

How will SETS involve and educate students, the broader Cornell Tech community, and even the general public about security and trust issues?

Cornell Tech already has a strong presence in many of these focus areas. So first and foremost, we aim to bring that work together under a unified umbrella for students who want to concentrate on these topics. Beyond that, we will bolster the curriculum with additional courses and practical modules that we will start experimenting with soon.

When it comes to the general public, we are planning to launch a newsletter that translates and contextualizes research findings, as well as convene fireside chats and other public forums for conversations with leading practitioners and researchers. But we’re only just starting, so watch this space!

What are some of the long-term goals you have for SETS over the next five years?

Our North Star is to catalyze and harness the power of Cornell Tech and to provide world-class education that equips future leaders of this field to guide technological progress with strong ethical frameworks and a deep commitment to secure systems and safe online citizenship. We aim to foster a culture of continuous learning and innovation, ensuring that our graduates and the broader community are well-prepared to tackle emerging challenges and drive positive change in the technology landscape.