Behind the firewall: Rebecca Balebako

By Jacob Høedt Larsen
October 10, 2024

“I didn’t initially plan to pursue data protection, but I found my way into it during my career. I started as a software engineer, building web applications and working on projects related to climate change. However, I wanted to focus more on the societal impact of technology, so I decided to pursue a PhD at Carnegie Mellon, where I studied usable privacy. My interest in data protection grew from learning how to create privacy policies based on solid research and understanding the real risks of data sharing.

A turning point for me was during a project in Ghana, where we developed a tool for farmers to check market prices via text. This made me question the trade-offs between usability and privacy. Should passwords be visible to make things easier, even in a public space? That experience sparked my deeper interest in balancing security, usability, and privacy.

For me, privacy isn’t just about protecting personal data—it’s about protecting vulnerable populations and ensuring fairness. Privacy is essential to living a “good life,” offering people the choice to control their data. Moving forward, I’m particularly interested in AI and how it intersects with privacy. There’s a lot of hype around AI, and people don’t fully understand its implications. That leaves room to improve how we ensure both privacy and ethical AI use.

Different organizations need different approaches to privacy. A small startup might focus on making privacy a core value and communicating it clearly to employees and customers. Larger companies need to build the right infrastructure and hire privacy experts. It’s not just about complying with regulations like GDPR—it’s about thinking beyond compliance and focusing on what’s best for users and society.

I’m a big believer in test-driven development, particularly with privacy. Red teaming, where we actively try to break systems, is a great way to spot risks early. Brainstorming potential user scenarios and pushing the limits of systems helps ensure better, safer technologies. However, we need more diversity in the tech and privacy sectors to truly represent the lived experiences of all users. Empathy and compassion must guide how we design systems to ensure they work for everyone, not just a few.”

Rebecca Balebako is a privacy engineer who has worked with the RAND Corporation and Google. She now runs her own business, Balebako Privacy Engineer, in Switzerland.