I care about my privacy. I'm one of those folks who read the privacy policy and head straight to the settings page after signing up, double-checking what features and data the tools, apps, and websites I use are accessing.

This makes me think a lot about how we handle user data in our own products. Duo's Dug Song captured it perfectly when he outlined the key areas of security risk that startups need to consider:

"There are basically three areas that folks should start considering how to bucket those risks. The first is corporate risk in defending your users and applications they access. The second is application security and product risk. A third area is around production security and making sure that the operation of your security program is something that keeps up with that risk. And then a fourth — a new and emerging space — is trust, and not just privacy, but also safety." – Dug Song

That fourth area, trust, is what keeps me up at night. Because here's the thing: one of the hardest challenges in developing software is defining exactly what data you need to collect, then figuring out how to responsibly manage and discard it after it serves its intended purpose. Just that purpose. Nothing more.


The foundation of good data privacy is simple: say precisely what you're doing with user data, then do exactly what you say. Your privacy policy means nothing if you aren't taking concrete steps to ensure it.

This isn't just about compliance or avoiding lawsuits. It's about empathy. You're handling another person's information. It's your responsibility to do right by them, especially when they aren't looking.

The most important thing to understand? You're dealing with a user's data. Their data. Not yours.

You can use it to provide the service they opted into. That's the deal. But beyond that, you don't and shouldn't have the right to do anything without explicit permission.

I do get that some provisions exist for legitimate reasons: limiting liability, allowing for acquisitions, enabling basic promotional advertising. While no sensible lawyer would recommend eliminating these protections entirely, we need to be more thoughtful about what we're actually asking for and why. Open-ended clauses have sadly become a de facto template across terms of service and privacy policies, often creating a permission structure that lets companies stretch how they use personal data in ways users never imagined.


My conclusion is straightforward: build solutions that create real value. Charge users a fair price for your services. Eliminate the need to wade into those murky waters entirely. Exploiting user data is bad ethics, and honestly, it's bad business. You're better off taking a stand.

Here's my litmus test: If you're constantly worried about security and protecting the privacy of everyone using your product, you're probably doing it right. If you don't genuinely care about privacy, it'll never get the priority it deserves. And your users will eventually figure that out.

The trust you build by respecting user privacy isn't just the right thing to do—it's a competitive advantage in a world where people are increasingly aware of how their data is being used. Start there, and everything else becomes clearer.