The future of Privacy Inc.
January 25, 2023
I'm somewhat pragmatic about how I view the privacy market, even though having a more hyperbolic view would be better for business. It's somewhat telling that, almost a decade after the Snowden revelations, consumers still find themselves having to ask the age-old question of "why should I give up Google when it makes my life so much easier?" This question is so front-of-mind that it is almost immediately posed on podcasts where the subject is a prominent privacy figure like Brendan Eich on Acquired or Andy Yen on Big Technology.
When Alex Kantrowitz posed the question to Andy Yen, CEO and founder of Proton, on Big Technology, the conversation was slightly more confrontational, probing the true value of privacy against practical ecosystems like Google's. Andy's response was similar to Brendan's—if general mistrust of Google is not a factor, then privacy is about power, and about building a better internet and thus a better world, by taking power away from large corporations that are reluctant to share it.
These sorts of legendary arguments—legendary in the literal sense of speaking to the legend we aim to leave behind—are wonderful, I believe, in the eyes of true privacy crusaders like myself, and would no doubt appease the likes of the authors of the United States Constitution. But to the everyday layperson? There is little chance, in my experience, that they will leave a mark beyond a spontaneous "oh, that's an interesting way to look at it."
As someone who has been developing a security-focused note-taking app for the better part of a decade, the question of why privacy matters hits all too close to home, day in and day out. And as someone who truly does internalize that privacy is not just about how it benefits me but how it benefits society at large, I have to say this dosage does not hit quite as hard as the competing product's; I've built up a tolerance to this argument. I have no problem—and indeed often must—putting myself in the mind of a consumer, who has every right to say: "Google's products are some of the best on the market, and yet I should avoid using them so I can build a better world for the inhabitants of tomorrow?" For the resolved among us, sure, but for everyone else, this privacy diet just wouldn't be one that sticks.
And so the only real chance any privacy product has of sticking with consumers is to be not merely equal to but better than the non-private alternative. In which case, privacy is no longer the most important selling point, but a bullet point amongst bullet points.
But here's the catch: I believe, at large, that non-privacy-preserving products will perpetually be better than privacy-preserving ones. Apart from the fact that private, encrypted products are significantly more difficult to develop, the primary constraint is the foundational client-server computing model, in which non-privacy-oriented clients can opaquely access and transmit unencumbered data to a remote server that exposes a single interface but may in fact be, and has every right to be, a cluster of supercomputers. Even if that cluster of supercomputers can tomorrow fit in the palm of your hand by shrinking a thousandfold, the remote endpoints of tomorrow can likewise scale a thousandfold and remain perpetually ahead.
With the advent of practical AI technologies today, the gap between server-distrusting, privacy-focused products and products that have no shame in sharing data with the server to maximize yield will only widen. I'm not certain whether the present-day practice of embedding machine learning models like OpenAI's into every productivity application is a fad or here to stay, but assuming it becomes requisite to user expectations in the future, privacy-preserving products that cannot reveal user contents to the server do not stand a chance of competing. It may very well be that AI is privacy's kiss of death. And again, even assuming that models can one day run client-side, better models will always be available where consumer-grade equipment is not a constraint.
One possible chance for Privacy Inc. is home-side server computing, along the lines of a Synology NAS. There is no reason consumers could not purchase a HomePod-sized personal home server that is always on and offers powerful enough computing capabilities to allow deep computation on personal data. Of course, even a home server the size of a refrigerator still pales in comparison to the unrestrained potential of the client-server model, but perhaps there is a point beyond which larger and larger computational devices offer only diminishing gains over what is practically useful to a consumer. Another fighting chance for privacy products is peer-to-peer computation spread amongst a user's devices, but this does not differ much from the home-side server model overall.
Some products which advertise end-to-end encryption today have found the LLM rush simply too irresistible. What was previously fully private is now "only the text you highlight will be sent to OpenAI"—pragmatic, but a slippery slope.
While privacy as a market seems perpetually precarious in the minds of consumers, security, on the other hand, is the cold, hard, indisputable product whose value privacy companies wish they could so easily delineate. While privacy is sometimes hard to explain, security is self-explanatory. Why do you need privacy in a product? "Power dynamics" seems to be the best answer. Why do you need security in a product? "To protect what's valuable to you" is one of many great, easily forthcoming answers.
Few would stand to benefit as much as a company like Standard Notes if my predictions about future consumer sentiment towards privacy turn out to be incorrect. And don't get me wrong—the privacy market is still sizable; Proton boasts 70 million users, having almost doubled in recent years. My theory is only that privacy-preserving products will perpetually lag behind non-private alternatives in sheer, raw power, and if that power is ultimately more immediately practical and self-serving to users than designing for the betterment of some future society, then the fight for privacy is not so much a fight towards something as it is privacy's perpetual state.
- Privacy is important in building free societies. The theoretical arguments for it are plentiful and profound.
- On a large scale, consumers gravitate towards products that solve an immediate need, rather than ones that build towards a better tomorrow for other people.
- If the present boom of practical AI tools is not a fad and these tools become truly indispensable for productivity- and communication-oriented products, privacy-preserving products do not stand a chance of competing, even if these models can be run client-side, because server-side models will always be more powerful.
- Home-side server computing (Synology NAS and self-hosting) is a viable model, but one that has not made it into the mainstream, in part for lack of a concerted effort by big technology companies, which have no interest in subverting the lucrative and convenient cloud computing model.
- Privacy can be a hard sell, but security in many cases encapsulates privacy by default, and doesn't rest on capricious arguments geared towards conscientious consumers.
There is one optimistic vision for privacy, and it is not altogether unlikely: AI's practical use case turns out to be more limited than we thought, and models that can run on a phone are plenty sufficient for what we want, like categorizing photos and suggesting text. If that is the case, privacy-preserving products stand to be a +1 on their non-private counterparts, if executed correctly. This is Privacy Inc.'s current direction, and it seems very viable and very promising. Privacy-preserving products today have become truly excellent. If this trend continues, it may be that non-private products have difficulty competing with private alternatives. What a truly wonderful world that would be.