Post-Christmas Thoughts on Privacy and AI
The end of the year always makes me more reflective, and this one is no exception. After a good Christmas night with my family, I found myself thinking again about privacy, Linux, and the path I chose this year. I went deep into that world, sometimes to the point of obsession, and for a while I wondered whether all the friction was worth it. After some honest reflection, I think it was.
One of my recurring insecurities is whether the older version of me, looking back from the final stretch of life, would think: “Was all this effort to protect my personal data actually worth it?” Right now, I believe the answer would be yes. Even if the gains are not always visible, there is something valuable in refusing to become completely passive about your own information.
What came to mind after dinner was simple: if people were not so careless with their own data, generative AI would probably not feel so overwhelming. It would still exist, of course, but it would not be fed so easily, scaled so effortlessly, or accepted so casually. Part of what makes AI feel enormous is not only the technology itself but also the culture around information. We hand over too much, too often, and with too little resistance.
I am not especially interested in AGI mythology or apocalyptic fantasies. My concern is more ordinary, and for that reason more immediate: when technological capability advances at the same time that people normalize surrendering their data, the result is a labor market that can be reorganized faster than workers can adapt to it. Jobs disappear first. New ones may come later, but not always for the same people, and not without damage in between.
That is why privacy matters to me beyond the usual moral language about rights and surveillance. Privacy is also about bargaining power. It is about setting limits on how much of human life can be captured, modeled, automated, and sold back to us. A society that does not value restraint in how information is handled should not be surprised when automation arrives as a threat instead of a tool.
Maybe that is the part I am trying to hold on to. Not purity, not paranoia, and not the fantasy of disappearing from the modern world, but a discipline of refusal. A willingness to say that not everything needs to be collected, indexed, trained on, optimized, or made frictionless.
If that discipline becomes more common, perhaps AI will still grow, but in a way that feels less extractive and less frightening. If it does not, then the unease people feel today is probably only the beginning.