AI, Privacy & Humanity - The Age of Algorithmic Intrusion
From short-lived image editing trends to unknowingly handing over sensitive data to AI chatbots, privacy concerns are at an all-time high. The twist? Most people don't even realize it.
Before you continue reading, please agree to the following terms to unlock the rest of the blog post.
By reading and clicking "I Agree", you consent to the following Privacy Policy:
Effective Date: Whenever you clicked "Agree" without reading this. Last Updated: Who knows? You didn't check anyway.
Thanks for using our app, where we help you edit your selfies, track your habits, and slowly chip away at your privacy and digital soul.
By using our services, you agree to let us:
Collect everything.
Keep it forever.
Use it however we want.
And maybe sell it to strangers.
But it's cool, right? Because filters! And AI!
1. What We Collect (a.k.a. Everything)
We may collect (and probably already have):
Your name, email, phone number, contacts, calendar events, blood type (just kidding… or are we?)
Every photo you upload — even the ones you thought you deleted
Facial data, because who doesn't love a good biometric scan?
Your exact location, even when you say no
Your mood, energy level, and what you're probably doing right now (AI is amazing!)
2. How We Use It (a.k.a. We Do What We Want)
We use your data to:
Improve your user experience
Serve you ads for things you whispered about
Train AI models to recognize your face better than you do
Share with "trusted partners" (aka strangers with spreadsheets)
Build a digital clone of you that lives forever (for performance testing, obviously)
3. Your Content Is Now... Also Ours
When you upload anything — photo, video, meme, soul fragment — you give us a forever-and-ever, royalty-free, sell-it-to-an-alien-if-we-want license to:
Edit it
Share it
Train AI on it
Meme it
Basically do anything short of tattooing it on someone's back (but give us time)
4. Data Retention (Until the Heat Death of the Universe)
We retain your data:
As long as it's "useful" (to us, not you)
Even if you delete the app
Even if you move to a forest and live off the grid
Even if you ask nicely (we'll say we're working on it)
5. Sharing Is Caring (For Our Wallets)
We may share your data with:
Advertisers
Data brokers
Governments (if they ask nicely — or not)
Anyone who says "machine learning pipeline" confidently
That one third-party SDK we never actually audited
6. Security
We take security very seriously… until it becomes expensive.
If we get breached, we'll let you know after a vague blog post and maybe a tweet — unless we forget.
7. Your Choices
You can:
Opt out of some stuff (which we'll silently re-enable later)
Request data deletion (which we'll "look into")
Delete your account (not the data though, lol)
8. Updates to This Policy
We might change this entire document tomorrow.
Will we notify you? 🤷
Will you read it? Definitely not.
Is this policy basically a dare? Yes.
9. Contact
Have questions? Too bad. But you can try your luck at: privacy@weknoweverything.com
Final Note
By clicking "I Agree," you confirm that:
You didn't read any of this
You just wanted that AI feature that makes your cat look like Batman
You trust us with your data because… aesthetics?
And now, we legally own part of your digital shadow 🎉
Did you actually read the terms before clicking I Agree? Probably not. 🤷
PS: The content above is purely satirical and meant to highlight the absurdity of many modern privacy policies. And yes, it was totally generated by ChatGPT.
Every other month, a new AI photo editing trend emerges — turning people into retro avatars, 3D toys, or fantasy creatures.
But what happens to those photos?
They're probably stored and used to train AI models. Yes, the same AI models that you're using to edit your photos.
While companies are searching for data to train their models on, people are voluntarily giving away their photos for free.
Of course, I'm not saying that every company does this. But we've seen multiple instances where companies were caught using our data to "improve their services".
While the chances of your photo appearing in an awkward thumbnail or a billboard somewhere are slim, it's definitely not 0%.
But it's not just the filters — it's where you're applying them that matters.
Many AI photo-editing sites aren't run by reputable companies. They're thin wrappers around the APIs of popular AI tools, designed to harvest your data before forwarding your request.
They might use your photos to train their own models, or sell your data to third parties.
So before you turn yourself into budget Batman, check the authenticity of the site. If it feels fishy, report it to Google Safe Browsing.
And what about counterfeit apps that secretly scrape your entire phone's storage? Good lord. Some of these apps ask for unnecessary permissions — all under the pretense of “enhancing your experience.”
They might send your data to unknown servers halfway across the world.
I hope you read the permissions before installing any app. Because once you tap “Allow,” it's not just your selfies at risk — it's your digital life.
Another common mistake people make is giving away sensitive info to AI chatbots.
There have been cases where chats containing ID numbers, bank details, and personal documents were indexed by search engines. Yes, publicly searchable.
No matter how reputed the company behind the AI tool is, giving away your personal information is always a risk. One data breach is all it takes for your data to be leaked publicly.
Also, there have been instances where people got free key codes for paid software using clever prompt obfuscation. But how did the AI chatbot get access to these key codes in the first place? Most probably, they were unintentionally leaked somewhere on the internet and scraped into the model's training data.
So it's not just about being cautious when sharing sensitive information with AI tools, like your ID number when renewing your passport. It's about being mindful of the digital footprint we leave behind, and making sure we never accidentally leak sensitive information anywhere online where it can be exploited.
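One practical habit is to scrub obvious identifiers from text before pasting it into any chatbot or web form. As a minimal sketch, the regular expressions below are illustrative assumptions only (real ID, card, and phone formats vary by country), not a complete scrubber:

```python
import re

# Illustrative patterns only -- real formats vary by country and provider.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # rough card-number shape
    "PHONE": re.compile(r"\+?\d[\d -]{8,}\d"),       # rough phone-number shape
}

def redact(text: str) -> str:
    """Replace anything that looks like a common identifier with a placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact me at jane@example.com or +1 555 010 9999"))
# Both the email and the phone number come out as placeholders.
```

A scrubber like this will never catch everything (names, addresses, and free-form secrets slip through), so it's a seatbelt, not a substitute for simply not pasting sensitive documents in the first place.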
Forget the technical jargon for a second — let's talk human.
People have mixed opinions about this. What follows is just mine, so take it with a pinch of salt.
AI can make us look flawless in a retro '90s style, or turn us into a cartoon character that barely looks like us. But what about our individuality? Are we trading it away for the sake of aesthetics?
What happens to our own identity when we constantly edit ourselves to look like someone else?
Of course, none of us are perfect. But being imperfect is what makes us human. Embracing our imperfections is what makes us unique.
By constantly chasing artificial perfection, humanity is slowly losing the essence of reality. Also, the definition of a photograph is changing from "capturing moments" to "rendering fantasy".
Should your profile picture be you... or a 1080x1080 diffusion canvas trying to look like you?
NOTE: Please don't raise your pitchforks at me for this.
If more people used AI to fix real-world imperfections — like learning new skills or solving actual problems — the world might just become a better place.
AI is a powerful tool. But it's up to us to decide whether it becomes a force for progress… or a data-snatching mirror that distorts who we are.
PS: Back to the privacy factor, before you click "Agree" again, ask yourself: Is the filter worth the footprint?