AI, Surveillance and Privacy

I’ve been thinking a lot about how fast AI surveillance is evolving: facial recognition, emotion detection, predictive policing… it’s all moving so quickly.

Governments and big tech companies say it’s for our safety or to make life more convenient, but honestly, I’m starting to feel like we’re giving up way more than we realize.

If AI can track where we go, what we do, even how we feel—where’s the line?

Are we gradually trading our privacy for convenience without fully understanding the consequences? Or is this just the new normal in a digital world?

Would love to hear how others are thinking about this.

  • You’ve raised an important point, and I really appreciate the depth of your concern. Since I’ve been working in many of the areas you mentioned for several years, I can definitely relate, and you’re absolutely right to question the balance between safety and convenience versus privacy and autonomy. Let me answer your question in three areas:

    1. On Privacy and Regulation
    In many countries, there are local regulations in place to manage the privacy aspects of facial recognition and other AI-driven video analytics. While concerns are valid, it’s worth noting that:

    Governments often manage databases containing “allowed,” “blocked,” or “wanted” individuals.

    These systems existed even before the rise of AI and were used by police, border control, airports, and ports to reduce crime and improve security.

    AI has simply accelerated and enhanced the efficiency of these existing systems.

    But as you rightly pointed out, it’s not just about what AI can do; it’s about who is using it, how it’s being used, and what safeguards exist.

    2. On Data Control
    Another crucial dimension is data ownership and control:

    Who owns the data?

    Who gets to analyze it?

    Who profits from it?

    And who has the authority to make decisions based on it?

    These are ongoing ethical and legal challenges that deserve close scrutiny.

    3. Beyond Security: Broader Applications
    It’s also important to recognize that AI surveillance technologies have applications beyond security. Let me give a few examples:

    In retail and operations, video analytics can help determine which products customers are engaging with—not necessarily to monitor individuals, but to optimize layout and service.

    In the healthcare sector, especially during the pandemic, thermal cameras played a vital role in detecting patients who might need urgent medical attention or were showing symptoms.

    I worked on a project where behavior analysis helped detect a fire in a car inside a large mall early enough to stop the situation before it escalated into a major crisis.


    I hope this answered your question.

  • Thanks for laying out such a thoughtful perspective. It’s clear you’ve got a lot of experience to draw from, and I’m glad to dig into this with you. You’ve hit on some critical tensions here: safety versus privacy, efficiency versus autonomy, and the broader implications of who’s holding the reins on these systems.
    1. Privacy Over Efficiency: While AI enhances security systems, the loss of personal privacy often outweighs the benefits, as individuals have little control over how their data is collected or used, even with regulations in place.
    2. Regulation Gaps: Local regulations may exist, but they’re inconsistent globally and frequently fail to keep pace with AI advancements, leaving significant loopholes for misuse by governments or corporations.
    3. Data Ownership Ambiguity: The question of who owns and controls data remains unresolved, with individuals rarely having a say, while powerful entities exploit this lack of clarity for profit or surveillance without consent.
    4. Mission Creep Risk: Broader applications like retail analytics or healthcare monitoring sound beneficial, but they normalize surveillance creep, where systems built for one purpose (e.g., safety) get repurposed for invasive tracking or profiling.
    5. Accountability Weakness: Even with safeguards, the “who” and “how” of AI use often lack transparency, and those in control face little accountability, undermining trust in the systems regardless of their intended benefits.
  • Hello Athul:

    Government surveillance has always overridden personal privacy!

    AI just makes it easier to process the data.

    The only thing a person can do is to reduce the number of crumbs you leave behind on your trail.

    Peter Brooks

    Palm Bay FL

  • Hello Peter,

    Fair point, but I wouldn't say surveillance always overrides privacy; there’s been real pushback and reform, especially post-Snowden.


    AI speeds things up, sure, but it also raises awareness and demand for better tools.


    It’s not just about leaving fewer crumbs; it’s about reshaping the system too.

  • Hello Athul:

    I am sorry to burst your bubble but the situation is worse than you think.

    As an example, the other day my e-mail provider advised me of a new service one can buy - for a price, they can find any message that one received or sent and DELETED three or four years ago.

    So DELETE doesn't mean it has disappeared.

    Peter Brooks

    Palm Bay FL  

  • So DELETE doesn't mean it has disappeared.

    It never did. Since the advent of magnetic storage it's been challenging. Never mind things only being moved to a "waste basket" when you ask for them to be deleted - most underlying storage systems don't actually delete even when you "empty the waste basket"; they merely return the storage area to an "available to be re-used" list. Even if you physically overwrite the area (e.g. by reformatting), faint traces of the original magnetic patterns remain; although they aren't noticeable using normal read/write heads, they can be detected using the right specialist equipment.

    I remember, many years ago, visiting a defence research site and seeing piles of bare hard disc drives on top of filing cabinets. When I asked why they needed so many spares, I was told they weren't spares but faulty old ones - and as they might contain sensitive data and there was no way of being 100% sure it was unreadable, they had to stay on site.

        - Andy.
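    The point above, that "delete" usually just means "mark the space as reusable", can be sketched in a few lines. The `shred` helper below is hypothetical and for illustration only: it assumes a traditional filesystem on a spinning disk, where overwriting a file in place actually touches the same blocks. On SSDs and copy-on-write filesystems, wear levelling and snapshots mean the old blocks may survive anyway - which is why those drives had to stay on site.

```python
import os
import secrets
import tempfile

def shred(path: str, passes: int = 3) -> None:
    """Best-effort secure delete: overwrite the file's bytes in place,
    force the writes to the device, then remove the directory entry.
    A plain os.remove() would only do the last step, leaving the data
    blocks intact until the filesystem happens to reuse them."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random overwrite pass
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)  # only now drop the name from the directory

# Demo: create a file with "sensitive" content, then shred it.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"sensitive message")
shred(path)
print(os.path.exists(path))  # False
```

    Even this gives no hard guarantee on modern storage, which is why physical destruction remains the only fully trusted option for sensitive media.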

  • As they might contain sensitive data and there was no way of being 100% sure it was unreadable, they had to stay on site.

    Now solved by machines that bear a striking resemblance to a wood chipper and convert sensitive hardware like hard disks into sub-centimetre flakes... Very noisy compared to paper shredding, though.

    https://www.ironmountain.com/en-gb/services/secure-shredding

    for example.

    Hello Athul:

    I'll give you an example of products that were developed for military and police use.

    I know of one company that developed a suitcase-sized system that could spoof local telecom towers in order to intercept phone conversations.

    Peter Brooks

    Palm Bay FL 

  •  

    • Deletion Can Be Effective

      • In systems designed with strong privacy principles (e.g., GDPR-compliant or zero-knowledge architectures), data marked for deletion can be permanently erased from both active storage and backups, depending on policy and configuration.

    • Recoverability Often Depends on Backup Policies

      • The ability to retrieve “deleted” emails typically stems from backup archives retained for business continuity, legal obligations, or user-side recovery, not from live storage. These backups are usually not indexed or actively accessed unless required.

    • Privacy-Respecting Platforms Offer Stronger Guarantees

      • Services like ProtonMail, Tutanota, Signal, and Matrix provide end-to-end encryption and do not retain retrievable data after deletion.

      • Without access to encryption keys, even service providers cannot recover deleted content.

    • Transparency and Data Governance Are Key

      • The core issue is not that “delete doesn’t work,” but that many services lack transparency in how data is handled post-deletion.
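    The "without access to encryption keys" point is the basis of what is often called crypto-shredding: if data is stored only in encrypted form, destroying the key renders every copy, including backups, unrecoverable, with no need to find and wipe each one. A toy sketch of the idea, using a one-time pad built from Python's `secrets` module (illustration only, not a production cipher):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a one-time pad: a fresh random key as long as
    the message, XORed byte by byte. A toy stand-in for a real
    cipher, used here only to illustrate crypto-shredding."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the same key recovers the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"deleted e-mail body"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message

# "Crypto-shredding": discard the key, and the stored ciphertext
# (and every backup copy of it) becomes unrecoverable noise.
key = None
```

    This is why key management, not just storage policy, determines whether "deleted" really means gone on an encrypted platform.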