The Surveillance Economy Rebellion


tl;dr: The digital personal protection market and the surveillance economy. A new image cloaking tool hits the market.

Among privacy advocates, the question is often posed:

When will ‘regular’ people care about privacy?

When will people recognize the tradeoffs inherent in using social media?

When will people become aware of the dangers of inviting TikTok into their home networks?

Usually, the answer is, ‘probably never.’

I think that’s too binary.

There’s gray in between: some people care, and some care enough to make adjustments.

Confusing the Machines

All of this came to mind because I learned of a new software tool, appropriately named ‘Fawkes,’ that will take any photo and make tiny, pixel-level changes to its composition.

To the human eye, the picture looks the same.

To a computer, however, particularly one running machine-learning-based facial recognition, it’s like putting a person in a room full of mirrors.

It gets really confusing, really fast.

I don’t understand all of the tricks of the trade involved in making this software work. I just find its existence to be fascinating.
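For the curious, here is a toy sketch (in Python, and emphatically not the real Fawkes code) of the “invisible to humans” half of the idea: the changes are capped at a few intensity levels per pixel, far below what an eye can notice. The real tool optimizes those small changes so that a face-recognition model’s feature embedding drifts toward a different identity; the random perturbation and the file names below are placeholder assumptions for illustration.

```python
# Toy sketch, NOT the actual Fawkes algorithm: it only shows how small a
# "cloaking" perturbation is. Fawkes itself optimizes the perturbation so a
# face-recognition model's feature embedding shifts toward another identity;
# the random noise and file names here are placeholder assumptions.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("portrait.jpg"), dtype=np.int16)  # hypothetical photo

budget = 3  # max change per color channel on a 0-255 scale -- invisible to the eye
noise = np.random.randint(-budget, budget + 1, size=img.shape)
cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)

print("largest per-pixel change:", np.abs(cloaked.astype(np.int16) - img).max())
Image.fromarray(cloaked).save("portrait_cloaked.png")  # PNG keeps the tiny changes intact
```

Saving as PNG matters here: lossy JPEG compression could wash out changes this small.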

Just like the advent of ‘face masking,’ which I highlighted in a keynote to Microsoft’s customers last year, this is proof of innovation.

There was no need for digital ‘cloaking’ a few years ago. Now, if you care about privacy, there is. Boom, innovation.

The Digital Personal Protection Market

This is just the latest advance in the overall market for people who want to resist the incursion, or should I say ‘further incursion,’ of the surveillance economy into their lives.

We’ve touched on many of these already: things like secure email (e.g., Proton Mail) and secure VPN (e.g., ProtonVPN for a centralized option, or Orchid and Mysterium for decentralized versions that are in beta).

And it involves best practices, such as two-factor authentication and more.
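Two-factor codes are less mysterious than they look. Here is a minimal sketch of the standard TOTP scheme (RFC 6238) that authenticator apps use: your phone and the website share a secret, and both hash the current 30-second time window, so a matching code proves you hold the phone without the secret ever being sent. The secret below is a made-up example, not a real account.

```python
# Minimal TOTP (RFC 6238) sketch: how authenticator apps derive the 6-digit code.
# The base32 secret below is a made-up example, not a real account secret.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period                 # current 30-second window
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()   # HMAC-SHA1 per the RFC
    offset = digest[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # prints a 6-digit code that rotates every 30 seconds
```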

Today, just being aware of the work involved in digital personal protection is challenging. Implementing the safeguards is a challenge as well.

I am frequently in the position where I have to go to another room to get my phone for the two-factor code to log in. I have to reinitiate LastPass after a timeout. Sometimes my machine is slower because of a VPN.

It can be annoying and there are times when I am tempted to “forget the whole thing.”

But every time I’m tempted, I think about the alarm system in our house, or about insurance.

You put them in place to protect against the edge cases that carry huge downside costs.

Already twice this week, we’ve received credit card fraud alerts. As more and more of our lives go online, I expect that digital crime will continue to grow.

It might seem like “protecting my face from an AI recognition program” is not as important as “protecting my credit card from thieves.” And, today, that’s probably true. It may be true forever.

That doesn’t mean it’s not worth thinking about beefing up personal digital security, because we don’t know how this data and these images will be used against us in the future.

Like insurance, better to get it before you need it.

For now, running Fawkes on every image you have is not going to work.

One image (below) took 3 minutes to process on my Mac. No faces here; I just wanted to see what would happen.
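If you do want to try it on a folder of photos, a rough timing harness might look like the sketch below. The `fawkes` command and its `-d`/`--mode` flags reflect my reading of the project’s README for the pip-installed CLI; treat them as assumptions and check the current docs before relying on them.

```python
# Rough timing harness for batch cloaking. The "fawkes" CLI and the -d/--mode
# flags follow the project's README as I understood it; verify against the
# current Fawkes documentation, since the interface may change.
import subprocess
import time
from pathlib import Path

image_dir = Path("./photos")                 # hypothetical folder of images
images = sorted(image_dir.glob("*.jpg"))

start = time.time()
subprocess.run(["fawkes", "-d", str(image_dir), "--mode", "low"], check=True)
elapsed = time.time() - start

per_image = elapsed / max(len(images), 1)
print(f"Cloaked {len(images)} images in {elapsed:.0f}s (~{per_image:.0f}s each)")
```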

But the software will get better and those who care will start to use it.

Maybe one gap of the future will be between the digital security “haves” and “have nots”?

Fawkes Experiment

This is the original

This is the “cloaked” image that machine learning algorithms struggle with.

For more, see the Fawkes documentation from the University of Chicago.
