Avoidance of unintended image reuse - Cloaking images against AI model creation

I am a very privacy-aware person (see my privacy-related lectures), which put me in a delicate position when I decided to publish an image of myself on this very website. As you can see, I did publish the image, but I also took some precautions before doing so. None of these measures prevents the image from being processed: it has been published on this website, so it will be used. But these precautions may help in steering that processing.

Cloaking the image - against AI models

Cloaking images to defend against AI model creation can be done with Fawkes. It is a Python 3 tool that subtly alters images so that AI models trained on them learn a falsified representation. There are GUI tools for macOS and Windows, and binaries for Linux, macOS, and Windows. I used the CLI; the following worked for me.

fawkes -d imgs/ --mode low
All images in the imgs/ folder are cloaked.
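
Fawkes is also distributed via PyPI, so if the prebuilt binaries do not fit your setup, installing it with pip should work as well (this assumes a working Python 3 environment; check the project documentation for supported versions):

pip install fawkes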

There are still some open questions, though.

Low resolution

Reducing the resolution of an image permanently removes information from it. If the image is displayed small enough, the human eye will not notice the reduction, and you can still use the image.
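
One way to do this is ImageMagick's convert tool (the file names and the 200x200 target size are example values I picked, not part of my actual setup):

# example values: shrink portrait.jpg to fit within 200x200 pixels
convert imgs/portrait.jpg -resize 200x200 imgs/portrait_small.jpg

Note that -resize preserves the aspect ratio, so 200x200 is an upper bound for both dimensions rather than an exact output size.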

Do not commit it to the git repository

Do not commit the image to the git repository that is used for the Continuous Deployment: git keeps every committed version in its history, and if the repository is public, the image could be retrieved from there directly, bypassing all of the measures above.
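
A .gitignore entry can guard against committing the image by accident (assuming the imgs/ folder from the examples above; adjust the patterns to the actual image formats):

# hypothetical .gitignore entries for the images
imgs/*.jpg
imgs/*.png

With these entries in place, git add skips the matching files unless they are staged explicitly with --force.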

Setting a robots.txt entry - against widespread publishing in search engines

Setting a robots.txt entry will reduce the availability of the information in search engines and other bots. These entries have no security impact (malicious bots will simply ignore them), but popular search engines respect them. Adding the image folder to this file will therefore most likely keep the images out of search results.

User-agent: *
Disallow: /imgs/
This entry instructs web crawlers to ignore all files in the imgs/ folder.



Last update: 2024-12-14