Either Mastodon's link preview bot should obey robots.txt or Mastodon needs O(1) link previews: jefftk.com/p/mastodons-dubious

@jefftk Would this still be an issue if Mastodon lazily built its link preview caches only after the first user requests the link (see the sketch below)?

Is the issue the total number of requests, or the fact that they are happening automatically, whether or not a specific user requests them?
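
A minimal sketch of the lazy approach the question describes, assuming a per-instance in-memory cache; the names and the title-only "preview" are hypothetical, not Mastodon's actual code:

```python
import re
import urllib.request

# Hypothetical sketch: nothing is fetched when a post federates; the first
# user to view the post triggers the fetch, and later viewers on the same
# instance hit the cache.
_preview_cache: dict[str, str] = {}

def get_preview(url: str) -> str:
    """Return the page <title> for a link card, fetching it at most once."""
    if url not in _preview_cache:
        # The fetch happens here, at display time, in response to a user action.
        with urllib.request.urlopen(url, timeout=5) as resp:
            html = resp.read(65536).decode("utf-8", errors="replace")
        match = re.search(r"<title[^>]*>(.*?)</title>", html, re.S | re.I)
        _preview_cache[url] = match.group(1).strip() if match else url
    return _preview_cache[url]
```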

@zebrask the latter: unless the requests are happening because someone directly asked for them, my interpretation is that robots.txt should apply
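
A sketch of that interpretation, using Python's standard robots.txt parser: before any automated preview fetch, consult the target site's robots.txt. The user-agent string here is illustrative, not one Mastodon actually sends:

```python
import urllib.parse
from urllib import robotparser

def allowed_to_fetch(url: str, agent: str = "MastodonLinkPreview") -> bool:
    """Check the target site's robots.txt before an automated fetch."""
    parser = robotparser.RobotFileParser()
    parser.set_url(urllib.parse.urljoin(url, "/robots.txt"))
    parser.read()  # fetches and parses the site's robots.txt
    return parser.can_fetch(agent, url)
```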

@jefftk I agree that that would satisfy the contract, but it seems a bit rules-lawyery to suggest they can solve the problem by switching to a nearly identical action taken in response to user action - unless you think most instances never *display* the link previews they calculate.

From a practical point of view, the problems identified in this thread don't seem like they would be solved by fetching in response to user action: better.boston/@crschmidt/10941


@zebrask @jefftk

On one hand, _some_ of those problems would be solved, because the requests would be spread over more time (many instances won't have anyone looking at the new post within 60s). To be fair, that spreading can be done deliberately too.
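
A minimal sketch of spreading the load deliberately: an eager fetcher sleeps a random interval before fetching, so thousands of instances don't all hit the link within the same 60-second window. The one-hour upper bound and function names are illustrative:

```python
import random
import time
from typing import Callable

def fetch_with_jitter(fetch: Callable[[str], str], url: str,
                      max_delay_s: float = 3600.0) -> str:
    """Delay a random amount before fetching, to spread load across instances."""
    time.sleep(random.uniform(0.0, max_delay_s))
    return fetch(url)
```

(Here `fetch` could be the `get_preview` sketch above.)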

Alas, this rules-lawyery interpretation is already standard for embedded images in e-mail: fetching them when the recipient reads the message is considered perfectly fine, and ~everyone does it (with no knowledge of any relationship, or lack thereof, between the owner of the site hosting the image and the author of the email). An email with an embedded image sent to a mailing list is essentially equivalent to the situation here.
