🛒 Remember the iconic Kmart in-store audio that set the holiday shopping vibe? 🎄🛍️ Travel back to December 1990 & relive the holiday magic with these preserved recordings: https://archive.org/details/KmartDecember1990 #ThrowbackThursday
#DOScember tests with MS-DOS NETDRIVE, which assigns a new drive letter to a remote LAN or internet hard disk/diskette volume. Here I'm accessing the tool's author's test remote volume: I grabbed a bunch of his favorite utilities and ran programs like it was nothing. https://www.vogons.org/viewtopic.php?f=5&t=97743
If you use #Dropbox, there are two settings you have to toggle to keep your private data from being sold or used to train AI models:
On the Desktop website:
1) Go to Help and then Cookies & CCPA preferences to enable "do not sell or share my information"
2) Then go to Settings -> Third-party AI and disable sharing your data for AI training.
Do you want to block threads from your account?
Would you rather eat a plastic milk jug ring than see a Facebook minion meme on your timeline?
Well oh boy do I have a treat for you! If you're like me and have at least three different Mastodon accounts (or even just one!), you can domain-block them quick fast and in a hurry...
Sᴛᴇᴘ 1: Login to your account. 🖼️¹
Sᴛᴇᴘ 2: Go to Settings, then Development 🖼️²
Sᴛᴇᴘ 3: Create a New application with at least write or write:blocks access. 🖼️³
Sᴛᴇᴘ 4: After saving, click on the name of your application and copy Your access token. 🖼️⁴
Sᴛᴇᴘ 5: Repeat for all your accounts, then run the following shell script...
# One line per account: instance hostname, then that account's API access token
readarray -t pw <<EOF
infosec.exchange Your-Api-Key1
mastodon.social Your-Api-Key2
defcon.social Your-Api-Key3
EOF
masdablock() {
  local i c u p
  # For each domain given as an argument, POST a domain block from every account in pw.
  while [ -n "$1" ]; do
    for ((i = 0; i < ${#pw[@]}; i++)); do
      read -ra p <<<"${pw[i]}"    # p[0] = instance host, p[1] = access token
      u="https://${p[0]}/api/v1/domain_blocks?domain=$1"
      c="Authorization: Bearer ${p[1]}"
      curl -H "$c" -X "POST" "$u"
    done
    shift; echo
  done
}
masdablock threads.net
Mastodon API documentation relating to accounts can be found here:
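As a quick sanity check, you can also list the domains an account already blocks and confirm threads.net shows up. A minimal sketch, assuming your access token also has read:blocks scope (the endpoint is GET /api/v1/domain_blocks):
curl -H "Authorization: Bearer Your-Api-Key1" \
  "https://infosec.exchange/api/v1/domain_blocks"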
fave new QOTD, via the comments on https://www.tbray.org/ongoing/When/202x/2022/11/07/Just-Dont
"For every complex problem there is an answer that is clear, simple, and wrong." -- H.L. Mencken
Failure analysis in complex system design (especially when lives are at stake) is always fascinating, but especially so when you can see compensating design choices that helped mitigate the worst outcomes. https://admiralcloudberg.medium.com/a-matter-of-millimeters-the-story-of-qantas-flight-32-bdaa62dc98e7
[at llangollen]
Byron: [tossing hair] delightfully devilish byron, caroline lamb will never think to look for you here
Caroline Lamb: [barging into llangollen] WHERE'S BYRON
Lamb: I KNOW HE'S HERE
Lamb: DON'T YOU LESBIANS LIE TO ME
Lamb: I CAN SMELL HIS AXE BODY SPRAY
@cstross
Some clients are getting pretty good at threading now. I use https://phanpy.social/ (a web client) mostly because its threading lets me follow conversations even in large threads like this one.
(Also, as the author of STRN (at least the "S" part), it is nice to see the good parts of Usenet have not been completely forgotten.)
Three days after Amazon announced its AI chatbot Q, some employees are sounding alarms about accuracy and privacy issues. Q is “experiencing severe hallucinations and leaking confidential data,” including the location of AWS data centers, internal discount programs, and unreleased features, according to leaked documents obtained by Platformer.
An employee marked the incident as “sev 2,” meaning an incident bad enough to warrant paging engineers at night and making them work through the weekend to fix it.
https://www.platformer.news/p/amazons-q-has-severe-hallucinations
Battle.net broke in Wine / Proton - here's how to fix for Steam Deck / Linux https://www.gamingonlinux.com/2023/12/battlenet-broke-in-wine-proton-heres-how-to-fix-for-steam-deck-linux/
So much great stuff in this Jeremy Howard interview on LLMs - I grabbed this quote for my blog, but there's just a ton of insight crammed into this one: https://www.youtube.com/live/6LXw2beprGI?si=I2JEqIccboFRou0K
Here's an incredible new way to run LLMs on your own machine: llamafile - https://hacks.mozilla.org/2023/11/introducing-llamafile/
It bundles an LLM with the code needed to run it in a single binary using DEEP magic (Cosmopolitan Libc) such that the same binary works on 6 different operating systems
Best part: it works with LLaVA multi-modal... so you can grab a 4GB file from https://huggingface.co/jartine/llava-v1.5-7B-GGUF/blob/main/llamafile-server-0.1-llava-v1.5-7b-q4 and:
chmod 755 llamafile-server-0.1-llava-v1.5-7b-q4
./llamafile-server-0.1-llava-v1.5-7b-q4
Visit http://127.0.0.1:8080/
And now:
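If you'd rather skip the web UI, you can also hit the server from the command line. A minimal sketch, assuming the bundled llama.cpp server's /completion endpoint on the default port:
curl http://127.0.0.1:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Describe a llama in one sentence.", "n_predict": 64}'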
@mttaggart https://www.youtube.com/watch?v=H3i6c4TRuZc
But it has to come with its crew.
Father of 4, Lasers and Computers and Physics, Oh My! Soon to be a major motion picture. My Pokemons, let me show them to you.