the wildest thing to me here is that i don't think it's just drawing random easter-probable dates from its training set bc these dates are all actually sundays in 2024. really impressive how specifically wrong it is
extremely cool that my very fav band @themountaingoats.bsky.social has reskeeted this, extremely uncool that it has escaped containment and i am getting my least fav response to AI fuckups from AI fans, which is "oh, this is corrected in the latest version of chatgpt that you have to pay for"
The Mountain Goats AND calling out AI’s bullshit in the same thread seems like a pretty cool niche Venn Diagram of ‘things some of us like’ that I am pleasantly surprised by.
right, that's what i'm saying, i expected it to just pull random late march and april dates from instances of "easter sunday is [date]" in its training set, but all the answers it's giving me for "when is easter 2024" are plausible but incorrect SUNDAYS in 2024, so something more is going on here
I think this is one of those places where the fancy autocomplete mental model underestimates how fancy it is. There must be some real statistical sense in which the next token is more likely to be a date which other sources have referred to as a Sunday
Well, yes, "fancy autocomplete" sells it quite a bit short. It's more like an extremely sophisticated flow chart that people keep infuriatingly anthropomorphizing
It’s never going to get it right if its date for the paschal full moon is off by 12 days. (Bede and everyone else involved in perfecting the computus wept)
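(Editorial aside: the computus really is just deterministic arithmetic. This is a sketch of the anonymous Gregorian algorithm, the version popularised in Meeus's Astronomical Algorithms, which folds the paschal full moon and the following Sunday into a single calculation; it's one of several equivalent formulations.)

```python
def easter(year):
    """Gregorian Easter Sunday as (month, day), anonymous/Meeus algorithm."""
    a = year % 19                 # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)      # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25             # century-based lunar corrections
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact: age of the moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to the next Sunday
    m = (a + 11 * h + 22 * l) // 451      # correction for late full moons
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

easter(2024)  # → (3, 31), i.e. 31 March 2024
```

Getting the paschal full moon off by 12 days means every downstream step lands on the wrong Sunday, which matches the "plausible but incorrect Sundays" pattern described upthread.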
this only affirms my belief that what AI-as-oracle people want is simply google that speaks in complete sentences, and whether or not it is a liar is irrelevant
I've said this several times but it is worth repeating:
You don't ask an LLM something, you are giving it a prompt.
You don't get back an answer, you get plausible sounding text.
Using the incorrect words for what they do allows the people peddling the technology to get away with ridiculous claims.
Douglas Adams had this shit dead to rights
there's a bit in the Hitchhiker's Guide to the Galaxy books where a typo in the eponymous Guide made it sound like the Ravenous Bugblatter Beast of Traal was a local delicacy instead of a ferocious monster, resulting in multiple fatalities
I always think of the bit in Dirk Gently's Holistic Detective Agency where they have a best selling decision-making software where the big feature is instead of giving it the facts and getting a decision, you give it a decision and it justifies it for you
That's cause it's using Retrieval Augmented Generation. It first searches for relevant content and then feeds that into its generative AI before it gives you an answer. This helps ground the response and minimise hallucination.
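(Editorial aside: that retrieve-then-generate flow can be sketched in a few lines. The word-overlap retriever and function names here are toy placeholders, not any vendor's actual pipeline.)

```python
def retrieve(query, corpus):
    """Pick the document sharing the most words with the query (toy retriever)."""
    q_words = set(query.lower().split())
    return max(corpus, key=lambda doc: len(q_words & set(doc.lower().split())))

def build_prompt(query, corpus):
    """RAG step: prepend the retrieved text so the model's answer is grounded in it."""
    context = retrieve(query, corpus)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

corpus = [
    "Easter Sunday 2024 falls on March 31.",
    "The capital of France is Paris.",
]
prompt = build_prompt("when is easter 2024", corpus)
# the prompt now carries the correct date; a real system sends it to the LLM
```

The catch, as the rest of the thread suggests, is that grounding only helps when retrieval surfaces the right snippet; otherwise the model confidently paraphrases whatever it was handed.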
Google is typically good at floating dates like this - I imagine it’s because they have a popular calendar app?
Like, does google have a datum stored somewhere that says “Easter Sunday = 3/31/24” that it can regurgitate?
That’s a great word for it.
It frequently feels like coming in halfway through a game of Mao - the inputs and outputs are visible, but they don’t always seem related.
honestly it's the same vibe as SEO, an entire industry built around trying to figure out what's going on inside the all-important black box of google search
Basically, without access to the exact training data and weights, it's impossible to judge exactly what it's doing and why - which is of course the whole point of keeping them secret