You’ve sullied my quick answer:
The assistant figures it out though:
Maybe that’s why AI had trouble determining anything about AJ & the movie Heat, because she wasn’t even in it!
How can she be fertile if her ovaries are removed?
Because you’re not getting an answer to a question, you’re getting characters selected to appear like they statistically belong together given the context.
A sentence saying she had her ovaries removed and one saying she is fertile don’t statistically belong together, so you’re not even getting that.
You think that because you understand the meaning of words. LLM AI doesn’t. It uses math, and math doesn’t care that it’s contradictory; it cares that the words individually usually came next in its training data.
It has nothing to do with the meaning. If your training set consists of one subset of strings made of A’s and B’s together and another subset made of C’s and D’s together (i.e. [AB]+ and [CD]+ in regex) and the LLM outputs “ABBABBBDA”, then that’s statistically unlikely because D’s don’t appear with A’s and B’s. I have no idea what the meaning of these sequences is, nor do I need to know it to see that the output is statistically unlikely. In the context of language and LLMs, “statistically likely” roughly means that some human somewhere out there is more likely to have written this than the alternatives, because that’s where the training data comes from. The LLM doesn’t need to understand the meaning. It just needs to be able to compute probabilities, and the probability of this excerpt should be low because the probability that a human would’ve written this is low.
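A toy sketch of that argument in Python (the corpus, the model, and all the numbers are invented for illustration; real LLMs estimate these probabilities with neural networks, but the point that you can compute “statistically unlikely” without any notion of meaning is the same):

```python
# Train a bigram model on strings matching [AB]+ and [CD]+,
# then score the "impossible" string "ABBABBBDA".
from collections import defaultdict

corpus = ["ABAB", "AABB", "BABA", "CDCD", "CCDD", "DCDC"]

# Count how often each character follows each other character.
counts = defaultdict(lambda: defaultdict(int))
for s in corpus:
    for prev, nxt in zip(s, s[1:]):
        counts[prev][nxt] += 1

def prob(s):
    """Probability of s under the bigram model (zero if a pair was never seen)."""
    p = 1.0
    for prev, nxt in zip(s, s[1:]):
        total = sum(counts[prev].values())
        p *= counts[prev][nxt] / total if total else 0.0
    return p

print(prob("ABBA"))       # > 0: A/B pairs occur in the training data
print(prob("ABBABBBDA"))  # 0.0: "BD" and "DA" never occur in the corpus
```

No meaning anywhere in there, just counting, and the out-of-distribution string still gets probability zero.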
Honestly this isn’t really all that accurate. Like, a common example when introducing the Word2Vec mapping is that if you take the vector for “king,” subtract the vector for “man,” and add the vector for “woman,” the closest match to the result is the vector for “queen.” So there are elements of “meaning” being captured there. The Deep Learning networks can capture a lot more abstraction than that, and the Attention mechanism introduced by the Transformer model greatly increased the ability of these models to interpret context clues.
You’re right that it’s easy to make the mistake of overestimating the level of understanding behind the writing. That’s absolutely something that happens. But saying “it has nothing to do with the meaning” is going a bit far. There is semantic processing happening, it’s just less sophisticated than the form of the writing could lead you to assume.
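For the curious, the analogy is easy to try. A minimal sketch, assuming the gensim library and its downloadable pretrained GloVe vectors (the model name below is one of gensim’s stock options; the exact neighbours and scores depend on the vectors you load):

```python
# pip install gensim
import gensim.downloader

# Small pretrained GloVe word vectors (~66 MB download on first use).
vectors = gensim.downloader.load("glove-wiki-gigaword-50")

# The canonical analogy: king - man + woman ≈ queen.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
# "queen" typically comes out on top, i.e. some relational structure
# is captured purely from co-occurrence statistics.
```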
It’s not even words, it “thinks” in “word parts” called tokens.
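You can look at those word parts directly. A quick sketch, assuming OpenAI’s tiktoken library (which splits you get depends on the encoding you pick):

```python
# pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer family used by GPT-4-era models
token_ids = enc.encode("Is Angelina Jolie in Heat")
print([enc.decode([t]) for t in token_ids])
# Common words usually map to a single token each, while rarer names
# and misspellings get broken into several sub-word pieces.
```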
And the text even ends with a mention of her being in early menopause…
Leaving aside the fact that this looks like AI slop/trash bait: who the fudge is so clueless as to think Ashley Judd, assuming that’s who they’re confusing her with, looks anything like Angelina Jolie back then?
How do you know that OP even saw Heat? Maybe they were just curious to see if she was in it.
First, it’s the internet, you can cuss: either structure the sentence not to include it at all or just cuss, for fuck’s sake. Second, not everyone knows every actor/actress or is familiar with them, especially one that’s definitely not in the limelight anymore like Ashley Judd. Hell, even when she was popular she wasn’t in a lot.
I think the trick here is to not use Google. The Wikipedia page for the movie Heat is the first result on DuckDuckGo.
DDG also has a quick answer AI
You can also search Wikipedia directly.
Yup, using the bang !w anywhere within the search
I use DuckDuckGo as well. I wish it wasn’t just anonymised Bing search. One of these days I’ll look into an open source independent search engine.
Searxng maybe?
Why is the search query in the top and bottom different?
Google’s correction doesn’t get reflected in the tab name; this genuinely happens.
Deepseek also gets this wrong.
So she is in heat …
Why do people Google questions anyway? Just search “heat cast” or “heat Angelina Jolie”. It’s quicker to type and you get more accurate results.
“How to describe a character in my story hiding a body after they committed a murder?”
⬇️
“killed someone, how to hide body?”
see? it’s easy
Why use many word when few work
As a funny challenge I like to come up with simplified, stupid-sounding, 3-word search queries for complex questions, and more often than not it’s good enough to get me the information I’m looking for.
Why do people Google questions anyway?
Because it gives better responses.
Google and all the other major search engines have built-in functionality that performs natural language processing on the user’s query and on the text in their indexes, so they can run a search more precisely aligned with the user’s desired results, or recommend related searches.
If the functionality is there, why wouldn’t we use it?
that is true but the results will be the same at best, not better
It works. It will also find others who posted that question.
I just tested. “Angelina jolie heat” gives me tons of shit results, I have to scroll all the way down and then click on “show more results” in order to get the filmography.
“Is angelina jolie in heat” gives me this Bluesky post as the first answer and the Wikipedia and IMDb filmographies as the 2nd and 3rd answers.
So, I dunno, seems like you’re wrong.
Have people just completely forgotten how search engines work? If you search for two things and get shit results, it means those two things don’t appear together.
both queries give me poor results and searching “heat cast” reveals that she is not actually in the movie, so that’s probably why you can’t find anything useful
That’s why you just add “movie” to the search.
Or do IMDb heat or IMDb jolie or something
Search engine algorithms are way better than in the 90s and early 2000s when it was naive keyword search completely unweighted by word order in the search string.
So the tricks we learned of doing the bare minimum for the most precise search behavior no longer apply the same way. Now a search for two words will add weight to results that have the two words as a phrase, and some weight for the two words close together in the same sentence, but still look for each individual word as a result, too.
More importantly, when a single word has multiple meanings, the search engines all use the rest of the search as an indicator of which meaning the searcher means. “Heat” is a really broad word with lots of meanings, and the rest of the search can help inform the algorithm of what the user intends.
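A toy sketch of that weighting in Python (the weights, the proximity threshold, and the documents are invented for illustration; production ranking is vastly more complex):

```python
def score(query: str, doc: str) -> float:
    """Score a document: individual term hits, plus boosts for
    terms appearing close together and for the exact phrase."""
    q_terms = query.lower().split()
    d_terms = doc.lower().split()
    s = sum(1.0 for t in q_terms if t in d_terms)            # each word on its own
    positions = {t: i for i, t in enumerate(d_terms)}
    if all(t in positions for t in q_terms):
        spread = max(positions[t] for t in q_terms) - min(positions[t] for t in q_terms)
        if spread < 5:
            s += 2.0                                         # words near each other
    if query.lower() in doc.lower():
        s += 4.0                                             # exact phrase match
    return s

docs = [
    "heat index and temperature records",
    "angelina jolie filmography and roles",
    "cast of the movie heat with angelina jolie rumours addressed",
]
for d in sorted(docs, key=lambda d: -score("angelina jolie heat", d)):
    print(score("angelina jolie heat", d), d)
```

The page that mentions all three words close together wins, which is why adding “movie” or another disambiguating word steers the results.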
You won’t get funny answers if you do it correctly.
Because that’s the normal way in which humans communicate.
But for Google more specifically, that sort of keyword prompt is how you searched for stuff in the ’00s… Nowadays the search prompt actually understands natural language, and even has features like “people also ask” that are related to this.
All in all, do whatever works for you, it’s just that asking questions isn’t bad.
Google is not a human, so why would you communicate with it as if it were one? Unlike ChatGPT, it’s not designed to answer questions; it’s designed to search for words on webpages.
Because we’re human, and that’s a human-made tool. It’s made to fit us and our needs, not the other way around. And in case you’ve missed the last decade, it actually does it rather well.
Except Google has been optimizing for natural language questions for the last decade or so. Try it sometime, it’s really wild
typing keywords instead of full sentences is still quicker so nah
Tell me you’re too young to have used “Ask Jeeves” without telling me
We spend most of our time communicating with humans so we’re generally better at that than communicating with algorithms and so it feels more comfortable.
Most people don’t want to learn to communicate with a search engine in its own language. Learning is hard.
what’s there to learn about using search terms
Do you think you were born knowing what search terms are?
They’re literally just words? All you need is the ability to speak a language
You weren’t born with the knowledge of written language either.
Surely you see how using a search engine is a separate skill from just writing words?
Point is, people don’t want to learn. Natural language searches in the form of questions are just easier for people, because they already know how to ask questions.
I’d never heard of the movie and was just enjoying the content you created, which I thought was supposed to be funny.
Is it considered normal to type out a full question when using search engines?
If I were looking for an answer instead of making a funny meme, I’d search “heat movie cast Angelina Jolie” if I didn’t feel like putting any effort in.
Then again, I guess I shouldn’t be surprised. I’ve seen someone use their phone to search google “what is 87÷167?” instead of doing “87/167” or like… Opening the calculator…
People do things in different, sometimes weird ways.
This is like the difference between normal and right. Like, I know a ton of people who normally search for answers by putting in full questions. With the advent of LLMs and AI being thrown into everything, asking full questions starts to make more sense.
For actually good results from a search engine, what you said is for sure better.
Yeah, the way that I would do it is to look up the Wikipedia page for the movie Heat and go to the cast section.
I always do things like this, and it can actually be to my detriment. Like that time I went to Reddit to ask what that movie was where time is a currency, and somebody pointed out that I could have just googled “time is money movie” and it would have immediately shown me In Time (2011).
Also, when I want something from an app or website, I will consult the alphabetical list or look for a link to click instead of just using the search bar.
I don’t know, somehow it never entered my brain that search bars are smart and can figure out what you meant if you use natural language, even though they’ve been programmed that way since before I was born.
It depends on the person in my experience.
For instance, I’ll often use a question format, but usually because I’m looking for similar results from a forum, in which I’d expect to find a post with a similar question as the title. This sometimes produces better results than just plain old keywords.
Other times though, I’m just throwing keywords out and adding “” around the ones I require be included. But I do know some people who only ever ask in question format no matter the actual query (e.g. “What is 2+2” instead of just typing “2+2” and getting the calculator dialogue, like you said in your post too).
I sometimes ask questions, and sometimes I’m forced to because my original query somehow got misinterpreted. I also do searches like you mentioned, but I don’t exclusively do one or the other.
NGL, I learned some things.
Heat is an excellent movie, and one of my top five. Coincidentally, I just watched it last night. For a film released in 1995, it has aged well. OOP is in the ballpark, too - a young Natalie Portman is in it, not Jolie.
Yeah it’s a movie that nails “then suddenly… all hell breaks loose.”
DDG isn’t really any better with that exact search query: all ‘fashion’-related items on the first page.
You get the expected top result (the IMDb page for the film ‘Heat’, which you have to scroll through to determine your ‘answer’) by simply using: angelina jolie heat
We all know how AI has made things worse, but here’s some context on how it’s outright backwards.
Early search engines had a context problem. To use an example from “Halt and Catch Fire”: if you search for “Texas Cowboy”, do you mean the guys on horseback driving a herd of cows, or do you mean the football team? If you search for “Dallas Cowboys”, should that bias the results towards a different answer? Early, naive search engines gave bad results for cases like that; they just spat out whatever pages happened to match the most keywords.
Sometimes, it was really bad. In high school, I was showing a history teacher how to use search engines, and he searched for “China golden age”. All the results were Asian porn. I think we were using Yahoo.
AltaVista largely solved the context problem. We joke about its bad results now, but it was one of the better search engines before Google PageRank.
Now we have AI unsolving the problem.
I was okay with keyword results. If you knew what you were dealing with in the search engine, you could usually find what you were looking for.
It’s not helpful for OOP since they’re on iOS, but there’s a Firefox extension that works on desktop and Android that hides the AI overview in searches: https://addons.mozilla.org/en-US/android/addon/hide-google-ai-overviews/