• Bongles@lemm.ee · 16 hours ago (edited)

    You’ve sullied my quick answer:

    The assistant figures it out though:

    • LemmyKnowsBest@lemmy.world · 13 hours ago

      Maybe that’s why AI had trouble determining anything about AJ & the movie Heat: because she wasn’t even in it!

    • _stranger_@lemmy.world · 15 hours ago

      Because you’re not getting an answer to a question, you’re getting characters selected to appear like they statistically belong together given the context.

      • howrar@lemmy.ca · 15 hours ago

        A sentence saying she had her ovaries removed and one saying she is fertile don’t statistically belong together, so you’re not even getting that.

        • JcbAzPx@lemmy.world · 15 hours ago

          You think that because you understand the meaning of words. LLM AI doesn’t. It uses math, and math doesn’t care that it’s contradictory; it cares that the words individually usually came next in its training data.

          • howrar@lemmy.ca · 14 hours ago

            It has nothing to do with the meaning. If your training set consists of one subset of strings made of A’s and B’s and another subset made of C’s and D’s (i.e. [AB]+ and [CD]+ in regex), and the LLM outputs “ABBABBBDA”, then that’s statistically unlikely, because D’s don’t appear with A’s and B’s. I have no idea what the meaning of these sequences is, nor do I need to know, to see that it’s statistically unlikely.

            In the context of language and LLMs, “statistically likely” roughly means that some human somewhere out there is more likely to have written this than the alternatives because that’s where the training data comes from. The LLM doesn’t need to understand the meaning. It just needs to be able to compute probabilities, and the probability of this excerpt should be low because the probability that a human would’ve written this is low.
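
            For what it’s worth, here’s a minimal sketch of that idea in Python (a toy bigram model with made-up training strings, nothing like a real LLM): it gives “ABBABBBDA” a near-zero score without knowing what any of the symbols mean.

            ```python
            # Toy bigram model trained on strings matching [AB]+ and [CD]+.
            # It only counts which character tends to follow which; no "meaning" anywhere.
            from collections import defaultdict
            import random

            random.seed(0)
            training = (["".join(random.choice("AB") for _ in range(10)) for _ in range(500)]
                        + ["".join(random.choice("CD") for _ in range(10)) for _ in range(500)])

            counts = defaultdict(lambda: defaultdict(int))
            for s in training:
                for prev, cur in zip(s, s[1:]):
                    counts[prev][cur] += 1

            def prob(seq, eps=1e-6):
                """Product of P(next char | previous char), with tiny smoothing for unseen pairs."""
                p = 1.0
                for prev, cur in zip(seq, seq[1:]):
                    total = sum(counts[prev].values())
                    p *= (counts[prev][cur] + eps) / (total + 4 * eps)
                return p

            print(prob("ABBABBBBA"))  # every pair appears in training -> relatively high
            print(prob("ABBABBBDA"))  # a D never follows a B in training -> near zero
            ```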

            • monotremata@lemmy.ca · 14 hours ago

              Honestly this isn’t really all that accurate. Like, a common example when introducing the Word2Vec mapping is that if you take the vector for “king,” subtract the vector for “man,” and add the vector for “woman,” the closest vector to the result is “queen.” So there are elements of “meaning” being captured there. Deep learning networks can capture a lot more abstraction than that, and the attention mechanism introduced by the Transformer model greatly increased the ability of these models to interpret context clues.

              You’re right that it’s easy to make the mistake of overestimating the level of understanding behind the writing. That’s absolutely something that happens. But saying “it has nothing to do with the meaning” is going a bit far. There is semantic processing happening, it’s just less sophisticated than the form of the writing could lead you to assume.
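
              If you want to see it for yourself, here’s a rough sketch using gensim’s pretrained vectors (the model name here is just one choice I’m assuming; any reasonable word-embedding set shows the same effect):

              ```python
              # Word-analogy sketch: king - man + woman lands near "queen".
              import gensim.downloader as api

              vectors = api.load("glove-wiki-gigaword-100")  # downloads on first use
              print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
              # "queen" typically comes out at or near the top, which is the sense in which
              # the embedding space captures some relational "meaning".
              ```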

  • Alexaral@infosec.pub · 18 hours ago

    Leaving aside the fact that this looks like AI slop/trash bait: who the fudge is so clueless as to think Ashley Judd, assuming she’s who they’re confusing her with, looks anything like Angelina Jolie back then?

    • Bosht@lemmy.world · 18 hours ago

      First, it’s the internet, you can cuss. Either structure the sentence not to include it at all or just cuss, for fuck’s sake. Second, not everyone knows or is familiar with every actor or actress, especially one that’s definitely not in the limelight anymore, like Ashley Judd. Hell, even when she was popular she wasn’t in a lot.

  • nickiam2@aussie.zone · 19 hours ago

    I think the trick here is to not use Google. The Wikipedia page for the movie Heat is the first result on DuckDuckGo.

  • stebo@lemmy.dbzer0.com · 1 day ago

    Why do people Google questions anyway? Just search “heat cast” or “heat Angelina Jolie”. It’s quicker to type and you get more accurate results.

    • ERROR: UserNotFound@infosec.pub · 15 hours ago

      “How to describe a character in my story hiding a body after they committed a murder?”

      ⬇️

      “killed someone, how to hide body?”

    • warbond@lemmy.world · 17 hours ago

      As a funny challenge I like to come up with simplified, stupid-sounding, 3-word search queries for complex questions, and more often than not it’s good enough to get me the information I’m looking for.

    • GamingChairModel@lemmy.world · 18 hours ago

      Why do people Google questions anyway?

      Because it gives better responses.

      Google and all the other major search engines have built-in functionality to perform natural language processing on the user’s query and the text in their indexes, to return results more precisely aligned with what the user wants, or to recommend related searches.

      If the functionality is there, why wouldn’t we use it?

    • nyctre@lemmy.world · 21 hours ago

      I just tested. “Angelina jolie heat” gives me tons of shit results, I have to scroll all the way down and then click on “show more results” in order to get the filmography.

      “Is angelina jolie in heat” gives me this bluesky post as the first answer and the wikipedia and IMDb filmographies as 2nd and 3rd answer.

      So, I dunno, seems like you’re wrong.

      • howrar@lemmy.ca · 15 hours ago

        Have people just completely forgotten how search engines work? If you search for two things and get shit results, it means those two things don’t appear together.

      • stebo@lemmy.dbzer0.com · 17 hours ago

        Both queries give me poor results, and searching “heat cast” reveals that she is not actually in the movie, so that’s probably why you can’t find anything useful.

      • GamingChairModel@lemmy.world · 18 hours ago

        Search engine algorithms are way better than in the 90s and early 2000s, when they did naive keyword search completely unweighted by word order in the search string.

        So the tricks we learned, of typing the bare minimum to get the most precise results, no longer apply the same way. Now a search for two words will add weight to results that have the two words as a phrase, and some weight for the two words close together in the same sentence, but it will still look for each individual word as a result, too.
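
        As a rough sketch of that weighting idea (toy code with invented weights, not any real engine’s ranking):

        ```python
        # Toy two-word scorer: credit for each word, a bonus for the words appearing
        # near each other, and a bigger bonus for the exact phrase.
        def score(doc: str, w1: str, w2: str) -> float:
            words = doc.lower().split()
            s = float(w1 in words) + float(w2 in words)          # individual terms
            if w1 in words and w2 in words:
                dist = min(abs(i - j) for i, x in enumerate(words) if x == w1
                                      for j, y in enumerate(words) if y == w2)
                s += 2.0 / dist if dist > 0 else 0.0             # proximity bonus
                if f"{w1} {w2}" in doc.lower():
                    s += 3.0                                      # exact-phrase bonus
            return s

        docs = ["angelina jolie filmography and awards",
                "the heat wave broke records across texas",
                "jolie was rumored for heat but the casting went elsewhere"]
        for d in docs:
            print(round(score(d, "angelina", "jolie"), 2), "-", d)
        ```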

        More importantly, when a single word has multiple meanings, the search engines all use the rest of the search as an indicator of which meaning the searcher means. “Heat” is a really broad word with lots of meanings, and the rest of the search can help inform the algorithm of what the user intends.

    • ByteJunk@lemmy.world · 24 hours ago

      Because that’s the normal way in which humans communicate.

      But for Google more specifically, that sort of keyword prompt is how you searched for stuff in the ’00s… Nowadays the search prompt actually understands natural language, and even has features like “people also ask” that are related to this.

      All in all, do whatever works for you, it’s just that asking questions isn’t bad.

      • stebo@lemmy.dbzer0.com · 23 hours ago

        Google is not a human, so why would you communicate with it as if it were one? Unlike ChatGPT, it’s not designed to answer questions; it’s designed to search for words on webpages.

        • ByteJunk@lemmy.world · 16 hours ago

          Because we’re human, and that’s a human-made tool. It’s made to fit us and our needs, not the other way around. And in case you’ve missed the last decade, it actually does it rather well.

        • Trainguyrom@reddthat.com · 19 hours ago

          Except Google has been optimizing for natural language questions for the last decade or so. Try it sometime; it’s really wild.

        • queermunist she/her@lemmy.ml · 22 hours ago (edited)

          We spend most of our time communicating with humans so we’re generally better at that than communicating with algorithms and so it feels more comfortable.

          Most people don’t want to learn to communicate with a search engine in its own language. Learning is hard.

  • jaschen@lemm.ee · 1 day ago

    I’d never heard of the movie and was enjoying the content you created, which I thought was supposed to be funny.

  • Miles O'Brien@startrek.website · 1 day ago

    Is it considered normal to type out a full question when using search engines?

    If I were looking for an answer instead of making a funny meme, I’d search “heat movie cast Angelina Jolie” if I didn’t feel like putting any effort in.

    Then again, I guess I shouldn’t be surprised. I’ve seen someone use their phone to search Google for “what is 87÷167?” instead of typing “87/167” or, like… opening the calculator…

    People do things in different, sometimes weird ways.

    • LePoisson@lemmy.world · 19 hours ago

      This is like the difference between normal and right. Like, I know a ton of people normally search for answers by putting in full questions. With the advent of LLMs and AI being thrown into everything, asking full questions starts to make more sense.

      For actual good results using a search engine, for sure what you said is better.

    • 0range@lemmy.dbzer0.com · 1 day ago

      Yeah, the way that I would do it is to look up the Wikipedia page for the movie Heat and go to the cast section.

      I always do things like this and it can actually be to my detriment. Like that time I went to Reddit to ask them what that movie was where time is a currency, and somebody pointed out that I could have just googled “time is money movie” and it would have immediately shown me In Time (2011).

      Also, when I want something from an app or website I will consult the alphabetical list or look for a link to click, instead of just using the search bar.

      I don’t know, somehow it never entered my brain that search bars are smart and can figure out what you meant if you use natural language, even though they’ve been programmed that way since before I was born.

    • ArchRecord@lemm.ee · 1 day ago (edited)

      It depends on the person in my experience.

      For instance, I’ll often use a question format, but usually because I’m looking for similar results from a forum, in which I’d expect to find a post with a similar question as the title. This sometimes produces better results than just plain old keywords.

      Other times though, I’m just throwing out keywords and adding quotation marks ("") around the ones I require to be included.

      But I do know some people who only ever ask in question format no matter the actual query. (e.g. “What is 2+2” instead of just typing “2+2” and getting the calculator dialogue, like you said in your post too.)

    • chatokun@lemmy.dbzer0.com · 1 day ago

      I sometimes ask questions, and sometimes I’m forced to because the original answer somehow misinterpreted my query. I also do searches like you mentioned, but I don’t exclusively do one or the other.

  • magnetosphere@fedia.io · 1 day ago

    Heat is an excellent movie, and one of my top five. Coincidentally, I just watched it last night. For a film released in 1995, it has aged well. OOP is in the ballpark, too: a young Natalie Portman is in it, not Jolie.

  • adarza@lemmy.ca · 1 day ago (edited)

    DDG isn’t really any better with that exact search query: all ‘fashion’-related items on the first page.

    You get the expected top result (the IMDb page for the film ‘Heat’, which you have to scroll through to determine your ‘answer’) simply by using: angelina jolie heat

  • frezik@midwest.social · 1 day ago

    We all know how AI has made things worse, but here’s some context on how it’s outright backwards.

    Early search engines had a context problem. To use an example from “Halt and Catch Fire”: if you search for “Texas Cowboy”, do you mean the guys on horseback driving a herd of cows, or do you mean the football team? If you search for “Dallas Cowboys”, should that bias the results towards a different answer? Early, naive search engines gave bad results for cases like that; they spat out whatever happened to hit the most keywords.

    Sometimes it was really bad. In high school, I was showing a history teacher how to use search engines, and he searched for “China golden age”. All the results were Asian porn. I think we were using Yahoo.

    AltaVista largely solved the context problem. We joke about its bad results now, but it was one of the better search engines before Google PageRank.

    Now we have AI unsolving the problem.

    • doingthestuff@lemy.lol · 1 day ago

      I was okay with keyword results. If you knew what you were dealing with in the search engine, you could usually find what you were looking for.