• foggy@lemmy.world · +13 / −12 · 6 days ago

    Yo, I’ll be 100 with you.

    Regex is where something like an LLM excels.

    Don’t rely on an LLM for coding, but… this is exactly where it should be in your toolbox.

    • circuitfarmer@lemmy.sdf.org · +13 · 6 days ago

      I don’t disagree with this hot take. But the major difference is the sheer resources needed to have an LLM in place of a “do one thing right” utility like sed. In that sense, they are incomparable.
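
      (To make the comparison concrete: the kind of one-purpose edit sed does in a single line, sketched here in Python so it’s explicit; the file names and substitution are just made-up examples.)

          import re

          # Roughly what `sed 's/colour/color/g' notes.txt` does, spelled out:
          with open("notes.txt") as src, open("notes_fixed.txt", "w") as dst:
              for line in src:
                  dst.write(re.sub(r"colour", "color", line))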

      • bus_factor@lemmy.world · +11 / −1 · 6 days ago

        I think they’re arguing for having the LLM generate the regex. And I certainly would not trust an LLM to do that right.

        • Natanox@discuss.tchncs.de (OP) · +9 · 6 days ago

          Yeah, it’s way more sensible to use some of the available regex utilities like this. Although it’s always funny to see what an LLM comes up with.

      • foggy@lemmy.world · +3 · 6 days ago

        I mean fair.

        I guess the caveat here should be fucking learn regex first, lmao.

        Don’t use it when it’s not necessary. Google is probably still better if you’re looking for a regex for an email or something like that.

        And also don’t just rely on its answer for prod.
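
        To put that concretely: before an LLM-suggested pattern goes anywhere near prod, throw a few known-good and known-bad strings at it. A minimal sketch in Python (the pattern and test cases are made-up examples, not a recommendation):

            import re

            # Pretend this came back from an LLM prompt like
            # "give me a regex for a version string such as 1.2.3":
            suggested = r"^\d+\.\d+\.\d+$"
            pattern = re.compile(suggested)

            should_match = ["1.2.3", "10.0.1"]
            should_not_match = ["1.2", "1.2.3.4", "v1.2.3", ""]

            for s in should_match:
                assert pattern.match(s), f"expected match: {s!r}"
            for s in should_not_match:
                assert not pattern.match(s), f"unexpected match: {s!r}"

            print("suggested pattern passed the quick checks")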

      • foggy@lemmy.world · +8 / −2 · 6 days ago

        A lot of Lemmy is very anti-AI. As an artist I’m very anti-AI. As a veteran developer I’m very pro-AI (with important caveats). I see its value; I see its threat.

        I know I’m not in good company when I talk about its value on Lemmy.

        • Natanox@discuss.tchncs.de (OP) · +3 · 6 days ago

          Completely with you on this one. It’s awful when used to generate “art”, but once you’ve learned its shortcomings and never blindly trust it, it is such a phenomenal help in learning, assisting with code, or tracking down something you have a hard time finding the right words for. And aside from generative use-cases, neural networks are also phenomenally useful for assisting tasks in science, medicine and so on.

          It’s just unfortunate we’re still in the “find out” phase of the information age. It’s like industrialization ~200 years ago, just with data… and unfortunately the lessons seem to be equally rough. All the generative tech will deal painful blows to our culture.

          • JayDee@lemmy.sdf.org · +2 · 6 days ago

            That’s a view from the perspective of utility, yeah. The downvotes here are likely also from an ethics standpoint, since most LLMs are currently trained on other people’s work without permission, all while using large amounts of water for cooling and energy from our mostly coal-powered grid. That’s also not mentioning the physical and emotional labor that many untrained workers are required to do when sifting through these LLMs’ datasets, removing unsavory data for extremely low wages.

            A smaller, more specialized LLM could likely perform this same functionality with much less training, on a more exclusive data set (probably only a couple of terabytes at its largest, I’d wager), and would likely be small enough to run on most users’ computers after training. That’d be the more ethical version of this use case.
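
            Something in that spirit, as a rough sketch with the Hugging Face transformers library (the model name below is a placeholder for whatever small local code model would actually fit on your machine):

                from transformers import pipeline

                # Placeholder name -- swap in a small code model that fits locally.
                generator = pipeline("text-generation", model="your-small-local-code-model")

                prompt = "Write a single regex that matches an ISO 8601 date (YYYY-MM-DD):"
                out = generator(prompt, max_new_tokens=64, do_sample=False)

                # Whatever comes back still needs testing before real use.
                print(out[0]["generated_text"])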

            • Natanox@discuss.tchncs.de (OP) · +1 · 6 days ago

              True, those are awful problems. The whole internet is suffering due to this; I constantly read about fedi instances being literally DDoS’ed by crawlers that ignore robots.txt and circumvent IP blocks. Unfortunately there’s no way to prevent any of this right now with our current set of technologies… the best thing we could do is make it a state- or even UN-level affair, reducing the amount of simultaneous training and focusing on cooperation instead of competition while upholding strong workers’ rights. However, that would also be very anti-capitalistic and supposedly “stifle innovation” (as if that’s important in comparison to, idk, our world burning?), so it won’t happen either. Banning it completely is of course also impossible; humanity is still way too divided and its perceived benefits for “defense” (against ourselves) too high.

              In regards to running locally, we’re currently seeing a new wave of chip designs that’ll enable this on increasingly reasonable devices. I’m getting a new laptop with an XDNA2 chip and 32 GB RAM today, where I want to try running Codestral on Linux (the driver arrived natively in 6.14). Technically it should be possible to run any ~30B model on those newer chips (potentially slightly quantized). I’ll definitely try this a little and probably write a thread about it in the openSUSE forums if you’re interested.
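
              For anyone who wants to try the same, a minimal sketch of what that could look like with the llama-cpp-python bindings (the GGUF path, quantization and prompt are placeholders; whether it fits depends on your RAM and quant level):

                  from llama_cpp import Llama

                  # Placeholder path: a quantized Codestral GGUF downloaded beforehand.
                  llm = Llama(
                      model_path="codestral-q4_k_m.gguf",
                      n_ctx=4096,       # context window
                      n_gpu_layers=-1,  # offload as many layers as the hardware allows
                  )

                  prompt = "Write a regex that matches an IPv4 address, nothing else:"
                  result = llm(prompt, max_tokens=128, temperature=0.0)

                  print(result["choices"][0]["text"])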

        • JayDee@lemmy.sdf.org · +1 · 6 days ago

          I think it’s important to use the more specific term here: LLM. We’ve been creating AI automation for ourselves for years; the difference is that software vendors are now adding LLMs to the mix.

          I’ve heard this argument before in other instances. Ghidra, for example, just had an LLM pipeline rigged up by LaurieWired to take care of the more tedious process of renaming various functions during reverse engineering. It’s not the end of the analysis process during reverse engineering; it just takes out a large amount of busy work. I don’t know about the use-case you described, but it sounds similar. It also seems feasible that you could train an AI system yourself (given you have enough reverse-engineered programs) and then run it locally to do this kind of work, which is a far cry from the disturbingly large LLMs that are guzzling massive amounts of data and energy to learn and run.
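
          As a rough illustration of the idea (not LaurieWired’s actual pipeline), a rename pass like that could look something like this as a Ghidra Jython script; query_llm() is a hypothetical stand-in for whatever locally hosted model you’d call:

              # Ghidra Jython script sketch -- run it from the Script Manager.
              from ghidra.app.decompiler import DecompInterface
              from ghidra.program.model.symbol import SourceType

              def query_llm(c_code):
                  # Hypothetical stand-in: send decompiled C to a local model and
                  # get back a short descriptive name, or None to skip.
                  return None

              decomp = DecompInterface()
              decomp.openProgram(currentProgram)

              for func in currentProgram.getFunctionManager().getFunctions(True):
                  if not func.getName().startswith("FUN_"):
                      continue  # only touch Ghidra's auto-generated names
                  result = decomp.decompileFunction(func, 60, monitor)
                  if not result.decompileCompleted():
                      continue
                  new_name = query_llm(result.getDecompiledFunction().getC())
                  if new_name:
                      func.setName(new_name, SourceType.USER_DEFINED)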

          EDIT: To be clear, because LaurieWired’s pipeline still relies on normal LLMs, which are unethically trained, using her pipeline is also unethical. It has the potential to be ethical, but currently is not.

    • ABC123itsEASY@lemmy.world · +6 · 6 days ago

      Lol, why are you getting downvoted? This isn’t even a hot take. You are 100% right; regex is famously enigmatic even among experienced software engineers.

      • foggy@lemmy.world · +3 / −2 · 6 days ago

        Yeah, Lemmy used to have a core of tech intel, and that has slipped hard in the last 6 months.

        Be what it do I guess. Dummies gonna dumb.

        We are in this sea of like a million people who want to be cybersecurity professionals…

        …and as a cybersecurity professional it’s adorable when I see vehement dissent.

        Like y’all, I’ve been doing this. And if you want a recommendation, pipe down lol.

        • ABC123itsEASY@lemmy.world · +3 · 6 days ago

          Yea, I come from the generation of reddit departures that left because of the API lockdown and the elimination of third-party apps. Nowadays a lot of people join Lemmy because they got banned off of reddit for reasons of varying respectability. I would say it’s diluting the concentration of tech intel, as you say. Oh well.

          • foggy@lemmy.world · +3 · 5 days ago

            Lol yep. Also here from the reason for which you only care about a lot if you have done some kind of web development.

            Edit: Jesus I just reread that. I literally just ripped the bong. Was a dumb sentence. I’ll leave it.