• 18 Posts
  • 37 Comments
Joined 2 years ago
Cake day: June 17th, 2023



  • We’re not in the slower-than-walking era, we’re in the Ford Model T era; I’m reading someone call the Ford Model T magic beans (i.e. a scam) while millions of people drive one every day, and at the same time worry it’s going to ruin transportation all over North America.

    “It took years of hard work for us to steal beans from farmers, apply our unique brand of magic, and seek investment from our nation’s finest rubes and oafs,” a Beanco spokesman said. “Now DeepBean wants to steal our magic beans, rebrand the magic, and get money from their own buffoons and clods? It’s just not right.”

    Again, these are contradictory statements: it cannot be both a scam with no value (i.e. magic beans) and something that’s going to take all our jobs.


  • But cars have always had value, and we’re talking under an article that calls AI “magic beans”.

    How could magic beans that produce garbage enslave humanity under capitalism? Would you argue that cars, which replaced horses as the primary mode of transport within a few years, should be called that?

    You can argue that AI does some things badly; it’s still very, very early on, and the progress people are making is insane, like nothing I’ve seen before. But you can’t argue it’s worthless and a giant threat to us at the same time; that’s contradictory.






  • She asked, “Why do men feel so threatened by women and others who are finally getting a seat at the table?”

    She admits the competition has increased (and with women making up 50% of the population, competition has damn near doubled), and then wonders why men are getting more competitive?

    That script says all we need to do is provide for and protect our family. **Yet, the majority of women today also work and provide.**

    Which has never been the case in history before; who knew that if you give women the same rights and freedoms as men (and added bonuses like DIE to promote them over men), things would break down??



  • but we had the same thing with Alpaca, Llama2, Llama3, 3.2, Mistral, Phi…

    I don’t believe so; or at least, them all getting smaller and/or more intelligent isn’t the point, it’s how they did it:

    I noted above that if DeepSeek had access to H100s they probably would have used a larger cluster to train their model, simply because that would have been the easier option; the fact they didn’t, and were bandwidth constrained, drove a lot of their decisions in terms of both model architecture and their training infrastructure. Just look at the U.S. labs: they haven’t spent much time on optimization because Nvidia has been aggressively shipping ever more capable systems that accommodate their needs. The route of least resistance has simply been to pay Nvidia. DeepSeek, however, just demonstrated that another route is available: heavy optimization can produce remarkable results on weaker hardware and with lower memory bandwidth; simply paying Nvidia more isn’t the only way to make better models.

    https://stratechery.com/2025/deepseek-faq/









  • ikt@aussie.zone to Programmer Humor@programming.dev · Versatility wins (edited 3 days ago)

    Using Mistral to calculate this, but it looks close enough:

    1921 + 1673 + 242 + 85 + 84 + 84 + 83 + 82 + 81 + 80 + 79 + 77 + 77 + 75 + 74 + 73 + 72 + 71 + 70 + 69 + 68 + 68 + 67 + 66 + 65 + 64 + 63 + 62 + 61 + 60 + 59 + 58 + 57 + 56 + 55 + 54 + 53 + 52 + 52 + 51 + 50 + 48 + 48 + 47 + 46 + 45 + 44 + 43 + 41 + 21 + 18 = 7264ms

    So Mistral puts the total at 7264 milliseconds, or about 7.3 seconds; adding the values by hand it’s actually 6794 ms, so roughly 6.8 seconds.

    Removing the snapd services: 6794ms - 3836ms = 2958ms
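
    The adding can also be done straight off systemd-analyze blame instead of asking a model; a rough awk sketch, assuming every snap-related entry is reported in plain ms or s (no minutes), which matches the output here:

    $ systemd-analyze blame | grep snap | awk '
        $1 ~ /ms$/        { sub(/ms$/, "", $1); t += $1 }         # entries reported in milliseconds
        $1 ~ /^[0-9.]+s$/ { sub(/s$/, "", $1); t += $1 * 1000 }   # entries reported in seconds, e.g. 1.951s
        END { printf "%.0f ms total across snap-related units\n", t }'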



  • ikt@aussie.zone to Programmer Humor@programming.dev · Versatility wins (edited 3 days ago)

    Not really noticed it tbh:

    1.951s snapd.seeded.service

    1.673s snapd.service

    These seem to take a lot longer than the mounts themselves, but even then the impact is pretty minimal, since the mount units are brought up largely in parallel (see the quick total after the listing below):

    $ systemd-analyze blame | grep snap | grep mount
       86ms snap-bare-5.mount
       85ms snap-blanket-49.mount
       84ms snap-btop-813.mount
       83ms snap-btop-814.mount
       82ms snap-chromium-3010.mount
       82ms snap-chromium-3025.mount
       80ms snap-core18-2829.mount
       79ms snap-core18-2846.mount
       78ms snap-core20-2379.mount
       77ms snap-core20-2434.mount
       76ms snap-core22-1663.mount
       76ms snap-core22-1722.mount
       75ms snap-core24-609.mount
       74ms snap-core24-716.mount
       73ms snap-cups-1067.mount
       72ms snap-firefox-5600.mount
       71ms snap-firefox-5647.mount
       70ms snap-firmware\x2dupdater-127.mount
       69ms snap-firmware\x2dupdater-147.mount
       68ms snap-gnome\x2d3\x2d28\x2d1804-198.mount
       67ms snap-gnome\x2d3\x2d38\x2d2004-140.mount
       66ms snap-gnome\x2d3\x2d38\x2d2004-143.mount
       65ms snap-gnome\x2d42\x2d2204-172.mount
       64ms snap-gnome\x2d42\x2d2204-176.mount
       63ms snap-gnome\x2d46\x2d2404-66.mount
       62ms snap-gnome\x2d46\x2d2404-77.mount
       61ms snap-gtk\x2dcommon\x2dthemes-1534.mount
       60ms snap-gtk\x2dcommon\x2dthemes-1535.mount
       59ms snap-libreoffice-330.mount
       58ms snap-libreoffice-334.mount
       57ms snap-mesa\x2d2404-143.mount
       56ms snap-mesa\x2d2404-44.mount
       55ms snap-nvtop-171.mount
       54ms snap-pinta-33.mount
       53ms snap-pinta-37.mount
       52ms snap-snap\x2dstore-1244.mount
       51ms snap-snap\x2dstore-1248.mount
       50ms snap-snapd-23258.mount
       49ms snap-snapd-23545.mount
       48ms snap-snapd\x2ddesktop\x2dintegration-247.mount
       47ms snap-snapd\x2ddesktop\x2dintegration-253.mount
       46ms snap-surfshark-51.mount
       44ms snap-telegram\x2ddesktop-6489.mount
       43ms snap-transmission-100.mount
       42ms snap-youtube\x2ddl-4630.mount
       41ms snap-youtube\x2ddl-4806.mount
       40ms var-snap-firefox-common-host\x2dhunspell.mount
       24ms snap-telegram\x2ddesktop-6495.mount
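
    For comparison with the snapd services above, the mounts can be totalled the same way (same assumption: every mount entry here is in ms):

    $ systemd-analyze blame | grep snap | grep mount | awk '
        $1 ~ /ms$/ { sub(/ms$/, "", $1); t += $1 }    # each snap mount entry here is in ms
        END { print t " ms across all snap mount units" }'

    Here the two snapd services alone add up to more time than all of the mount units combined, and the mount time is spread across dozens of tiny units anyway.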