• 0 Posts
  • 52 Comments
Joined 2 years ago
Cake day: June 16th, 2023

  • Eh, analogy will be imperfect due to nuance, but I’d say it is close.

    The big deals are:

    • DeepSeek isn’t one of the “presumed winners” that investors had been betting on, and they caught up very quickly
    • DeepSeek let people download the model, meaning others can host it free and clear. Investors had largely assumed everyone would abide by a ‘keep the model private if it is competitive and only allow access as a service offering’ norm, and this really fouls up the assumption that an AI provider would hold lock-in
    • DeepSeek is pricing way way lower than OpenAI.
    • Purportedly they didn’t need tons of H100s to get where they are. You are right that you still need pretty beefy hardware to run it, but nVidia’s stock was predicated on even bigger stakes. Reportedly one OpenAI training effort cost around $500 million, so a claim of training a “good enough” model for less than $10 million dramatically reduces the perceived value of nVidia. Note that while they are “way down”, they still have almost a 3 trillion dollar market cap. That’s still over 30 Intels or 12 AMDs. There’s just some pessimism because OpenAI and Anthropic, directly or indirectly, drove potentially a majority of nVidia’s revenue, and there’s a lot more uncertainty about those companies now.
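    Those ratios hold up on a quick back-of-the-envelope check (the market caps below are rough assumed approximations, not live quotes):

```python
# Back-of-the-envelope check of the "30 Intels or 12 AMDs" comparison.
# All market caps are rough assumed figures in USD, not live quotes.
nvidia_cap = 2.9e12    # ~ $2.9 trillion
intel_cap  = 0.09e12   # ~ $90 billion (assumed)
amd_cap    = 0.235e12  # ~ $235 billion (assumed)

print(f"NVIDIA ~ {nvidia_cap / intel_cap:.0f} Intels")
print(f"NVIDIA ~ {nvidia_cap / amd_cap:.0f} AMDs")
```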

    I also think this is on the back of a fairly long, relatively stagnant run. After folks saw the leap from GPT-2 to ChatGPT, they assumed a future of similar dramatic leaps, but have instead gotten increasingly modest refinements. So against a backdrop of a more “meh” sentiment about where things are going, you have this thing disturbing some presumed fundamentals in popular opinion.


    • 7-zip
    • VLC
    • OBS
    • Firefox did it, only to mostly falter to Chrome; but Chrome itself is largely Chromium, which is open source.
    • Linux (superseded all the Unix, very severely curtailed Windows Server market)
    • Nearly all programming language tools (IDEs, Compilers, Interpreters)
    • Essentially the entire command line ecosystem (obviously on the *nix side, but MS was pretty much compelled to open source PowerShell and their new Terminal to try to compete)

    In some contexts you aren’t going to have a lively enough community to drive a compelling product even when there’s enough revenue for a company to make a go of it, but to say ‘no open source software has achieved that’ is a bit much.








  • The problem in some teams is the respective audiences for the commit activity vs. the ticket activity.

    The people who engage on commit activity tend to have more common ground and shared sensibilities. They likely have to document their work and do code reviews as the code gets into the codebase, and other such activity.

    However, on the ticket side you are likely to get people involved that are really obnoxious to contend with. Things like:

    • Getting caught up in arguments over sizing where the argument takes more of your time than doing the request
    • Having to explain to someone who shouldn’t even care why the ticket was opened in the first place, despite all the real stakeholders immediately knowing it makes sense.
    • Work getting prioritized or descoped due to some political infighting rather than actual business need
    • Putting in extra work to unwind completed work due to some miscommunication in planning, and a project manager wanting to punish a marketing person for failing to properly get their request through the process
    • Walking an issue through the process to completion involves iterating through 7 states, with about 16 mandatory fields that are editable or not editable depending on the state. Sometimes the process gets stuck because you lack permission, thanks to some bureaucratic nonsense that runs counter to everyone’s real-world understanding.

    In a company with armies of project managers the ticket side is the side of dread even if the technical code side is relatively sane.


  • Generally speaking, these platforms are only flashable if they can boot a flash utility, which assumes that whatever prior firmware is running is at least in good enough shape to boot into an update environment.

    There are designs meant to be robust and recoverable even in the face of all this, but they’re relatively rare, and effectively unheard of in the laptop market. Even the emergency recovery environments that do exist may be more limited than you’d like for repairing this sort of thing.


  • Also, the kernel makes those variables immutable by default now, except for the well-known standard ones, so even buggy UEFI is mitigated nowadays. Just pointing out it came from a once-legitimate design, a consequence of “everything is a file in a monolithic file namespace”. On the one hand that’s bad if someone uses rm with all sorts of flags to overrule the “you don’t want to do this” protections in the utility. On the other hand, what you accidentally managed to do in Linux represented a problem that Windows malware could have exploited.
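    As a toy illustration of that protection (a simplified Python model, not the real kernel interface; the variable names are hypothetical):

```python
# Toy model of the kernel's efivars protection: non-standard UEFI
# variables are flagged immutable, so a naive recursive delete fails
# on them instead of silently wrecking the firmware's state.
# Illustrative sketch only, not the real efivarfs code.

class EfiVarStore:
    # Variables deliberately left writable (boot entries and the like)
    WHITELIST = {"BootOrder", "Boot0000", "BootNext"}

    def __init__(self, names):
        self.vars = {n: {"immutable": n not in self.WHITELIST} for n in names}

    def delete(self, name, force=False):
        # force stands in for clearing the immutable flag first
        if self.vars[name]["immutable"] and not force:
            raise PermissionError(f"{name}: Operation not permitted")
        del self.vars[name]

store = EfiVarStore(["BootOrder", "VendorMagicConfig"])
store.delete("BootOrder")                      # standard variable: allowed
try:
    store.delete("VendorMagicConfig")          # vendor variable: blocked
except PermissionError as e:
    print(e)
store.delete("VendorMagicConfig", force=True)  # explicit override required
```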


  • UEFI defines a structured way to share data with the OS as read-write variables, including the ability for the OS to create, modify, and delete variables that UEFI can see.

    However, some firmware used this facility to store values and then assumed the variables would always be there. The code would crash when it went to read a deleted variable, not knowing what to do. The thing is, deleting those variables is, per spec, a perfectly valid thing for the OS to do, but the firmware was buggy and the bugs weren’t caught, because normally an OS wouldn’t touch those variables except for a few standard popular ones, like boot order.
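    A toy sketch of that failure mode (the variable name is hypothetical, and Python stands in for the firmware’s code):

```python
# Toy sketch of the firmware bug described above: firmware stashes a
# value in a UEFI variable, and later reads it back without checking
# whether it still exists, even though the spec allows the OS to
# delete it at any time. Names here are hypothetical.

nvram = {"VendorBootCounter": b"\x07"}   # firmware-created variable

def os_cleanup(store):
    # Per spec, the OS may legitimately delete variables it can see.
    store.pop("VendorBootCounter", None)

def buggy_firmware_boot(store):
    # Buggy: assumes the variable is always present.
    counter = store["VendorBootCounter"]  # KeyError -> failed boot
    return counter

os_cleanup(nvram)
try:
    buggy_firmware_boot(nvram)
except KeyError:
    print("firmware crashed during boot")
```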


  • The mention of UEFI in this context likely means they’re thinking of a deletion recursing through sysfs and, by extension, deleting all visible UEFI variables, which on some firmware editions and versions leaves the machine unable to get through POST or into the setup menu.

    I vaguely recall this; the general issue was very bad firmware design, and it could leave it impossible to even reinstall a system. If you were industrious, you could have done the same thing from Windows, so Windows malware could also brick such platforms.

    Of course, rm has more safeguards on it, so you have to pass more flags and really, really be asking it to screw things up.
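    A toy model of those safeguards, mimicking GNU rm’s default --preserve-root behavior (illustrative Python only; the real rm is a C program with many more checks):

```python
# Toy model of GNU rm's safety flags: recursive deletion of "/" is
# refused unless the caller explicitly passes --no-preserve-root.
# Simplified illustration of the flag gauntlet, not a real deleter.

def rm(path, recursive=False, no_preserve_root=False):
    if path == "/" and recursive and not no_preserve_root:
        raise SystemExit("rm: it is dangerous to operate recursively on '/'")
    if path.endswith("/") and not recursive:
        # simplification: directories require -r
        raise IsADirectoryError(path)
    return f"deleted {path}"

print(rm("/tmp/scratch.txt"))
try:
    rm("/", recursive=True)   # refused by default
except SystemExit as e:
    print(e)
# Only an explicit override reaches the footgun:
print(rm("/", recursive=True, no_preserve_root=True))
```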


  • Usually I’ll see something mild or something niche get wildly messed up.

    I think a few times I managed to reproduce a query from a post, but I think they monitor for viral bad queries and very quickly massage them one way or another so they stop returning the ridiculous answer. For example, a fair number of times the AI overview seemed to simply be disabled for queries I found in these sorts of posts.

    You also have to contend with the reality that people can trivially fake these, and if the AI isn’t weird enough, they’ll inject weirdness to make their content more interesting.



  • I’d say their example is just an oversimplification to keep it understandable. Ultimately fuel-based energy has a lot of the same concerns. That natural gas facility costs money to keep viable even if, hypothetically, zero fuel were being burned in some given week. The power lines need repairs, maintenance, upgrades, and expansion sized to potential capacity, not actual usage. You have fixed costs alongside the marginal costs. The marginal costs certainly make sense to map directly to a usage-based rate, but fixed costs are significantly covered by those usage rates as well, rather than by bumping up the “basic charge” sort of line item on a power bill.


  • Seems like in such a case, it should be a different mix of base fixed monthly bill versus usage based rates, to more accurately reflect the cost structure in play.

    For example, in my area it’s about $15 a month even if you use absolutely no electricity; that’s just the base charge, ostensibly for the infrastructure required to deliver power, should you want it. It might make sense for that number to be increased rather than raising $/kWh rates.

    Suppose the counter would be that at least with the rate increase, folks in more dire circumstances can cut back to avoid the increasing costs (which might be a bit of a feedback loop…)
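    To make that concrete with assumed numbers (the $15 base charge is from the example above; everything else is hypothetical), compare folding an extra $6/month into the base charge versus into the rate:

```python
# Illustrative comparison (all numbers assumed): recover the same extra
# revenue from an average customer either via the base charge or via
# the $/kWh rate, and see what each does to a low-usage bill.

base_charge  = 15.00   # $/month base, as in the example above
rate         = 0.12    # $/kWh (assumed)
avg_usage    = 800     # kWh/month, average customer (assumed)
low_usage    = 200     # kWh/month, frugal customer (assumed)
extra_needed = 6.00    # extra $/month per average customer (assumed)

def bill(base, rate, kwh):
    return base + rate * kwh

# Option A: fold the extra into the base charge.
bill_low_a = bill(base_charge + extra_needed, rate, low_usage)

# Option B: fold it into the rate, sized so the *average* customer
# pays the same extra amount.
rate_b = rate + extra_needed / avg_usage
bill_low_b = bill(base_charge, rate_b, low_usage)

print(f"low-usage bill, higher base: ${bill_low_a:.2f}")
print(f"low-usage bill, higher rate: ${bill_low_b:.2f}")
```

    Under these assumptions the frugal customer pays less when the increase goes into the rate, which is exactly that counterpoint: usage-based increases leave room to cut back.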


  • jj4211@lemmy.world to memes@lemmy.world, re: “Can’t wait!” · 1 month ago

    Unfortunately, this time around the majority of the AI build-up is GPUs that are likely difficult to accommodate in a random build.

    If you want a GPU for graphics, well, many of them don’t even have video ports.

    If your use case doesn’t need those, well, you might not be able to reasonably power and cool the sorts of chips that are being bought up.

    The latest wrinkle is that a lot of that overbuying is likely going toward Grace Blackwell, which is a standalone unit. Ironically, despite being a product built around a GPU, and one that does need a video port, its video port is driven by a non-nVidia chip.


  • jj4211@lemmy.world to memes@lemmy.world, re: “Ai bubble” · 1 month ago

    This was after applying various mechanisms of the traditional kind. Admittedly there was one domain-specific strategy that wasn’t applied that would have caught a few more, but not all of them.

    The point is that I had a task that was hard to code up, but trivial yet tedious for a human. AI approaches can bridge that gap sometimes.

    In terms of energy consumption, it wouldn’t be so bad if the approaches weren’t horribly overused. That’s the problem now: 99% of usage is garbage. If it settled down to like 3 or 4% of current usage it would still be just as useful, but no one would bat an eye at the energy demand.

    As with a lot of other bubble things, my favorite part is probably going to be its life after the bubble pops, when the actually useful use cases remain and the stupid stuff dies out.