• 1 Post
  • 609 Comments
Joined 2 years ago
Cake day: July 14th, 2023

  • kibiz0r@midwest.social to 196@lemmy.blahaj.zone · Rule · edited 2 days ago

    Welllll… everything in software development is trade-offs.

    It’s honestly pretty rare that one solution is unequivocally “better” than another, across every dimension you might care about (which includes non-technical things).

    The kinds of egregious defects you might think of as brazen incompetence or laziness are more often the result of everyone (technical and non-technical alike) refusing to actively pursue one side of a trade-off and hoping that the devs can just “nerd harder”.

    Technical constraints, as in the N64 example, can actually help avoid the “just nerd harder” fallacy, because they prompt serious discussions about what you can and can’t compromise on.

    Ironically, when we sit here as users and complain about games not being optimized in this way or that, we’re also refusing to engage in a conversation about trade-offs and insisting that devs just “nerd harder”.

    Edit: That’s not to excuse the blatant financialization of the industry, which prompts the whole “don’t trade off anything, just have them nerd harder” mindset… but to warn yall that even if the market weren’t ruled by greedy suits, we would probably still feel like old games managed to do more with less. Cuz well… trading away 500MB of bundle size so you could get better logging of resource management in production wasn’t really an option back then.





  • I don’t believe the common refrain that AI is only a problem because of capitalism. People already disinform, make mistakes, take irresponsible shortcuts, and spam even when there is no monetary incentive to do so.

    I also don’t believe that AI is “just a tool”, fundamentally neutral and void of any political predisposition. This has been discussed at length academically. But it’s also something we know well in our idiom: “When you have a hammer, everything looks like a nail.” When you have AI, genuine communication looks like raw material. And the ability to place generated output alongside the original… looks like a goal.

    Culture — the ability to have a very long-term ongoing conversation that continues across many generations, about how we ought to live — is by far the defining feature of our species. It’s not only the source of our abilities, but also the source of our morality.

    Despite a very long series of authors warning us, we have allowed a pocket of our society to adopt the belief that ability is morality. “The fact that we can, means we should.”

    We’re witnessing the early stages of the information equivalent of Kessler Syndrome. It’s not that some bad actors who were always present will be using a new tool. It’s that any public conversation broad enough to be culturally significant will be so full of AI debris that it will be almost impossible for humans to find each other.

    The worst part is that this will be (or is) largely invisible. We won’t know that we’re wasting hours of our lives reading and replying to bots, tugging on a steering wheel, trying to guide humanity’s future, not realizing the autopilot is discarding our inputs. It’s not a dead internet that worries me, but an undead internet. A shambling corpse that moves in vain, unaware of its own demise.




    1. Fuck AI
    2. This judge’s point is absolutely true:

    “You have companies using copyright-protected material to create a product that is capable of producing an infinite number of competing products,” Chhabria said. “You are dramatically changing, you might even say obliterating, the market for that person’s work, and you’re saying that you don’t even have to pay a license to that person.”

    3. AI apologists’ response to that will invariably be “but it’s sampling from millions of people at once, not just that one person”, which always sounds like the fractions-of-a-penny scene
    4. Fuck copyright
    5. A ruling against fair use for AI will almost certainly deal collateral damage to perfectly innocuous scraping projects like linguistic analysis, even despite the Copyright Office’s acknowledgement of the issue:

    To prevent both harms, the Copyright Office expects that some AI training will be deemed fair use, such as training viewed as transformative, because resulting models don’t compete with creative works. Those uses threaten no market harm but rather solve a societal need, such as language models translating texts, moderating content, or correcting grammar. Or in the case of audio models, technology that helps producers clean up unwanted distortion might be fair use, where models that generate songs in the style of popular artists might not, the office opined.

    6. We really need to regulate against AI — right now — but doing it through copyright might be worse than not doing it at all





  • It’s the #1 thing that drives me crazy about Linux.

    It seems obvious. You’ve got a Windows/Apple/Super key and a Control key. So you’d think Control would be for control characters and Windows/Apple/Super would be for application things.

    I can understand Windows fucking this up, cuz the terminal experience is such a low priority. But Linux?

    There are some projects, like Kinto and Toshy, which try to fix it, but neither works on NixOS quite yet.
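    For anyone who just wants the two keys swapped without a full remapping daemon, a minimal sketch — assuming an X11 session (or a Wayland compositor that honors XKB options) — uses a standard xkeyboard-config option:

    ```shell
    # Swap Left Win (Super) with Left Ctrl for the current session.
    # ctrl:swap_lwin_lctl is a standard xkeyboard-config option; check
    # `man xkeyboard-config` to confirm it's available on your system.
    setxkbmap -option ctrl:swap_lwin_lctl
    ```

    On NixOS the same option string can go in configuration.nix (the option path has changed across releases: `services.xserver.xkb.options` in recent ones, `services.xserver.xkbOptions` in older ones). Note this only swaps the modifiers wholesale — it doesn’t give you the macOS-style split where Super drives app shortcuts while Ctrl still sends control characters in the terminal, which is what Kinto and Toshy aim for.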