• 0 Posts
  • 17 Comments
Joined 6 months ago
Cake day: July 10th, 2024






  • The level of your argumentation:
    Are you a firefighter or a medical doctor? If not, you’re obviously in favour of fires, death and disease.
    Why aren’t you donating all of your stuff to homeless people? Or are you happy all those people don’t have a home?
    Why aren’t you saving the world already???

    You know, demanding change and maybe showing some form of protest doesn’t mean you have to do those things exactly the way you would like to see them done, especially if those individual efforts wouldn’t change anything on the larger scale and would rather just create a bunch of problems in your own life.



  • The position with the vegan cats is basically indefensible.

    What do all organisms, including animals, need to properly maintain their metabolism?
    Nutrients.
    What are nutrients?
    A bunch of different chemicals.

    Depending on the specific organism, a different set of nutrients is required, also varying in amount, of course.

    All nutrients required by humans, at least, can be obtained or synthesized from non-animal compounds.

    From that simplified perspective, it’s absolutely rational to explore how we could feed animals like cats on a purely vegan diet.
    But it’s certainly nothing that should be left to the layperson alone; veterinary care is advisable if harm to the animal is to be avoided.


  • If we’re speaking of transformer models like ChatGPT, BERT or whatever: They don’t have memory at all.

    The closest thing that resembles memory is the accepted length of the input sequence combined with the attention mechanism. (If left unmodified, though, this leads to a quadratic increase in computation time as that sequence grows.) And since the attention weights are a learned property, it is in practice likely that earlier tokens of the input sequence get basically ignored the further they lie “in the past”, as they usually do not contribute much to the current context. (A rough sketch of the attention computation is at the end of this comment.)

    “In the past”: Transformers technically “see” the whole input sequence at once. But they are equipped with positional encoding, which incorporates spatial and/or temporal ordering into the input sequence (e.g., the position of words in a sentence). That way they can model sequential relationships such as those found in natural language (sentences), videos, movement trajectories and other kinds of contextually coherent sequences.
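    To make both points a bit more concrete, here is a minimal, self-contained NumPy sketch (a toy illustration of the general mechanism, not the implementation of any particular model; all names and numbers are made up): it builds the sinusoidal positional encoding from the original Transformer paper, adds it to some invented token embeddings, and runs a single head of scaled dot-product self-attention over the whole sequence at once. The (seq_len × seq_len) score matrix is where the quadratic growth with sequence length comes from.

    ```python
    import numpy as np

    def positional_encoding(seq_len, d_model):
        """Sinusoidal positional encoding: every position gets a unique
        pattern of sine/cosine values that is added to the token embeddings,
        which is how the model obtains any notion of ordering at all."""
        pos = np.arange(seq_len)[:, None]                 # (seq_len, 1)
        dim = np.arange(0, d_model, 2)[None, :]           # (1, d_model/2)
        angles = pos / (10000.0 ** (dim / d_model))
        pe = np.zeros((seq_len, d_model))
        pe[:, 0::2] = np.sin(angles)
        pe[:, 1::2] = np.cos(angles)
        return pe

    def self_attention(x):
        """Single-head scaled dot-product self-attention (projections omitted).
        The score matrix is (seq_len, seq_len); computing and normalizing it
        is the part that grows quadratically with sequence length."""
        d = x.shape[-1]
        scores = x @ x.T / np.sqrt(d)                     # (seq_len, seq_len)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
        return weights @ x                                # (seq_len, d)

    # Toy "embeddings" for 8 tokens with 16 dimensions each (made-up numbers).
    rng = np.random.default_rng(0)
    tokens = rng.normal(size=(8, 16))
    x = tokens + positional_encoding(8, 16)   # inject ordering information
    out = self_attention(x)                   # whole sequence processed at once
    print(out.shape)                          # (8, 16)
    ```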



  • Good that you got that diagnosis and know what you’re dealing with. I’m probably the wrong person to talk to about this and you probably know the following already, but just to make sure: apparently medication can help hugely. There are different agents, so it might take a while to find the right one and its dose. So if you want to, it won’t hurt to talk to a neurologist or psychiatrist about this.





  • Zacryon@feddit.org to Memes@lemmy.ml · Firefox + Ublock = 👑
    5 months ago

    I’ve read the announcement. Sounds reasonable and sufficiently private to me. So saying “Mozilla wants your data” seems misleading and like an overreaction to me. It might also help mitigate the arms race between privacy protection and tracking for ads and worse.

    “Mozilla is definitely going to try more scummy crap like this in the future.”

    How do you know that?

    Even if it did, there would still be alternatives. But right now, Firefox is the best browser with regard to privacy and security. It even met the minimum requirements of the German IT security authority, unlike other widely used browsers.