AMD’s support for AI is just fine
This is quite untrue, especially if you do actual research and not just run other people’s models. For example, ROCm support is missing from many sparse autograd frameworks, e.g. pytorch_sparse, and there is no viable ROCm alternative to Nvidia’s MinkowskiEngine. This matters if you do any state-of-the-art convnets with attention-like sparsity; see the sketch below for the kind of workflow involved.
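To make concrete what kind of sparse autograd work is meant, here is a minimal sketch (not from the original discussion; shapes and values purely illustrative) of a typical MinkowskiEngine workflow: a sparse tensor over voxel coordinates fed through a sparse 3D convolution that participates in autograd. It assumes a CUDA-enabled MinkowskiEngine build, which is precisely the dependency that has no ROCm counterpart.

    # Illustrative sketch of a MinkowskiEngine-style sparse convolution.
    # MinkowskiEngine's GPU kernels target CUDA, so there is no drop-in
    # ROCm path for this kind of workload.
    import torch
    import MinkowskiEngine as ME

    # Toy voxelized point cloud: integer coordinates (batch index + xyz)
    # and an 8-dimensional feature per occupied voxel.
    coords = torch.IntTensor([[0, 0, 0, 0],
                              [0, 0, 1, 0],
                              [0, 1, 1, 1]])
    feats = torch.rand(3, 8)

    x = ME.SparseTensor(features=feats, coordinates=coords, device="cuda")

    # A single sparse 3D convolution; gradients flow through it like any
    # other autograd op.
    conv = ME.MinkowskiConvolution(in_channels=8, out_channels=16,
                                   kernel_size=3, dimension=3).to("cuda")
    y = conv(x)
    y.F.sum().backward()

Real networks stack many such sparse convolutions (plus pooling and normalization) over millions of points, which is where the lack of ROCm support for these libraries really hurts.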
Germany traditionally has been quite shocking in its practice of segregating children with disabilities into special Förderschulen. Whereas the U.S. has had the Individuals with Disabilities Education Act since the 1970s, Germany was basically forced into integration only recently, after the country signed the U.N. Convention on the Rights of Persons with Disabilities in 2009. And even then, they are taking their sweet time to integrate. See e.g. https://www.aktion-mensch.de/inklusion/bildung/hintergrund/zahlen-daten-und-fakten/inklusionsquoten-in-deutschland for how, currently, slightly less than half of German students with disabilities go to a regular school (the Inklusionsanteil, or inclusion rate).
See: https://en.wikipedia.org/wiki/English-language_spelling_reform
English has been the total outlier among (originally) European languages in having no body of authority over its spelling. Even the “reform” by Noah Webster never really caught on outside North America, even nearly 100 years later. And even more curiously, the somewhat authoritative Oxford English Dictionary disagrees with everybody else in its spelling (https://en.wikipedia.org/wiki/Oxford_spelling).
Nearly every single word in English that starts with a g followed by a soft ih/eh vowel is pronounced as a soft g, just a few:
That is patently untrue and blatant cherry-picking; it is already contradicted by the lexically matching word “gift” (and there are “giggle”, “gild”, “girl”, “git”, “give”, “gizmo”, etc.). See Wikipedia, which references a linguist studying this:
An analysis of 269 words by linguist Michael Dow found near-tied results on whether a hard or soft g was more appropriate based on other English words; the results varied somewhat depending on what parameters were used.[11] Of the 105 words that contained gi somewhere in the word, 68 used the soft g while only 37 employed its counterpart. However, the hard g words were found to be significantly more common in everyday English; […]
https://en.wikipedia.org/wiki/Pronunciation_of_GIF#Cause
Michael Dow is an associate professor of linguistics specializing in phonology, by the way.
and if you’re confused why others pronounce it with a soft G, they would seem to be simply more familiar with the English language 🤷♂️
Well, clearly you are not as “familiar with the English language” as you might think.
GIMP is a special case. GIMP is getting outdeveloped by Krita these days. E.g.:
https://gitlab.gnome.org/GNOME/gimp/-/issues/9284
Or compare with:
https://www.phoronix.com/news/Krita-2024-GPUs-AI
GIMP had its share of self-inflicted wounds, starting with a toxic mailing list that drove away people from professional VFX and the surrounding FilmGimp/CinePaint effort. When the GIMP people subsequently took over GEGL development from Rhythm & Hues, it took literally 15 years until it barely worked.
Now we are past the era of simple GPU processing and into diffusion models/“generative AI”, and GIMP is barely keeping up with simple GPU processing (like resizing; see above).