It violated their policies? What are they going to do? Give the LLM a written warning? Put it on an improvement plan? The LLM doesn’t understand or care about company policies.
It violated their policies? What are they going to do? Give the LLM a written warning? Put it on an improvement plan? The LLM doesn’t understand or care about company policies.
The other problem is that the mouse does not click properly. Apple is still stubbornly refusing to put a second physical button in their mice. For almost 20 years they’ve been selling mice that can emulate right clicking by using a touch surface, but it seems like you still need to hold the mouse funny to avoid accidentally doing the wrong click because your other finger is resting on the other side of the mouse when clicking. At least they got rid of the little ball that likes to scroll horizontally while you’re scrolling vertically and gets clogged easily.
The author misses the irony of leaving Twitter, a for-profit, centralized, social network for Bluesky, a different, for-profit, centralized social network. Hopefully it’s different this time.
Children also learn to reading and writing using copyrighted works, often from borrowed books that they aren’t paying for. Some corporations would love if everyone had to pay individually, maybe per use, to access copyrighted material, and New York Times and American pro sport leagues would love if they could actually own recollections of copyrighted material, but neither of these is good for normal people.
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
OpenAI is right. Almost everything of value on the internet is under copyright, and very little on the internet has clearly and unambiguously specified licensing information. If the software can only be trained on content that clearly allows training, the model isn’t going to “know” anything about anything since Steamboat Willie and it isn’t going to use broken dialects of older English from being limited to only public domain works that have been digitized and made available as public domain (reprints may not be public domain).
The article isn’t that clear, but the attacker cannot get Slack AI to leak private data via prompt injection directly. Instead, they tell it that the answer to a question is a fake error containing a link which contains the private data, and then when a user that can access the private data asks that question they get the fake error and clicking the link (or automatic unfurling?) causes the private data to be sent to the attacker.
They can most likely prevent further breakdown through software. If the meters and controls are functioning correctly, they can undervolt the CPU. But it’s not really a fix if that comes with a performance penalty. If it’s a bug where the CPU maxes out the voltage when idle so it can do nothing faster, that could be fixed with no performance penalty, but that seems unlikely.
I’ve heard speculation that this is exasperated by a feature where the CPU increases the voltage to boost clocks when running single core workloads at low temperatures. If that’s true, having less load or better cooling may be detrimental to the life of the processor.
Built bundles are not affected. The service is supposed to figure out which polyfills are required by a particular browser and serve different scripts. Because it’s serving different scripts, the scripts cannot be bundled or secured using SRI. That would defeat the purpose of the service.
Code pulled from GitHub or NPM can be audited and it behaves consistently after it has been copied. If the code has a high reputation and gets incorporated into bundles, the code in the bundles doesn’t change. If the project becomes malicious, only recently created bundles are affected. This code is pulled from polyfill.io every time somebody visits the page and recently polyfill.io has been hijacked to sometimes send malicious code instead. Websites that have been up for years can be affected by this.
I looked it up before posting. It’s illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.
I’ve noticed recently there are many reactionary laws to make illegal specific things that are already illegal or should already be illegal because of a more general law. We’d be much better off with a federal standardization of revenge porn laws than a federal law that specifically outlaws essentially the same thing but only when a specific technology is involved.
Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals so they’re constantly blocking things that may be in some situations inappropriate, to the point where those services are incapable of performing a great many legitimate tasks.
Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.
Modern AI is not capable of this. The accuracy for detecting nsfw content is not good, and they are completely incapable of detecting when nsfw content is allowable because they have no morals and they don’t understand anything about people or situations besides appearance.
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.
There’s a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”
This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.
“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim’s family asks for it,” Cruz said. “Elliston’s Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”
BS
It’s been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.
Real or fake pornography including unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it’s extra illegal.
Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat’s rules and would have been taken down:
- We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
- We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
- We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
Formerly in business website formerly known as Twitter.
ChromeOS and ChromiumOS are Linux.
The problem with ChromeOS (and Android) devices is that hardware support is usually only available in a fork of Linux which gets as little maintenance as possible for the five years. You end up with the choice of running and old kernel that supports the hardware but not some new software, a new kernel that supports new software but the hardware doesn’t work right, or taking over maintenance of the fork yourself. The same problem occurs with uncommon hardware on non-ChromeOS devices.
The five year policy is for ChromeOS, not ChromiumOS. ChromiumOS-based devices may have more or less support.
The hardware seems cool but most of the software features are things I would immediately turn off. Hopefully the Clippy+ branding gets changed.
That’s not what I mean. When you contribute content to Stack Exchange, it is licensed CC BY-SA. There are websites that scrape this content and rehost it, or at least there used to be. I’ve had a problem before where all the search results were unanswered Stack Overflow posts or copies of those posts on different sites. Maybe similar to Reddit they restricted access to the data so they could sell it to AI companies.
Why now? Other people have been profiting off of your Stack Overflow answers for years. This is nothing new.
If you partnered with them as a sponsor and they took your commission money and paid you using some of the money that you would have otherwise gotten anyway, you’d probably be angry.