Kids in a Waymo is the best thing for autonomous vehicles. You can hand off driving your kids and grandparents around to somebody else. That would make adoption of the technology in America easier than I had thought possible.
@svicpodcast
18 days ago
^facts
@planetmuskvlog3047
17 days ago
I laughed, I cried… it was a great show. Lidar is totally unnecessary for self-driving.
@svicpodcast
17 days ago
Thanks bro! Ty for the comment! And I hope Lidar is not needed bc it's expensive as all heck!
@pik910
17 days ago
AI-generated inappropriate material may be harmful: it may disinhibit users, escalate fetishization, and serve to normalize things we do not want normalized. You do not want a potential victimizer intensely engaging in fantasies, or problematic-minded individuals grouping up. In many countries certain things are forbidden even in fiction for these reasons; there is a lot of step-family in corn. Prosecuting distribution sounds good to me, but I think they will go overboard with the severity of punishment. People who are into that are going to have huge collections; I would not be surprised to see harsher punishments for AI generations than the often laughable punishments (or lack thereof) for actual child abuse.

On a positive note, it might decrease the commercial value of producing CSAM. Many children are exploited for commercial motives. Realistically, many core pedophiles especially, people who are exclusively attracted to children, will seek out material. Much better if that material is fictional. There is also the controversial topic of potential therapeutic use: compartmentalizing their pedophilia to fantasy may or may not be a viable way for some patients to deal with it. (There is a large therapeutic demand from pedophiles seeking help, which should be supported by anyone who wants to increase the protection of children.) It is a seriously difficult topic, likely to get a populist treatment.

Some more caveats:
1) The chances are good that someone who has AI-generated CP also has CSAM.
2) Real children's likenesses are going to be used, including in training data.
3) You can't effectively control it, and a prompt does not equal intent: if you wanted to generate a young-looking adult, your prompt could look sus; you often can't tell a person's age from an image; and the age of a fictional character is often unclear. Disney princesses, for example, are lore-wise often underage, but generations involving them are not generally thought of or intended as such (the Aladdin princess is apparently 15).
@svicpodcast
17 days ago
Well written, well said, and well explained, 10/10 would read again. I'm ordering you to comment more on our videos. Love always, Josie - SVIC senior fetch reporter
@damienwade7848
18 days ago
First off, I'm not condoning this guy's behavior, but I've seen links online to cartoon porn where the characters were kids on those animated shows. Is that not generated child porn?
Comments: 8