When does Social media hate speech become a real Threat???

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
I have pretty well trained FB's algorithm that I will only react to stuff about boats, wildlife, and history so that's mostly what it feeds me.

Today, however, some political stuff was thrown my way. Specifically, both Breitbart and Salon are posting (on FB, apparently unaware of the irony) that the FB outage reveals... drumroll please...

...the need to regulate Facebook, of course.

Gotta feed the chickens and head to the sailing center now, but I'm feeling motivated to produce some content for FB, so I plan to edit together some footage my wife and I got the other evening. Best manatee show I've ever seen, and that includes two full-blown manatee orgies.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
When does Social media hate speech become a real Threat???

The latest answer from the 11th Circuit: NOT when a maniac goes on a rampage.

We are deeply saddened by the deaths and injuries caused by Mr. Mateen's rampage, but we agree with the district court that the plaintiffs failed to make out a plausible claim that the Pulse massacre was an act of 'international terrorism' as that term is defined in the ATA. And without such an act of 'international terrorism,' the social media companies—no matter what we may think of their alleged conduct—cannot be liable for aiding and abetting under the ATA.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
When does Social media hate speech become a real Threat???

When we put Lindsey Graham in charge of it, according to the ACLU.
 

Unfortunately, blessed bipartisan unity was achieved yesterday.
 

Glad to see that Sen. Leahy is not a back door man.

I wonder if showing a wife or girlfriend's tits is consistent with Lindsey's "best practices" that social media outlets are expected to follow?
The EARN IT Act isn't any better warmed over
 

A bad bill from 2020 returns. An ill-conceived and dishonest measure known as the EARN IT Act is being revived in this year's legislature. We covered the bill when it was first introduced, calling it "the new FOSTA." Like the 2018 law FOSTA, the EARN IT Act would make dangerous changes to the digital landscape and legal structure under the guise of protecting children.

Formally titled the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, it was first proposed back in 2020 but failed to go anywhere. On Monday, Sen. Richard Blumenthal (D–Conn.)—one of the bill's sponsors—began tweeting about EARN IT again, spewing the same falsehoods about tech companies being indifferent to child pornography and how this bill is needed to hold them accountable.

The reintroduced EARN IT Act is now scheduled to be discussed at a Senate Judiciary Committee meeting on Thursday. It's being sponsored by a bipartisan group that includes some of the Senate's worst actors, including Lindsey Graham (R–S.C.), Dianne Feinstein (D–Calif.), and Josh Hawley (R–Mo.).

The bill has earned criticism from all sorts of civil liberties, sex worker rights, LGBT, and free speech groups including the American Civil Liberties Union, the Electronic Frontier Foundation, the Urban Justice Center, Hacking//Hustling, and Human Rights Watch.

Support for the EARN IT Act tends to come from groups opposed to all sex work and/or pornography (even when all parties are consenting adults).

...

The new bill "appears to be just a reintroduction of old EARN IT fixing none of the problems," tweeted Techdirt's Mike Masnick. "It misunderstands the problem. Creates a 'solution' that will make it harder to actually fight CSAM… and creates a ton of collateral damage in the process. Just bad all around."

Evan Greer—director of digital advocacy group Fight for the Future—points out that "more than HALF A MILLION people signed this petition to lawmakers opposing the EARN IT Act last Congress http://NoEarnItAct.org."
It still has bipartisan support. I still agree with the ACLU.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
I'm not understanding what that has to do with "Hate Speech".  Enlighten me.
The common thread is the censorship and the bipartisan attacks on Section 230. Oh, and protecting the cheeruns, of course.

Hate speech is one excuse, "sex trafficking" is another, but the SOLution is always the same: get rid of that terrible Section 230.

The excuses still don't fly with me because I like forums like this one, where the owner(s) are not held responsible for what any of us say. I think this place (and the rest of the internet) would be very different and much worse if the censors get their way.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
Gonzalez v. Google

Issue: Whether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.
SCOTUS is being asked whether Google keeps its Section 230 immunity when it recommends videos, which is different from just hosting them.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
When does Social media hate speech become a real Threat???

When TeamR Texans Decide The Fairness Doctrine Was Great After All
 

...

In the 1960s, a group of progressive scholars argued that the First Amendment does not merely prohibit the government from censoring private speech and press. In fact, they argued, it granted the government the affirmative power to control the mass media. In a capitalist system, they reasoned, the government must ensure that private media owners do not exclude unwelcome viewpoints, in order to protect the "democratic interest" in free speech. To this end, the scholars championed the Fairness Doctrine, right-of-reply mandates, and expansive applications of "common carriage" doctrine, which enable the government to force the inclusion of certain content.

Borrowing from the same playbook, Texas now argues that First Amendment values require, rather than prohibit, government interference with private speech. H.B. 20 declares that social media platforms are common carriers like telephone companies and thus are subject to onerous restrictions over who and what they may host. According to Texas, H.B. 20 serves the democratic interest in protecting the free exchange of ideas and information. But like the collectivist efforts that preceded it, Texas' misguided attempt to advance "First Amendment rights in the Lone Star State" violates private platforms' First Amendment rights to choose what speech they publish.

As we explained in a recently filed amicus curiae brief on behalf of the Cato Institute, courts have repeatedly rejected such regulations, and for good reasons.

First, as a federal district court correctly noted last year, the First Amendment protects social media platforms' discretion to publish or remove content. The Supreme Court established this right in the 1974 case Miami Herald Publishing Company v. Tornillo, when it struck down a Florida law that forced newspapers to print responses to their criticism of political candidates. The Court explained that this "right-of-reply" law infringed on newspapers' right to choose what content they publish.

The Supreme Court affirmed that First Amendment rights apply with full force to internet media in 1997's Reno v. ACLU. Other federal courts have since upheld the editorial rights of search engines and social media sites. These precedents doom the Texas law. The state can't eviscerate platforms' well-established First Amendment rights by arbitrarily calling them common carriers.

...
It wasn't.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
Treating social media companies as common carriers
 

A blatantly unconstitutional Texas social media law can start being enforced unless the Supreme Court steps in. The law was blocked by a U.S. district court last year after internet advocacy and trade groups challenged it. But a new order from the U.S. Court of Appeals for the 5th Circuit means Texas can begin enforcement of its social media law—and wreak havoc on the internet as we know it in the process.

NetChoice and the Computer and Communications Industry Association (CCIA)—the groups that filed the lawsuit against the Texas social media law—have now submitted an emergency petition to the Supreme Court asking it to intervene. Meanwhile, Texas and a slew of other states with Republican leaders are advocating for the law, which would treat large social media platforms like common carriers (such as railroads and telephone companies) that have a legal obligation to serve everyone.

...

the idea that social media should be treated like common carriers has become a popular (if incredibly short-sighted and weird) conservative talking point.

Of course, a phone company or a telegraph company—where information is communicated privately between two (or a small number) of people—is nothing like social media, where speech by one user can reach all users. In arguing to treat social media like common carriers, conservatives could make these platforms havens for content that makes other users flee and repositories of things—like frank discussions and depictions of sexuality—that conservatives in other realms are fighting to suppress.

For a detailed and multifaceted case against treating social media platforms like common carriers, see this post from George Mason University law professor and Volokh Conspiracy contributor Ilya Somin.

...

A dangerous precedent? In a brief filed with the Supreme Court in opposition to H.B. 20, the American Civil Liberties Union (ACLU) and the ACLU of Texas suggest that allowing the law to stand could set a dangerous precedent.

H.B. 20 "challenges core pillars of the freedoms of speech and the press" and "while Texas has chosen to target new digital platforms today, its defense of HB 20 offers no limiting principle that would prevent it from turning  its attention to the most traditional of media tomorrow," they suggest.

TechFreedom also portends reverberating effects. "No one—no lawyer, not [sic] judge, no expert in the field; not even the law's own sponsors—knows what compliance with this law looks like," said Corbin K. Barthold, director of appellate litigation at TechFreedom, in a statement.

"Indeed, HB 20 is designed to generate as much litigation as possible. Any social media user in Texas may sue to undo any act of content moderation," notes Barthold. "Each lawsuit will contend that the real basis for the content moderation was the poster's 'viewpoint.' Take a ban on beheading videos. Is that a viewpoint-neutral policy against a certain type of content? Or is it at heart a viewpoint-based anti-ISIS rule? Such questions are infinite, and, under HB 20, they'll be litigated."

...
The bolded part? Seems pretty anti-ISIS to me.

 

Burning Man

Super Anarchist
10,764
2,196
Back to the desert
To answer my original question in the OP topic... I'd say this qualifies.

https://www.nytimes.com/2022/05/19/technology/mass-shootings-livestream-online.html

Yet even as Facebook expunged 4.5 million pieces of content related to the Christchurch attack within six months of the killings, what The Times found this week shows that a mass killer’s video has an enduring — and potentially everlasting — afterlife on the internet.

“It is clear some progress has been made since Christchurch, but we also live in a kind of world where these videos will never be scrubbed completely from the internet,” said Brian Fishman, a former director of counterterrorism at Facebook who helped lead the effort to identify and remove the Christchurch videos from the site in 2019.

To outwit some of the large platforms, which generally rely on artificial intelligence to take down toxic content, people have added watermarks or filters to alter the clips of the Christchurch attack or changed playback speeds of the recording, The Times found. Others started posting the web addresses of the videos instead of directly uploading the clips, to avoid detection by algorithms that match shooting videos to previously known versions. Other people uploaded the Christchurch videos to less popular hosting platforms with fewer content moderation rules.
So not only is "social media" actually creating these murderers like the Buffalo shooter who want to emulate other attacks but millions of people around the world are actively trying to evade any attempt to remove this sort of disturbing content.  I think this is what I find almost as disturbing as the shootings themselves, is that there are people out there who WANT to not only see this stuff but want to share it and distribute it widely.  And I don't think everyone sharing it is just white supremacist sympathizers.  

Where does "free speech" end and a crime begins when you share this sort of content?  It's one thing to pressure hosting platforms like Farcebook and Twatter to take stuff down.  But maybe it's time there are real world consequences for what people do online.  I would would an example out of some of these shitturds who actively evaded the site's rules to promulgate hate and violence like this and prosecute the fuck out of them.  

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
Treating social media companies as common carriers
 

A blatantly unconstitutional Texas social media law can start being enforced unless the Supreme Court steps in. ...
A divided SCOTUS did step in. For now, the subject companies are not common carriers.
 

The Supreme Court has temporarily blocked enforcement of a Texas tech law. The law treats large social media platforms like common carriers and prohibits them from making independent decisions about content moderation.

Specifically, the law bans large tech companies from viewpoint-based blocking, restricting, or editorializing about social media content—a purview large enough to prevent platforms from moderating even the types of content that few would object to a private company limiting. It also requires them to set up an appeal system for users whose content is removed.

...

The majority—a conservative-liberal mix that included Chief Justice John Roberts and Justices Stephen Breyer, Sonia Sotomayor, Brett Kavanaugh, and Amy Coney Barrett—did not offer reasoning for their ruling.

A dissent penned by Justice Samuel Alito, joined by Clarence Thomas and Neil Gorsuch, can be found here. Justice Elena Kagan also dissented.
I haven't read what Alito had to say yet but Thomas has previously said that some tech companies should be treated as common carriers.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
So not only is "social media" actually creating these murderers like the Buffalo shooter who want to emulate other attacks but millions of people around the world are actively trying to evade any attempt to remove this sort of disturbing content.  I think this is what I find almost as disturbing as the shootings themselves, is that there are people out there who WANT to not only see this stuff but want to share it and distribute it widely.  And I don't think everyone sharing it is just white supremacist sympathizers.  

Where does "free speech" end and a crime begins when you share this sort of content?  It's one thing to pressure hosting platforms like Farcebook and Twatter to take stuff down.  But maybe it's time there are real world consequences for what people do online.  I would would an example out of some of these shitturds who actively evaded the site's rules to promulgate hate and violence like this and prosecute the fuck out of them.  
The bolded question is pertinent here on this forum, where lots of people want to take advantage of horrific crimes to call me (and sometimes you) a murderer. They have to discuss those crimes to do it. Prosecute the fuck out of them for what, exactly?

 

LB 15

Cunt
The bolded question is pertinent here on this forum, where lots of people want to take advantage of horrific crimes to call me (and sometimes you) a murderer. They have to discuss those crimes to do it. Prosecute the fuck out of them for what, exactly?
I have seen plenty call you a douchebag but who has called you a murderer? Perhaps 'Paranoid Tom' would work for your next amendment name change.

 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
When does Social media hate speech become a real Threat???

When Japan might lock you up for a year and/or fine you a couple grand for it?

This week, a Japanese law went into effect making it a jailable offense to be a jerk on the Internet.

As reported by The Japan Times, the legislation, passed in June, strengthens the country's punishment for "online insults." According to CNN, "Under Japan's penal code, insults are defined as publicly demeaning someone's social standing without referring to specific facts about them or a specific action…The crime is different to defamation, defined as publicly demeaning someone while pointing to specific facts."

Previously, the penalty for online offensiveness was either a fine of less than ¥10,000 (about $73 USD) or fewer than 30 days in prison. Under the new law, which went into effect Thursday, the penalties increased to as much as a year in prison and a fine of up to ¥300,000 (about $2,200 USD). It also extended the statute of limitations from one year to three.
...
Speech is not violence, and attempts to regulate it as if it were will only empower the regulators at the expense of the powerless.

I couldn't agree more with that last sentence.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
Tom, won't you help Miss Lindsey help Miss Lindsey help America? Please, donate generously.

When does Social media hate speech become a real Threat???

When Lindsey Graham gets involved, again.



Well, and Trump, of course. (With apologies for posting more Koch-$pon$ored Trump cheerleading, of course.)
No, what did I post to make you think I might?

BTW, if you won't frame your gossip about me as a response to something I actually said, I'll do it for you.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
The Supreme Court holds the internet’s fate in its hands, and you should be terrified

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

The Supreme Court’s decision to review two lower court rulings, including an appellate case from the U.S. 9th Circuit Court of Appeals in San Francisco, marks the first time the court has chosen to review Section 230, after years in which it consistently turned away cases involving the law.

That may not reflect a change in its view of the legal issues, so much as a change in how society views the internet platforms at the center of the cases — Google, Facebook, Twitter and other sites that allow users to post their own content with minimal review.

“We’ve been in the midst of a multi-year tech-lash, representing the widely-held view that the internet has gone wrong,” says Eric Goldman, an expert in high-tech and privacy law at Santa Clara University Law School. “The Supreme Court is not immune to that level of popular opinion — they’re people too.”

Disgruntlement with the big tech platforms stretches from one side of the political spectrum to the other.
...
It should be obvious that laws purporting to open online platforms to “neutral” judgments about content do nothing of the kind: They’re almost invariably designed to favor one color of opinion over others.
...
That brings us back to the California case before the Supreme Court. It was brought against Google, the owner of YouTube, by the family of Nohemi Gonzalez, an American who was killed in an attack by the militant group Islamic State, also known by the acronym ISIS, in Paris on Nov. 13, 2015.

The plaintiffs blame YouTube for amplifying the message of ISIS videos posted on the service by steering users who viewed the videos to other videos either posted by ISIS or addressing the same themes of violent terrorism, typically through algorithms. The plaintiffs assert that YouTube has been "useful in facilitating social networking among jihadists" and that it knew the content in question was posted on its site.

The legal system’s perplexity about how to regulate online content was evident from the outcome of the Gonzalez case at the 9th Circuit. The three-judge panel fractured into issuing three rulings, though the effective outcome was to reject the family’s claim about algorithmic recommendations. The lead opinion by Judge Morgan Christen found that Section 230 protected YouTube.

But one judge, Marsha Berzon, concurred in that opinion only because she concluded that precedent prevented the appeals court from narrowing the legal immunity granted by Section 230, but said she would “join in the growing chorus of voices calling for a more limited reading of section 230.”

The third judge, Ronald M. Gould, held in a dissenting opinion that Section 230 was “not intended to immunize” online platforms from liability for “serious harms knowingly caused by their conduct.”

In legal terms, Section 230 itself isn’t the subject before the court. The question the justices are asked to resolve is whether YouTube and other platforms move beyond the role of mere publishers or distributors of someone else’s content when they make “targeted recommendations” steering users to related content, including when they do so via automated algorithms.

The power of such recommendations to magnify the impact of online content has been acknowledged before.
...

Gonzalez v. Google LLC

and

Twitter, Inc. v. Taamneh

Hmm... I'm really glad the owners of this place aren't generally responsible for what we say here. I'm a big Section 230 fan.

But this line from the 9th Circuit opinion under review caught my attention:

Plaintiffs also claim that Google placed paid advertisements in proximity to ISIS-created content and shared the resulting ad revenue with ISIS.

What was Google's corporate motto again? I forget.

Anyway, I think I'll wander over to FB, where the Facebot will decide what to show me.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
Abortion Rights Advocates Ask DOJ To Defend Section 230

...
Section 230 has received its share of criticism: During his administration, former President Donald Trump and his allies routinely threatened to modify or simply repeal it. Democrats have also been quite critical, with President Joe Biden saying in September that Congress should "get rid of special immunity for social media companies" and "hold social media platforms accountable for spreading hate and fueling violence."


Early next year, the Supreme Court will consider whether Section 230 also applies to recommendations made by a platform's algorithms. The case, Gonzalez v. Google, stems from a 2015 ISIS attack and claims that YouTube's recommendation algorithm helped the terror group spread its message by recommending its videos. Section 230 may grant platforms immunity from user-generated content, the suit argues, but what if the platform then recommends that content to other users?


Last week, a letter jointly signed by the Chamber of Progress, a center-left technology industry group, and Advocates for Youth, a D.C.-based sexual health advocacy nonprofit, encouraged Attorney General Merrick Garland to "submit a brief in support of defendants" on behalf of the DOJ. The defendant is Google, owner of YouTube, which the Chamber of Progress lists as one of its "corporate partners."

...

"Without Section 230," the letter worries, "online services might be compelled to limit access to reproductive resources, for fear of violating various state anti-abortion laws."
...
Biden and Trump are still wrong on this issue. I hope Garland forgets who he works for when reading that letter.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
I just checked and this post is still up on my page:

EvaCorruptOaf.jpg
Hah! Still there three years later.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
A divided SCOTUS did step in. For now, the subject companies are not common carriers.



I haven't read what Alito had to say yet but Thomas has previously said that some tech companies should be treated as common carriers.
NetChoice v. Paxton may be back.

...
For years, courts have held that governments may not force private actors to host others’ speech. That is why Texas’ law is unconstitutional, according to the trade organizations petitioning the Court on Thursday.


“HB20 infringes the core First Amendment rights of Petitioners’ members by denying them editorial control over their own websites, while forcing them to publish speech they do not wish to disseminate,” wrote the Computer & Communications Industry Association and NetChoice in their filing.


Earlier this year, a federal appeals court had sided against the two groups. In September, the Fifth Circuit Court of Appeals said in a ruling: “We reject the idea that corporations have a freewheeling First Amendment right to censor what people say.”


That same month, the state of Florida asked the Supreme Court to hear a similar case involving one of its laws. At the center of that battle is SB 7072, which allows political candidates to sue social media companies if they are blocked or removed from online platforms for more than 14 days. NetChoice and CCIA are also involved in that case, having challenged Florida’s law after it was signed last year.


Multiple Supreme Court justices have expressed interest in hearing cases that deal with content moderation, citing the enormous role that social media now plays in democratic discourse.
...

I also reject the idea that corporations have a freewheeling First Amendment right to censor stuff on their property.

They have a freewheeling property right to censor stuff on their property. That's why HB20 is wrong.
 

Pertinacious Tom

Importunate Member
62,898
2,017
Punta Gorda FL
When does Social media hate speech become a real Threat???

When it might have some tangential relationship to a federal judge.

...
The JSPA's fact ban works in two main ways. First, judges and their immediate family members can send users or online services requests to take down posts of prohibited information. This information does not need to be about the judge, or even foreseeably related to the judge's security, to be the subject of a takedown request. Under the JSPA, a judge's mother may send a takedown request to censor the birthdate of her brother (the judge's uncle), and a judge's daughter may censor information about the law school her son goes to (the judge's grandson). After receiving a takedown request, the user or online service has 72 hours to remove the banned content.

Second, online services have an implied duty to monitor for content similar to the content they have previously been asked to remove on any website or subsidiary website they own. Any online service which does not fully comply with the JSPA's requirements may be sued for injunctive relief and money damages.

No judge should have to fear for their lives as they defend the rule of law. Yet the JSPA will give the federal judiciary and their extended families power to infringe those civil liberties themselves in several ways. The Supreme Court established in Miami Herald v. Tornillo (1974) that private companies have a First Amendment right to choose the third party content they host. This means the government is barred from forcing private companies to remove lawful third-party material—like facts about judges their users post—or forcing them to monitor their own websites for content the government disfavors.
...

I get it. I sold a new Precision 185 sailboat to a federal judge. The location of his home was unpublished. He gave it to me so I could deliver the boat, but I had looked out of curiosity and couldn't find it online.

They're kept secret for good reasons.

When my mom became a prosecutor for the state of FL, the crimes were sometimes pretty funny. Particularly the pigeon that fled the scene. Then she moved up in the ranks and the crimes were not funny at all and our house grew a fence and iron bars over windows and doors.

But the sweep of this law seems awfully broad. The uncle's birthday?
 

