Started training a bot that tries to analyze and identify hate-speech on Twitter.

A few things became quite apparent after only a few days:

1. There are huge networks of (seemingly) fake accounts that like and retweet each other's posts. Someone is operating this at a _massive_ scale.

2. Reporting and banning fake accounts seems futile. You're fighting a hydra that spawns new accounts quicker than one can report them.

3. I'm feeling sick to my stomach just browsing through the logs.
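For context, here is a minimal sketch of the kind of tweet classifier the opening post describes — assuming a hypothetical labelled CSV (`labelled_tweets.csv` with `text` and `label` columns) and a plain TF-IDF + logistic-regression baseline, not whatever the actual bot uses:

```python
# Minimal sketch of a tweet-level hate-speech classifier (illustrative only).
# The CSV path, column names, and model choice are placeholder assumptions,
# not the setup used by the bot described above.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("labelled_tweets.csv")  # columns: text, label (0 = ok, 1 = hateful)
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42, stratify=df["label"]
)

# Word/bigram TF-IDF feeding a linear classifier: a common, simple baseline.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```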

@fribbledom This is becoming its own pretty major area of research. See people like Kate Starbird (University of Washington), Renée DiResta, or ... the chap at the Institute for the Future. Cluster and network analysis is a large part of their work, and IMO ***VASTLY*** more productive than semantic content analysis.

Content is NOT king, channel is.

(And Sumner Redstone knew this, he was practicing misdirection.)

I've done some incidental analysis myself, small scale.
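As a rough sketch of the cluster/network-analysis angle mentioned above — looking at who amplifies whom rather than at what the posts say — assuming a hypothetical `retweets.csv` edge list (one `retweeter,original_author` pair per line, no header); the community-detection method and the size cutoff here are illustrative choices, not anyone's actual pipeline:

```python
# Rough sketch: find tightly knit clusters of accounts that mostly amplify
# each other. The input file and thresholds are illustrative assumptions.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Build a directed graph: edge A -> B means account A retweeted account B.
G = nx.DiGraph()
with open("retweets.csv") as f:
    for line in f:
        src, dst = line.strip().split(",")
        G.add_edge(src, dst)

# Modularity-based community detection on the undirected projection groups
# accounts that densely retweet one another.
communities = greedy_modularity_communities(G.to_undirected())

for i, community in enumerate(communities):
    if len(community) < 20:  # arbitrary cutoff: skip small clusters
        continue
    internal = G.subgraph(community).number_of_edges()
    outgoing = sum(G.out_degree(n) for n in community)
    # Clusters whose members retweet almost exclusively within the cluster
    # are candidates for coordinated amplification, worth manual review.
    print(f"cluster {i}: {len(community)} accounts, "
          f"{internal / max(outgoing, 1):.0%} of retweet edges stay inside")
```

The point of the structural approach is that dense mutual amplification shows up regardless of what the content itself says.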

@fribbledom Sam Woolley is the IFTF guy, doing computational propaganda research.

Really good podcast episode here:

"The Future of Computational Propaganda" — tracking.feedpress.it/link/141. Sam Woolley recently joined Institute for the Future as a Research Director and was previously the D....

@fribbledom Kate Starbird, who is all kinds of awesome, has a number of YouTube videos, mostly lecture presentations.

This one was posted just a week ago, and should have her most recent work. (I'm queueing it up for watching right now).

youtube.com/watch?v=9gzo-1jK-T

Earlier content that's quite good dates to, IIRC, either December 2018 or December 2017 (I think it's '18). Her weariness and the mind-warping nature of the material were a big element of that series.

@fribbledom And while you can't go wrong with any Starbird material, this one gives an overview, touches on ethics, and mentions the psychological impacts of simply doing research in this area:
youtube.com/watch?v=iil6p-zd6q

@fribbledom There's also a very strong tie-in with what I've been discussing over the past day or so.

@fribbledom Tom Scott's Royal Society talk isn't based on rigorous research, but talks around many of these issues as well, both in the main talk:

"There Is No Algorithm for Truth"

invidio.us/watch?v=leX541Dr2rU

And the Q&A follow-up:
invidio.us/watch?v=ZIv4tqJNuxs

His views are based on experience and on some research and discussion with experts.

Also, NB, I somewhat disagree with his premise. Our chief algorithm for truth is Bacon's Novum Organum, a/k/a the scientific method.

@dredmorbius
There's no such thing as The scientific method: there are a bunch of methods that are used more or less well (as general guidelines) by a bunch of people who are more or less conscious that they are trying to follow a principled approach to finding answers 😉
This is in no way an algorithm for truth; it's just the next best thing we've found so far...

@silmathoron "Algorithm" in the loose sense of a process followed, rather than a precise mathematical process or rule.

Most of what's described as an "algorithm" in AI / ML / online tools is really closer to heuristics.

@dredmorbius I think heuristics might actually be a good qualifier for scientific methods 😂
