John talks about the importance of choosing your own path before the algorithms decide it for you, and how Twitter recently discovered its photo-cropping algorithms are not only racist but also ageist, ableist, and Islamophobic.
This is Day 9 of the August Daily Episode Challenge. John got suspended from Twitter for a week.
Email john@emodojo.com or text/call 14054403330
Transcript
WEBVTT
1
00:00:00.000 --> 00:00:06.320
Hey, what's up? It's John. Welcome back to Emo Dojo for Monday
2
00:00:06.360 --> 00:00:12.470
August ninth. This is the ninth
day of my daily episode challenge for August,
3
00:00:12.669 --> 00:00:17.269
so thanks for hanging in there.
I know they can't all
4
00:00:17.269 --> 00:00:21.940
be gems, but, you know, whatever. I'm getting in here, getting it done,
5
00:00:22.019 --> 00:00:27.140
doing the editing, uploading, all that
shit, so that counts for
6
00:00:27.379 --> 00:00:32.740
something. I was just looking at an
article about Twitter. Right? I like
7
00:00:32.899 --> 00:00:37.409
Twitter. I like it; it's nice and
concise. You just gotta say what you got
8
00:00:37.530 --> 00:00:41.530
to say and get off. Often
it seems like a shit river. It's
9
00:00:41.609 --> 00:00:46.170
nothing but a bunch of garbage flowing
by, and you can comment on it and
10
00:00:46.329 --> 00:00:50.359
get stinky yourself if you want,
but it's a good way to craft headlines
11
00:00:50.399 --> 00:00:55.240
and things like that. I got
kicked off Twitter for a week because I
12
00:00:55.479 --> 00:01:00.600
think some terrorists got mad when I
said something like shoot them if they get
13
00:01:00.600 --> 00:01:03.950
out of line. Yeah, so
I had to delete that post and I
14
00:01:04.030 --> 00:01:10.310
can't use any of the
buttons on Twitter for a week. Wow.
15
00:01:10.989 --> 00:01:15.109
So here's what Twitter did over the
weekend. They were at Def Con,
16
00:01:15.269 --> 00:01:19.060
which is a hacker conference. They
held a breakout session and had some
17
00:01:19.140 --> 00:01:25.859
hackers just kind of test their system
and go through their photo-cropping algorithms to see
18
00:01:25.859 --> 00:01:30.579
if they could find any errors, omissions, or any fuckery, generally speaking.
19
00:01:32.219 --> 00:01:38.450
And of course they did. So,
as we've seen in the past, their
20
00:01:38.489 --> 00:01:42.329
algorithm, their photo-cropping algorithm,
was racist. It would always leave the
21
00:01:42.450 --> 00:01:46.129
white people in and crop out the black people.
22
00:01:46.730 --> 00:01:53.439
Well, they also found out it's
ageist, ableist, and Islamophobic. So Twitter
23
00:01:53.719 --> 00:01:57.760
will crop you out if you have
white hair or gray hair, if you
24
00:01:57.879 --> 00:02:02.549
have a head scarf, and if you're
low in the frame. I kind of get the low
25
00:02:02.709 --> 00:02:08.990
one. If it's cropping faces
in and the majority of the faces are
26
00:02:09.030 --> 00:02:12.509
in the center, I can see
why it would crop there. But it's
27
00:02:12.550 --> 00:02:19.740
saying that it's ableist because that will
exclude somebody sitting in a wheelchair.
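
A rough sense of how a saliency-driven auto-crop works, as a minimal sketch (my assumption about the general approach, not Twitter's actual code): slide a crop window over a saliency score map and keep the highest-scoring spot. Any bias lives upstream in the model that produced the scores.

```python
import numpy as np

def saliency_crop(saliency: np.ndarray, crop_h: int, crop_w: int) -> tuple[int, int]:
    """Return the top-left corner of the crop window whose total
    saliency is highest. `saliency` is an (H, W) score map produced
    by some upstream model."""
    H, W = saliency.shape
    best_score, best_pos = float("-inf"), (0, 0)
    # Slide the crop window across the image, keeping the best spot.
    for y in range(H - crop_h + 1):
        for x in range(W - crop_w + 1):
            score = saliency[y:y + crop_h, x:x + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

# e.g. a 2x2 crop on a 4x4 map with one bright spot at the bottom right:
demo = np.zeros((4, 4))
demo[3, 3] = 1.0
print(saliency_crop(demo, 2, 2))  # (2, 2)

# If the upstream model scores one face 0.9 and another 0.4 because of
# skewed training data, the 0.4 face simply falls outside the winning
# window. The crop loop never "decides" anything about people; the
# bias lives entirely in the score map it is handed.
```
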
28
00:02:20.099 --> 00:02:23.379
So that's the only one I have a
question about. The other ones are weird,
29
00:02:23.379 --> 00:02:29.020
right, like head scarf. Huh? Yeah, it'll leave you out if
30
00:02:29.060 --> 00:02:32.009
you got a head scarf on or
gray hair. So why was I mentioning
31
00:02:32.090 --> 00:02:38.729
Twitter? Well, not because I
got kicked off, but because of the algorithms.
32
00:02:38.810 --> 00:02:44.439
It was one of the chapters in
Yuval Harari's book 21 Lessons for
33
00:02:44.759 --> 00:02:49.800
the 21st Century. He dedicated
one whole chapter to the algorithms. And
34
00:02:50.039 --> 00:02:53.120
the creepy thing, to me, is not
that robots are going to rule the world
35
00:02:53.159 --> 00:02:58.789
or anything like that. It's that
you get pigeonholed into the wrong thing in
36
00:02:58.909 --> 00:03:02.990
life. Did you ever see the
Bee Movie with Seinfeld? I love that
37
00:03:04.110 --> 00:03:08.870
movie, but it also shows that
kind of regimented idea that you can be
38
00:03:08.990 --> 00:03:12.270
this or you can be that or
whatever, and you stick with it and
39
00:03:12.430 --> 00:03:15.379
that's all you get to do.
And that's what the algorithms in social media
40
00:03:15.500 --> 00:03:23.060
do right now. They're forming opinions
of you and reinforcing those opinions back into a loop.
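
A toy sketch of that reinforcement loop (my illustration, not any platform's real ranking code): the feed shows its current favorite, clicks push that favorite higher, and nothing ever lifts the options you were never shown.

```python
import random

# Toy feedback loop: the feed shows whatever category currently scores
# highest, and a click pushes that category's score even higher.
scores = {"sports": 1.0, "music": 1.0, "politics": 1.0}
random.seed(42)  # reproducible run for the example

for step in range(20):
    shown = max(scores, key=scores.get)  # the feed picks its current favorite
    if random.random() < 0.6:            # you click what's in front of you, sometimes
        scores[shown] += 0.5             # the click reinforces the ranking
    # Categories you were never shown can never earn a click.

print(scores)  # one category runs away with it; the rest starve
```
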
41
00:03:23.060 --> 00:03:28.409
And so what Harari
warns about, mostly, is that
42
00:03:28.569 --> 00:03:30.770
if you don't know what you want
to be in life, if you aren't
43
00:03:30.810 --> 00:03:35.050
solid on your plan for the future, and who really is, right? A
44
00:03:35.370 --> 00:03:38.129
lot of people don't really know what
they want to do next week, let
45
00:03:38.129 --> 00:03:40.879
alone next year. But if you
don't have some concept of what you are
46
00:03:42.000 --> 00:03:47.120
and what you offer to
life, the algorithms are going to start
47
00:03:47.159 --> 00:03:51.800
putting you in the wrong categories, and one
day you won't be able to get out
48
00:03:51.840 --> 00:03:54.080
of it. Your choices will be
limited. You won't see the same job
49
00:03:54.199 --> 00:04:00.150
offerings, you won't see the same
house offerings or food options. This sounds
50
00:04:00.229 --> 00:04:04.789
like crazy talk, but it's
already happening. Imagine self-driving cars of
51
00:04:04.870 --> 00:04:09.870
the future that refuse to let you
get off the freeway at a particular exit
52
00:04:09.990 --> 00:04:16.779
because your credit score isn't high enough. Imagine an ambulance lowering your priority because you
53
00:04:16.860 --> 00:04:23.939
haven't paid your healthcare bill. So
now that everything's connected, the algorithms mean
54
00:04:24.019 --> 00:04:27.889
something, right? They're always working and
they're always trying to find a place for
55
00:04:27.930 --> 00:04:30.610
us. We can't just swim around
in the spectrum anymore. It's
56
00:04:30.649 --> 00:04:35.089
a binary thing that's trying to make
us either/or. So, to that end,
57
00:04:36.329 --> 00:04:40.800
all I'm saying is think about what
you want to be and contribute to the
58
00:04:40.959 --> 00:04:48.720
planet as life goes on, and
be careful what you choose, because
59
00:04:48.839 --> 00:04:56.350
the algorithms are going to reinforce that. And if you haven't chosen, maybe
60
00:04:56.509 --> 00:04:59.910
stay away from the algorithms as much
as possible. I know it's hard.
61
00:04:59.910 --> 00:05:02.709
If you've got an ATM card or cell
phone or whatever, you're
62
00:05:02.709 --> 00:05:09.459
already being tracked and cataloged and labeled
and quantified. So, to the extent that
63
00:05:09.579 --> 00:05:13.579
you want some kind of free will, you better start thinking of those choices
64
00:05:13.740 --> 00:05:16.779
now so that you can get the
algorithms to work in your favor. I
65
00:05:17.100 --> 00:05:23.050
think those people that do decide what
they want to do with themselves will
66
00:05:23.089 --> 00:05:27.730
benefit greatly because the algorithms will actually
help. They can't help but
67
00:05:28.009 --> 00:05:32.569
help if you've already chosen that thing
that you're into. So that's why it
68
00:05:32.649 --> 00:05:39.439
comes back to Twitter and their algorithms
being so racist and ableist and ageist.
69
00:05:40.040 --> 00:05:43.839
They're already working against us, the
algorithms, that is, not the people
70
00:05:44.000 --> 00:05:46.199
that made the algorithms. They did
it unwittingly. The thing is, if
71
00:05:46.199 --> 00:05:50.470
you train AI with your own personal
biases, the AI is going to have
72
00:05:50.589 --> 00:05:55.230
your own personal biases. So if
you got a bunch of white dudes coding
73
00:05:55.310 --> 00:06:00.509
the AI, guess what you're going to get.
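
That "bias in, bias out" point fits in a few lines of toy code (hypothetical numbers, nothing from the actual audit): if the training signal is skewed, the learned scores are skewed by exactly the same amount.

```python
# Hypothetical training data: annotators clicked group A's faces far
# more often than group B's, so the labels are skewed from the start.
clicks = {"group_a_face": 90, "group_b_face": 10}
total = sum(clicks.values())

# "Training" here is just turning click counts into saliency scores;
# the learned scores inherit the annotators' preference wholesale.
saliency_score = {face: count / total for face, count in clicks.items()}

print(saliency_score)  # {'group_a_face': 0.9, 'group_b_face': 0.1}
# The model didn't invent the bias; it faithfully automated it.
```
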
74
00:06:00.949 --> 00:06:06.300
So, if you want to read more,
just fucking Google "Twitter racist algorithm" and
75
00:06:06.459 --> 00:06:10.699
go to the News section. You
can read that article, and of course I
76
00:06:10.740 --> 00:06:15.779
highly recommend Harari's book 21 Lessons
for the 21st Century. I hope
77
00:06:15.779 --> 00:06:31.410
you have an excellent day. I'll
talk to you tomorrow. And now back
78
00:06:31.410 --> 00:06:31.850
to the wall.