Everyone carries a set of basic assumptions about how the human system works, like what's acceptable public behavior, how the government should be run (which affects who you vote for), expectations about how certain people will act, etc.
These assumptions are based on the information we take in each day, like articles read, images viewed/scrolled past, and so on. We ARE our media diets, whether we want to admit it or not.
Our opinions form slowly and change slowly, and there are several well-known "bugs" in human thought patterns (confirmation bias chief among them) that make us prefer echo chambers and reject information that doesn't line up with beliefs we already hold.
AI enables fine-tuned control of exactly what assumptions people gain and maintain. What happens if YouTube silently starts recommending PragerU videos to all the millions of high schoolers that match the "impressionable" profile? What would that do to the basic expectations of the citizenry about what the "right" kind of government and tax system is?
What if the platform was used to convince everyone that annual slavery reparations must be made in perpetuity?
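To make concrete how little machinery that kind of targeting would take, here's a hypothetical sketch in Python; the profile label, channel match, boost factor, and scoring function are all invented for illustration, not anyone's actual system:

    # Hypothetical illustration of profile-targeted boosting. The
    # "impressionable" label, channel check, and 1.5x boost are invented.
    def adjusted_score(base_score: float, user_profile: str, channel: str) -> float:
        # A single silent branch is enough to tilt what an entire
        # audience segment gets shown.
        if user_profile == "impressionable" and channel == "PragerU":
            return base_score * 1.5
        return base_score

    # Same video, same base relevance, different viewers:
    print(adjusted_score(0.6, "impressionable", "PragerU"))  # 0.9, boosted
    print(adjusted_score(0.6, "skeptical", "PragerU"))       # 0.6, unchanged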
No one wants to admit that ads work on them, but we all know they work in general. So how much of what you know to be true was actually fed to you? In a world where everything we consume runs through relatively black-box algorithms (black-box to us outsiders, anyway), how much of our knowledge and beliefs are our own?
Guess I'm getting myself down the rabbit hole here. There are no easy answers anywhere. I think it's only a matter of time until we get our own Ozymandias.
>if YouTube silently starts recommending PragerU videos to all the millions of high schoolers that match the "impressionable" profile?
As someone who very rarely aligns with any of Dennis Prager's political dogma, that would be so much better than the "Top 10 Reasons The Earth Might Actually Be Flat" type that dominates recommendations now, albeit less profitable.
We should of course expect YouTube to behave in a way that prioritizes its own financial benefit.
Like you said, there are no easy answers anywhere. Understanding that all of YouTube's recommendations are effectively silent, programmatic, and optimized for profit as measured by industry-standard, necessarily shallow engagement metrics like clicks and views is a helpful start, but seemingly uncommon knowledge.
https://medium.com/@francois.chollet/what-worries-me-about-a...
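To illustrate mechanically what "optimized on shallow engagement metrics" means, here's a minimal hypothetical sketch in Python; the field names and weights are invented, since no platform publishes its real ranking function:

    # Minimal hypothetical engagement-ranking sketch; fields and
    # weights are invented, not any real platform's system.
    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        predicted_clicks: float         # modeled probability of a click
        predicted_watch_minutes: float  # modeled expected watch time

    def engagement_score(v: Video) -> float:
        # Only shallow proxies: nothing here measures whether the
        # content is accurate or good for the viewer.
        return 0.7 * v.predicted_clicks + 0.3 * (v.predicted_watch_minutes / 60.0)

    def recommend(candidates: list[Video], k: int = 10) -> list[Video]:
        # Silent and programmatic: the viewer never sees this rule.
        return sorted(candidates, key=engagement_score, reverse=True)[:k]

The point of the sketch: truth, accuracy, and viewer well-being never appear as inputs, so they can't be what the system optimizes for.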