







Is Google Autocomplete Evil?

Writing Comprehension Test for 11th Form Students

“Women shouldn’t have rights.” “Women shouldn’t vote.” “Women shouldn’t work.” How prevalent are these beliefs? According to a recent United Nations campaign, such sexism is dispiritingly common, which is why it published these sentiments on a series of posters. The source? These statements were the top suggestions offered by Google’s “instant search” tool when the words “women should not…” were typed into its search box. Google Instant is an “autocomplete” service – which, as the name suggests, automatically suggests letters and words to complete a query, based on the company’s knowledge of the billions of searches performed across the world each day.

The argument behind the UN campaign is that this algorithm offers a glimpse into our collective psyche – and a disturbing one at that. Is this really true? Not in the sense that the campaign implies. Autocomplete is biased and deficient in many ways, and there are dangers ahead if we forget that. In fact, there is a good case that you should switch it off entirely.

The greatest danger is the degree to which an instantaneous answer-generator has the power not only to reflect but also to remould what the world believes – and to do so beneath the level of conscious debate. Autocomplete is coming to be seen as a form of prophecy, complete with a self-fulfilling invitation to click and agree. Yet by letting an algorithm finish our thoughts, we contribute to a feedback loop that portentously reinforces untruths and misconceptions for future searchers.

Consider the case of a Japanese man who, earlier this year, typed his name into Google and discovered autocomplete associating him with criminal acts. He won a court case compelling the company to modify the results. The Japanese case echoed a previous instance in Australia where, effectively, the autocomplete algorithm was judged to be guilty of libel after it suggested the word “bankrupt” be appended to a doctor’s name. And there are plenty of other examples to pick from.

Do you know you can turn autocomplete off just by changing one setting? I’d recommend you give it a try, if only to perform a simple test: does having a computer whispering in your ear change the way you think about the world? Or, of course, you can ask Google itself. For me, typing “is Google autocomplete…” offered the completed phrase “is Google autocomplete a joke?” Unfortunately, the answer is anything but.

 

The Google autocomplete system can subconsciously influence our thought patterns – and so can the mass media. You are to write an essay about the effect of the mass media on people. The following questions can help guide your thoughts.

The mass media consist of radio, television, newspapers, magazines, movies, books and the Internet.

- Is there any information that you have seen in the last year that you think should not be in the mass media?

- What do you think should be done about this and why?

- Should there be laws against certain types of information being spread? If so, which types? If not, why?

 


Date: 2015-12-17

