The term "disinformation" today seems to be out of fashion, replaced by the term "fake news" decidedly more fashionable. While the English-speaking term indicates the mezzo used to influence public opinion, talk about "misinformation" puts the emphasis on finality intrinsic in the manufacture of false news. At this point we should forward the correct definition of what is meant by "true" or "false" news, because often, to make likely a false news, is enriched with true facts, which can however be related to times, places and people different from those reported. Furthermore, the story of the "facts" is not always totally separable from the opinion of the person telling, since the construction of a story involves the use of adjectives and adverbs that can not be neutral. But if we had to divest the facts of the story, it would be very difficult to be able to thrill the events of the world.
These reflections led, in the mid-nineteenth century, to the coining of the term "social engineering": the formalization of the idea that society, and in particular its economic behavior, can be manipulated through suitably packaged communication techniques in order to obtain a desired result. It is not necessarily a matter of fabricating false information, nor of enriching facts with details that are not entirely true; it is above all a matter of spreading an account of facts and events that provokes specific behaviors in the individuals who read it, listen to it or watch it.
An interesting article that recently appeared in the journal Computers & Security [1] highlights three basic principles of social engineering: asymmetry of knowledge, technocratic dominance, and the replacement of ends.
The ability to change social behavior rests on the greater knowledge and technical skills of those who seek to control another group of individuals, who are led to find it convenient, consciously or unconsciously, to act in ways that achieve the goals of the group deploying social engineering techniques. Commercial advertising is one of the clearest examples of their use.
Today we are witnessing a further refinement of disinformation techniques, or more precisely of information manipulation, well described in a recent document published by a study group jointly set up by the French Ministry for Europe and Foreign Affairs and the Ministry of the Armed Forces [2]. Russia has played a pioneering role since the early twentieth century. To win the trust of the peoples of other nations, its (dis)information campaigns aimed not so much at extolling the benefits of communism as at highlighting the weaknesses and critical aspects of the governing culture of the target countries, presenting itself as a possible lifeline [3]. However, using forms of censorship to block disinformation attempts, including through direct state funding of investigative journalism, ends up playing into the hands of the disinformation campaign, because both amount to state interference in the sphere of free information: by controlling and limiting freedom of expression, the state paradoxically makes true the content of the disinformation campaigns, which are often based on accusing the "enemy" government of controlling information.
Consequently, the very nature of disinformation services is changing, as reported in a recent Wired article presenting the United Kingdom's 77th Brigade [4]. It is a genuine production center for multimedia information material designed to capture interest and influence public opinion, following the rules of social-network use well known in the advertising sector. The aim is to create communities of network users who trust the news and communications sent out through official channels.
The evolution of technologies for creating and manipulating multimedia content, which makes the narration even more effective at making news look "true", has made social engineering activities even more sophisticated and subtle. The difficulty of telling whether an image or a video is genuine derives not only from the ample possibilities of editing, retouching and fabrication, but also from the potential of social networks, which allow news to spread from below, without apparent filters, and therefore to be considered more truthful than that presented by the official media. "Truth" thus comes not from the verification of sources but from the community that supports the opinion. One of the social networks most exploited in disinformation campaigns is Twitter, through automated programs (so-called bots, from "robots") that create perfectly plausible fake profiles and join discussions, analyzing trending topics in order to inflame the debate and quickly involve a large number of real users who unknowingly lend credibility to the content [5].
Returning to multimedia content used to support a disinformation narrative, the difficulty of detecting fakes has been reported for several years in various technical and scientific articles. The first targets are journalists, whose tools for verifying facts are still lacking compared with the level of sophistication reached by software for creating plausible fakes [6]. This is the case, for example, of the work done by Bellingcat [7] in subjecting news to fact-checking through the analysis of the available sources and of their reliability and trustworthiness. A far from simple task, as witnessed by the case of the chemical attack in Syria of 7 April 2018 reported by the White Helmets (pictured), which some Russian media presented as "fake news" by means of images and videos purporting to prove the existence of a film set used to stage fake news. According to the investigation carried out by Bellingcat, the film set exists, but it is not connected to the chemical attack [8]; yet the absence of third-party witnesses on the ground made ascertaining the facts extremely difficult, and the results of the verification carried out by the Organisation for the Prohibition of Chemical Weapons (OPCW) only arrived in July [9]. The result is that mutual accusations of fake news about Syria legitimized all the parties in the conflict to start bombing even before verifying the facts, as happened on 14 April with the strikes by the USA, the UK and France [10], a few hours before the OPCW mission could begin ascertaining the actual presence of traces of chemical attacks. What follows is a growing belief in public opinion that there are no "facts" but only "narratives" that anyone can construct to justify their actions.
At the academic level, the Computational Propaganda research group at the Oxford Internet Institute of the University of Oxford [11] is studying the influence that the disinformation services of rival nations may have had on election campaigns in several countries since 2012. The latest report, from 17 December, commissioned by the US Senate, focuses on the activities of the Russian IRA (Internet Research Agency) in influencing US politics in the 2012-2018 period. The analysis of the tweets shows how the virtual world has become an attractive battleground between nations, thanks to the ability to quickly create movements in public opinion. If "traditional" disinformation campaigns have as their main objective the legitimization of armed conflicts that produce victims among the very civilians they claim to defend from dictatorial powers, the novelty of campaigns conducted on social media lies in the objective of overturning political power without resorting to military action from outside, but through the exasperation of internal conflicts. Depending on the target nation, this results either in an electoral outcome pleasing to the attacking nation or, where there is no trust in electoral mechanisms, in violent action.
It is evident that the technological progress which extends the lives of individuals and nations beyond the confines of the five senses is profoundly transforming the origin and nature of conflicts among nations, shifting them increasingly onto the information terrain. On this terrain it is often quite difficult to distinguish facts from imaginative constructions, especially when the facts are communicated in the style of narration, very effective from the point of view of emotional communication but extremely weak under the scientific rigor of verifying veracity. It follows that it becomes ever easier to fall victim to disinformation campaigns, unless we invest in education in the use of rational knowledge, giving every citizen the ability to access information sources directly, with technical descriptions that certainly require patience and knowledge and can be "boring", but are the only antidote to the indiscriminate use of the narrative style, which hides the desire to exploit and expand the asymmetry of knowledge and technocratic dominance. The presumed aim of the narrative style, to reduce the distance from citizens and obtain consensus, in reality conceals the opposite purpose: to keep citizens ever more distant from the mechanisms of functioning and of government, making society more vulnerable overall.
[1] J. Hatfield, "Social Engineering in Cybersecurity: The evolution of a concept," Computers & Security, 2018.
[11] https://comprop.oii.ox.ac.uk
Photo: US Air National Guard (New Jersey Army National Guard Rescue Exercise) / web