• His warning came at the World Economic Forum Annual Meeting in Davos, during a panel discussion on the dangers of disinformation

• Seth Moulton, a congressman, said the principles of a free press are established and accepted for traditional media but we are “having trouble translating those to the social media world”

DAVOS: ChatGPT, the artificial intelligence-powered natural language processing tool, will exacerbate the global problem of disinformation, Arthur Gregg Sulzberger, the chairman of the New York Times, said during a panel discussion at the World Economic Forum Annual Meeting on Tuesday.

“A lot of this will not be information that is created with the intent to mislead, but based on everything I’ve read, I suspect we are going to see huge amounts of content that is produced, none of which is particularly verified (and) the origins of which are not particularly clear,” he added. “I think we are getting to a point where tools are going to make it harder and harder to solve this problem.

“We need to address this information crisis but we also need to rebuild an ecosystem that is weaker than ever.”

He added that to tackle the crisis, the media has “to go back to first principles, which is if you do not want bad information, you need to crowd it out with good information.”

Seth Moulton, a member of the US Congress representing Massachusetts, said he believes that “there is a hunger for the truth,” which means “the market will be even bigger for the machine that can identify disinformation than for the machine that makes it easier to write your fourth-grade history paper.”

He added that accountability should be sought and enforced to achieve “some level of public safety,” explaining that the principles of a free press have been “established for traditional media, that we have accepted for a long time, and we are just having trouble translating those to the social media world.”

The panelists, who were discussing the dangers of disinformation, agreed that, to some extent, consumers of news are aware of the existence of what Sulzberger described as a “broader mix of bad information that is corrupting the information ecosystem.”

“There is no doubt that society seems to have, at some level, accepted how much the information ecosystem has been poisoned and I think it is going to require real, sustainable efforts from the platforms, political and business leaders, and consumers themselves, to reject that,” he said.

Jeanne Bourgault, the president and CEO of media organization Internews, said that “people are also getting used to navigating (disinformation) a little bit better.” To illustrate this, she highlighted the “unbelievably complicated information environment” in the Philippines and added: “Yet, people were able to find the information they needed.”

She said one of the “most worrisome” disinformation trends is “gendered disinformation,” and that “these types of stories hit women so much worse — women politicians.” She added: “It has been proven across the board that women online get harassed, and online harassment becomes offline harassment very, very quickly.”

Vera Jourova, the vice president of the European Commission for Values and Transparency, said: “To legislate on how the digital space should look is a pretty daring exercise.” She explained that this is because legislating bodies must ensure that any rules that are introduced cannot be abused.
Sulzberger agreed that “terms like fake news were greedily gobbled up by autocratic regimes — and aspiring autocratic regimes — who then passed laws that they claimed were banning ‘fake news’ … but were actually banning the scrutiny and accountability provided by an independent press.”

Jourova suggested three main steps that could be taken to address the disinformation crisis, the first of which is to “make sure the disinformers do not find the feeding ground, the society which is willing to get brainwashed.” To achieve this, she stressed the need to make citizens “more resilient through education and the work of professional media.”

The second step, Jourova said, is for the representatives of democratic governments to improve communication strategies, while the third involves proper regulation.

“The content that is illegal offline has to be treated as illegal online, such as terrorism, political extremism and hate speech,” she said, adding that 90 percent of requests to Facebook for the removal of content come from government bodies.

Jourova urged citizens to be more demanding of the truth and to “look into what is promised in political campaigns” because they “are full of lies and unreachable goals.”