By Pierre Jarre, Christopher Emmanes and Jacques Fosterer
Over the past two years, the term “fake news” has become ubiquitous. The Cambridge English Dictionary defines it as “false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke.”
As part of the global push to curb the spread of false information, researchers are trying to understand how people spread fake news and how social networks can contain it.
But humans are complex social animals, and simple technological models miss the richness of human learning and interaction.
So we decided to take a different approach in our research. We used artificial intelligence to study how support for, or opposition to, a piece of fake news spreads within a social network. We believe our model is more realistic than previous approaches because the people in our model learn from interacting with their environment rather than simply following fixed rules. This new perspective allowed us to learn new things about how fake news spreads.
The key takeaway from our research is that, when it comes to stopping the spread of fake news, privacy is crucial. It is important to keep your personal data to yourself and to be careful about what information you provide to large social media websites or search engines.
The latest wave of technological innovation, from the data-centric Web 2.0 to social networks, has created mainstream challenges for user privacy and shared ethics. But as our research shows, the same technology, combined with a healthy dose of individual responsibility, can also help cure the curse of fake news.
Human Behavioral Modeling
The current literature models the spread of fake news on social networks in one of two ways.
In the first, people observe what their neighbors do and use this information in complex calculations to update their beliefs about the world.
The second view holds that people follow a simple majority rule: do whatever most of your neighbors are doing.
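In code, the two views from the literature might be sketched like this. This is a toy illustration only: the function names, the blending weight and the 0-to-1 belief scale are our own simplifications, not the formulation of any specific paper.

```python
def bayesian_style_update(belief, neighbor_signals, weight=0.3):
    """First view: blend an agent's belief (a number in 0..1) toward
    the average of its neighbors' observed signals.  A crude stand-in
    for the 'complex calculation' family of models."""
    avg = sum(neighbor_signals) / len(neighbor_signals)
    return (1 - weight) * belief + weight * avg


def majority_rule_update(current_action, neighbor_actions):
    """Second view: copy whatever most neighbors are doing
    (True = share the story, False = ignore it).  Under a pure
    majority rule the agent's own current action is ignored."""
    shares = sum(neighbor_actions)
    return shares > len(neighbor_actions) / 2
```

Note that neither rule has any memory of past interactions, which is exactly the shortcoming discussed next.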
But both perspectives have shortcomings. Neither can capture how a person's mind actually changes over the course of repeated interactions and conversations.
Our research was different. We modeled humans as agents who learn, developing their own strategies for updating their beliefs based on their neighbors' actions. We then introduced an adversary that tries to spread fake news, and compared how well it fared when it could and could not obtain information about the strength of the other agents' beliefs.
In a real-world version of this scenario, an adversary that decides to spread fake news might first read your Facebook profile to learn what you already believe, then tailor the fake news it sends you to maximize the chance that you accept it.
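To make the privacy point concrete, here is a toy simulation. It is not the authors' actual model: the network size, the targeting rule and the adoption probabilities are invented for illustration. An adversary that can read agents' belief strengths picks the most persuadable target every time, while a privacy-protected adversary must guess blindly.

```python
import random

random.seed(0)  # fixed seed so the comparison is reproducible

def run(adversary_sees_beliefs, steps=2000):
    """Simulate an adversary repeatedly pushing a fake story to one of
    ten agents.  Each agent has a hidden belief strength, i.e. the
    probability that it accepts and shares the story."""
    beliefs = [random.random() for _ in range(10)]
    successes = 0
    for _ in range(steps):
        if adversary_sees_beliefs:
            # Privacy leak: target the most persuadable agent.
            target = max(range(10), key=lambda i: beliefs[i])
        else:
            # Privacy preserved: the adversary picks at random.
            target = random.randrange(10)
        if random.random() < beliefs[target]:
            successes += 1
    return successes / steps
```

In this cartoon, the adversary with leaked belief data succeeds far more often than the blind one, which mirrors the article's advice to keep personal data private.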
We learned some new things about how fake news spreads. For example, we showed that letting people give feedback on news makes it easier for them to detect fake news.
Our work also suggests, perhaps counterintuitively, that injecting some fake news into a social network can prompt users to get better at recognizing it.
Conversely, models like ours can also be used to devise strategies for preventing the spread of fake news.
From everything we learned in this research, there are things everyone can do to help prevent fake news.
Fight fake news
Because humans learn from their neighbors, who in turn learn from their own neighbors, and so on, a handful of good fake news detectors can help protect the whole network. When we asked our model how to prevent the spread of fake news, we found that the most effective intervention was letting users give their friends feedback about the news they share.
In addition to pointing out fake news, you should also praise a friend who does good research, finds quality journalism and shares balanced pieces. Importantly, this holds whether or not you agree with the conclusion or political perspective of the article. Studies of human psychology show that people adapt their behavior in response to both negative and positive feedback, especially when it comes from within their social circle.
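The feedback effect described above can be caricatured in a few lines. This is a hedged sketch: the learning rate and the 0-to-1 quality score are arbitrary choices for illustration, not measured quantities from the study.

```python
def update_sharing_quality(quality, feedbacks, lr=0.2):
    """Nudge a user's news-sharing 'quality' score (0..1) upward on
    praise (+1) and downward on call-outs (-1), a cartoon of behavior
    adapting to positive and negative social feedback."""
    for f in feedbacks:
        if f > 0:
            quality += lr * (1.0 - quality)  # praise reinforces careful sharing
        else:
            quality -= lr * quality          # call-outs discourage sloppy sharing
    return quality
```

Under this rule, a steady stream of praise pushes quality toward 1, while repeated call-outs push it toward 0, so both kinds of feedback from friends matter.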