Is Wikipedia's AI Revolutionizing Knowledge Sharing? Cost, Gain & Impact Revealed!


Collaboration with Wikimedia Foundation and Jigsaw to Stop Abusive Comments


In one attempt to halt the trolls, the Wikimedia Foundation partnered with Jigsaw (the technology incubator formerly known as Google Ideas) on a research project named Detox, using machine learning to flag comments that may be personal attacks.

This project is part of Jigsaw's initiative to build open-source AI tools that help fight harassment on social media platforms and in forums.

Step one in the project was to train the machine learning algorithms using 100,000 toxic comments from Wikipedia Talk pages identified by a 4,000-person human team, with every comment reviewed by ten distinct human annotators.

This annotated dataset was among the largest ever created for studying online abuse. It included not only direct personal attacks but also third-party and indirect ones ("You are horrible." "Bob is dreadful." "Sally said Bob is dreadful."). Following training, the machines could determine whether a comment was a personal attack about as well as a panel of three human moderators.
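To give a feel for the approach, here is a minimal sketch of how such an attack classifier might be trained. It uses scikit-learn with a tiny hypothetical dataset standing in for the 100,000 labeled comments; this is an illustration, not the actual Detox code.

```python
# Minimal sketch of training a personal-attack classifier, in the spirit of
# the Detox project. This is NOT the actual Detox code; the tiny dataset
# below is hypothetical, standing in for the 100,000 human-labeled comments.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each comment is paired with a majority label from its human reviewers:
# 1 = personal attack, 0 = benign.
comments = [
    "You are horrible.",            # direct attack
    "Bob is dreadful.",             # third-party attack
    "Sally said Bob is dreadful.",  # indirect attack
    "Thanks for fixing the citation.",
    "Good catch, I updated the infobox.",
]
labels = [1, 1, 1, 0, 0]

# Character n-grams help catch obfuscated insults ("h0rrible").
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(comments, labels)

# Flag new comments whose attack probability crosses a review threshold.
new_comments = ["You are a terrible editor.", "Nice work on the article."]
for text, p in zip(new_comments, model.predict_proba(new_comments)[:, 1]):
    print(f"{p:.2f}  {text}")
```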

Next, the project team had the algorithm inspect 63 million English Wikipedia comments submitted over the 14-year period from 2001 to 2015 to locate patterns in the abusive remarks.

What they discovered was outlined in the paper Ex Machina: Personal Attacks Seen at Scale:

  1. More than 80% of all comments characterized as abusive were made by over 9,000 people who each posted fewer than five abusive remarks in a year, rather than by an isolated group of trolls.
  2. Nearly 10 percent of all attacks were made by only 34 users.
  3. Anonymous users accounted for 34% of the comments left on Wikipedia.
  4. More than half of the personal attacks were carried out by registered users, although anonymous users were six times more likely to launch a personal attack. (There are 20 times more registered users than anonymous users; a sketch of this kind of per-user breakdown follows the list.)
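Findings like these fall out of a straightforward per-user aggregation once every comment carries an attack label and an author. Below is a hypothetical sketch using pandas; the column names and toy data are assumptions for illustration, not the paper's actual schema.

```python
# Hypothetical sketch: reproducing the kind of per-user analysis reported
# in the Ex Machina paper, given a table of labeled comments. Column names
# and the example rows are assumptions, not the paper's actual schema.
import pandas as pd

comments = pd.DataFrame({
    "user": ["alice", "alice", "bob", None, "carol", None],  # None = anonymous
    "is_attack": [True, False, True, True, False, False],
})

# Share of attacks coming from anonymous (user is None) accounts.
attacks = comments[comments["is_attack"]]
anon_share = attacks["user"].isna().mean()
print(f"Attacks from anonymous users: {anon_share:.0%}")

# Attack counts per registered user, to separate prolific trolls from
# users who attack only occasionally.
per_user = attacks.dropna(subset=["user"]).groupby("user").size()
print(per_user.sort_values(ascending=False))
```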

Now that the algorithms have brought more clarity about who is contributing to the community's toxicity, Wikipedia can work out the best way to fight the negativity.

Even though human moderation is probably still needed, algorithms can help sort through the comments and flag those that require human attention.


Objective Revision Evaluation Service (ORES System)


Another reason cited for the significant decline in Wikipedia's editor ranks is the organization's complicated bureaucracy, along with its rigorous editing standards.

It was common for first-time contributors and editors to have an entire body of work wiped out with no explanation. One way Wikipedia hopes to combat this situation is the ORES system, a machine-powered editing service driven by an algorithm trained to score the quality of edits and changes.

Wikipedia editors used an online tool to label examples of past edits, and that is how the algorithm was taught the severity of mistakes. The ORES system can direct people to examine the most damaging edits first and can gauge the class of an error, so innocent rookie mistakes are handled more leniently.
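ORES is exposed as a public web service, so tools can request scores over HTTP. The sketch below queries the ORES v3 API for one revision's "damaging" and "goodfaith" scores; the revision ID is made up, and the response layout shown reflects the API as documented at the time of writing, so verify it against current Wikimedia documentation.

```python
# Minimal sketch: asking the public ORES API whether a given English
# Wikipedia revision looks damaging. The revision ID below is just an
# example; check the live API docs for the current response layout.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"

def score_revision(rev_id: int) -> dict:
    """Fetch the 'damaging' and 'goodfaith' model scores for one edit."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging|goodfaith", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["enwiki"]["scores"][str(rev_id)]

scores = score_revision(123456789)
damaging_p = scores["damaging"]["score"]["probability"]["true"]
goodfaith_p = scores["goodfaith"]["score"]["probability"]["true"]

# A damaging but good-faith edit is likely an innocent rookie mistake;
# a damaging, bad-faith edit is a candidate for human review.
print(f"P(damaging)={damaging_p:.2f}, P(goodfaith)={goodfaith_p:.2f}")
```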


AI to Write Wikipedia Articles


Well, AI can do an "OK" job of composing Wikipedia articles, but you have to start somewhere, right? A team at Google Brain taught software to summarize information on web pages and write a Wikipedia-style post.

It turns out text summarization is more challenging than most of us thought. Google Brain's attempt to get a machine to summarize content is a little better than previous efforts, but there is still work to be done before a machine can write with the cadence and flair people can.
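For a sense of where off-the-shelf summarization stands, the sketch below runs a general-purpose pretrained model from the Hugging Face transformers library on a short passage. This is not Google Brain's system; the model name is just one publicly available checkpoint chosen for illustration.

```python
# Sketch of abstractive text summarization with a general-purpose
# pretrained model. This is NOT the Google Brain system described above;
# any summarization checkpoint would do for illustration.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

source = (
    "Wikipedia is a free online encyclopedia created and edited by "
    "volunteers around the world. Articles are written collaboratively, "
    "and disputes over content are resolved on discussion pages."
)

# Generate a short summary; lengths are in tokens, not characters.
summary = summarizer(source, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```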

We're not quite ready to have a machine generate Wikipedia entries, but there are efforts underway to get us there.



Conclusion


While the use cases for artificial intelligence in Wikipedia's operations are still being refined, machines can certainly help the organization analyze the vast amounts of data it generates daily.

More information and analysis will help Wikipedia create effective strategies to troubleshoot negativity in the community and to address recruitment issues among its contributors.