Google is promising to be more vigilant about preventing terrorist propaganda and other extremist videos from appearing on its YouTube site amid intensifying criticism about the internet's role in mass violence.
Its crackdown will involve both computer programs and an expanded group of people dedicated to identifying videos promoting terrorism so they can be blocked from appearing on YouTube or quickly removed.
Google is making the commitment in the wake of violent attacks in the U.S. and elsewhere. A van struck a crowd of people outside a London mosque Sunday, the second time an automobile was used as a weapon in that city this month, and less than a week after a gunman attacked GOP lawmakers on a baseball field.
And earlier this month, British Prime Minister Theresa May called on governments to form international agreements to prevent the spread of extremism online. Some proposed measures would hold companies legally accountable for the material posted on their sites, a liability that Google and other internet companies are trying to avert.
Toward that end, Facebook last week pledged to use more advanced technology and more than 150 human reviewers to find and remove terrorist content before people see it on its social networking site.
Although Google said in a blog post that it has been trying to block extremist content for years, its general counsel, Kent Walker, wrote that "the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now."
Anti-hate groups like the Southern Poverty Law Center have skewered Google and Facebook for doing too little to muzzle hate groups online.
Google, along with other companies such as Facebook, Microsoft and Twitter, recently agreed to create an international forum to share and develop technology, support smaller businesses and speed up their joint efforts against online terrorism.
To step up its policing efforts, Google will nearly double the number of independent experts it uses to flag problematic content and expand its work with counter-extremist groups to help identify content that may be used to radicalize and recruit terrorists.
The Mountain View, California, company will also train more people to identify and remove extremist and terrorism-related content faster.
Google also is taking a tougher stance on videos that don't clearly violate its policies but still offend broad swaths of society, like those that contain inflammatory religious or supremacist content. YouTube won't remove those videos, but viewers will first have to click through an "interstitial" warning in order to see them.
Google also won't sell ads alongside this category of objectionable video, reducing the moneymaking opportunities for their creators. These initiatives could help Google win back major advertisers who began pulling their spending from YouTube earlier this year after learning that their brands sometimes appeared next to unsavory videos.
YouTube also won't recommend these videos to its users, and it won't allow YouTube users to endorse them or leave comments — all efforts aimed at limiting their popularity.
Google is also teaming up with Jigsaw, another company owned by its corporate parent, Alphabet Inc., to target online ads at potential ISIS recruits in hopes of diverting them to anti-terrorist videos.