Talk about a tough job.
YouTube is seeking to hire thousands of new moderators to review dodgy videos posted online.
It comes in the wake of controversy over the way ads have been linked to, and have therefore in a small way helped to fund, extremist content and hate speech uploaded to the site.
Software has been developed by the firm's uber-geeks with the aim of identifying this sort of material, and also of ferreting out videos that may be unsuitable for children.
The importance of the latter, in particular, shouldn't be underestimated. Many parents despair of their little darlings' addiction to the inane babbling of the stars the site has created, but that’s just what parents do. However, the danger of kids being exposed to really quite nasty stuff before they are able to cope with it is another matter entirely.
The problem, of course, is that the firm's software isn't infallible and doesn’t (yet) have a judgement chip installed.
This has inevitably led to complaints from YouTubers that mistakes are being made and that perfectly innocent content is being caught by the filters.
Those whose videos are flagged can find themselves ineligible for generating ad revenues, and given the sums involved, that matters.
Hence the need for some plain old people to exercise their plain old judgement.
They’ll probably find themselves sifting through a mix of the banal, the infuriating, the silly and the really quite unpleasant. Americans like to refer to salary as “compensation”. In their case that might just be the right word for it.
The side benefit, of course, is that the software should be able to learn from YouTube's human moderators, and thus improve the automated detection of the nasty stuff that reputable organisations, and also the UK Government, don't want their ads appearing alongside, and that reputable parents don't want their kids to see.
Given the sheer weight of content uploaded on to the site every minute, an automated system will inevitably have to serve as the first line of defence.
It's just a pity that it took a scandal, and a brief walkout by some advertisers, to get the firm to realise that it needs more than that.
Still, the apparent recognition by YouTube that it cannot operate its platform in a social vacuum, and that it needs to invest in people to ensure that its space is a safe one, is very welcome.
Here's hoping some of its Silicon Valley peers now follow its lead.