> Not "may" but will - it's very easy to render your network essentially unauditable (you don't have thousands of petabytes' worth of storage available? you can't fab your own ML coprocessor? too bad!)
Google can. So that does not apply.
> You bring up Trump's Twitter suspension in the other subthread, but in that case the failure was transient, and the person responsible and their reasoning were quickly identified; YouTube has demonetized videos with little to no explanation or recourse.
There's no recourse or explanation because providing them isn't part of YouTube's business model, regardless of what or who makes the decision to demonetize. Adding arbitrary human behavior to those decisions only makes it worse.
If nobody outside Google can test the algorithm, by definition it isn't auditable.
> Adding arbitrary human behavior to those decisions only makes it worse.
Neural networks are derived from human behavior. They don't magically divine the spirit of what you want to do; they're an approximation of training data that someone has to put together.
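To make that concrete, here's a minimal sketch (the titles, labels, and classifier are all hypothetical, not anything YouTube actually runs): the model has no notion of "advertiser friendly" beyond whatever judgments the human labelers baked into the training set, and it simply reproduces that pattern.

    # Hypothetical demonetization classifier: flip the human-supplied labels
    # and the same model draws the opposite conclusions.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    titles = [
        "graphic war footage compilation",
        "cute puppy learns to fetch",
        "news report on election protests",
        "unboxing the latest smartphone",
    ]
    # Labels come from human reviewers; the model only approximates them.
    labels = [1, 0, 1, 0]  # 1 = demonetize, 0 = monetize

    model = make_pipeline(CountVectorizer(), LogisticRegression())
    model.fit(titles, labels)
    print(model.predict(["footage of street protests"]))  # echoes the labelers' pattern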