Use of algorithms to identify families for attention raises stereotyping and privacy fears.

Vast quantities of data on hundreds of thousands of people are being used to construct computer models in an effort to predict child abuse and intervene before it can happen, the Guardian has learned.

Amid mounting financial pressure, local councils are developing "predictive analytics" systems to algorithmically identify families for attention from child services, allowing them to focus resources more effectively.

But while the new algorithmic profiling could be one way of helping social workers, it is likely to be hugely controversial because of its potential to intrude on individual privacy. There is also the risk of accidentally incorporating and perpetuating discrimination against minorities.