Latest neural network publications
The combination of AI and large data sets has profoundly improved our ability to model the world around us, predict its behaviour, and recognise patterns in images.
Underpinning all of this data-driven innovation, though, are servers and accelerators that can devour astronomical amounts of energy, depending on the task.
Last year, research indicated that training a single AI algorithm can emit up to 284 tonnes of carbon dioxide – five times the lifetime emissions of an average car.
Machine learning researchers are not solely concerned with improving the accuracy of models. They also want to know how those models can be corrupted and undermined – a research agenda that warrants more attention. Techerati spoke to an IBM researcher at the heart of it.
In a landmark scientific achievement, Columbia neuroengineers show that the mental activity of mankind’s most inner sanctum can be sensed and distilled into ordinary speech. Decades of scientific research have shown that the act of speaking – or even thinking about speaking – produces traceable and distinct patterns of activity in the brain. The realisation spawned a…