The full article appears in The Gadget Show Official Magazine (published November 2017).
Polaroids are well and truly back. Instant photography has been bubbling along for a few years but a wave of nostalgia inspired by the likes of Netflix drama Stranger Things, coupled with a brand new Polaroid camera, is putting the distinctive square snaps back on the map.
The story of instant photos goes back to 1937, when Edwin Land founded Polaroid. The company popularised instant snaps but went bankrupt in 2001 and scrapped instant film production in 2008 as digital cameras took over.
Shortly afterwards a brand known as The Impossible Project bought Polaroid’s last factory and film stock. Since acquiring the Polaroid name this year, it has relaunched as Polaroid Originals with a new retro-styled camera, the OneStep 2 (£109.99).
Designed to resemble the original OneStep from 1977, the new camera takes both classic 600 film and Polaroid’s new i-Type film, and has a built-in flash, a self-timer for selfies and a 60-day battery life. But what exactly is it about instant photography that makes it so appealing?
The full article appeared in the 24 November 2017 issue of Metro and can also be viewed in the e-edition.
Athletes who repeatedly suffer blows to the head face brain injuries and, in the most extreme cases, death. Now, a new study has identified a biomarker that could be used to diagnose a brain disease that affects athletes with repeated head injuries.
CTE (chronic traumatic encephalopathy), which can currently only be diagnosed after death, is a progressive degenerative brain condition found in athletes who have suffered repeated trauma to the head, including concussions. The condition has a number of behavioural symptoms including aggressiveness, depression and memory loss.
The disease is especially prevalent in American football players, with a recent study from Boston University finding signs of CTE in 99 per cent of brains from deceased American football players…
You can read the full article at Wired UK (originally published 11 October 2017).
More than 80 per cent of the TV shows people watch on Netflix are discovered through the platform’s recommendation system. That means the majority of what you decide to watch on Netflix is the result of decisions made by a mysterious, black box of an algorithm. Intrigued? Here’s how it works.
Netflix uses machine learning and algorithms to help break viewers’ preconceived notions and find shows that they might not have initially chosen. To do this, it looks at nuanced threads within the content, rather than relying on broad genres to make its predictions. This explains how, for example, one in eight people who watch one of Netflix’s Marvel shows is completely new to comic-book content on Netflix.
To help understand, consider a three-legged stool. “The three legs of this stool would be Netflix members; taggers who understand everything about the content; and our machine learning algorithms that take all of the data and put things together,” says Todd Yellin, Netflix’s vice president of product innovation…
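The stool’s three legs can be sketched in miniature. The toy Python below is not Netflix’s actual system; the show titles and tags are hypothetical stand-ins for the labels produced by human taggers, and a simple Jaccard overlap stands in for the machine learning leg that matches members to content:

```python
# A toy content-based recommender: each show carries tags
# (hypothetical labels standing in for Netflix's human taggers),
# and unseen shows are ranked by tag overlap with a viewer's history.

# Hypothetical tag data; real tagging is far more nuanced.
SHOW_TAGS = {
    "Stranger Things": {"supernatural", "1980s", "ensemble", "suspense"},
    "Daredevil":       {"superhero", "crime", "dark", "suspense"},
    "Jessica Jones":   {"superhero", "noir", "dark", "psychological"},
    "The Crown":       {"historical", "drama", "ensemble", "british"},
}

def recommend(watched, shows=SHOW_TAGS):
    """Rank unwatched shows by Jaccard similarity between each show's
    tags and the union of tags from the viewer's watch history."""
    profile = set().union(*(shows[s] for s in watched))
    scores = {}
    for title, tags in shows.items():
        if title in watched:
            continue
        scores[title] = len(tags & profile) / len(tags | profile)
    return sorted(scores, key=scores.get, reverse=True)

# A viewer who watched one Marvel show is steered toward another
# dark superhero title before a period drama.
print(recommend(["Daredevil"]))
```

Because the ranking comes from shared threads ("dark", "superhero") rather than a broad genre label, a viewer with no comic-book history can still surface such a show if their tag profile overlaps with it.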
You can read the full article at Wired UK (originally published 22 August 2017).
Music has always been at the cutting edge of technology so it’s no surprise that artificial intelligence and machine learning are pushing its boundaries.
As AIs that can carry out elements of the creative process continue to evolve, should artists be worried about the machines taking over? Probably not, says Douglas Eck, research scientist at Google’s Magenta.
“Musicians and artists are going to grab what works for them and I predict that the music that will be made will be misunderstood by many people,” Eck told WIRED at Sónar+D, a showcase of music, creativity and technology held this week in Barcelona.
At the event, which is twinned with the Sónar dance music festival, Google held an AI demonstration where Eck showed a series of basic, yet impressive musical clips produced using a machine learning model that was able to predict what note should come next.
The Magenta project has been running for just over a year and aims to discover whether machine learning can create “compelling” creative works. “Our research is focused on sequence generation,” Eck says. “We’re always looking to build models that can listen to what musicians are doing. From that we can extend a piece of music that a musician’s created or maybe add a voice.”
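The "predict the next note" idea can be sketched with something far simpler than Magenta's neural networks: a first-order Markov model that learns which note tends to follow each note in a melody, then samples from those counts to extend the sequence. This is an illustrative toy, not Magenta's method, and the melody below is invented:

```python
# A minimal next-note predictor: count note-to-note transitions in a
# melody, then extend it by sampling what historically followed the
# current note. Magenta's models are neural, but the goal -- extend a
# sequence a musician started -- is the same in spirit.
import random
from collections import defaultdict

def train(melody):
    """Record, for each note, the notes that followed it."""
    transitions = defaultdict(list)
    for current, following in zip(melody, melody[1:]):
        transitions[current].append(following)
    return transitions

def extend(melody, transitions, n, rng=random.Random(0)):
    """Append n notes, each sampled from the successors of the last note."""
    out = list(melody)
    for _ in range(n):
        options = transitions.get(out[-1])
        if not options:
            break  # no observed successor: stop extending
        out.append(rng.choice(options))
    return out

melody = ["C", "E", "G", "E", "C", "E", "G", "C"]
model = train(melody)
print(extend(melody, model, 4))  # original melody plus 4 predicted notes
```

Sampling from observed transitions keeps the continuation in the style of the input; a neural sequence model generalises the same idea to much longer contexts and richer musical structure.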
Just as the drum machine was loathed and feared by many when it first hit the mainstream in the 1970s, AI’s role in the creation of art has sparked similar fears among critics. Eck, who admits that he was initially among the drum machine haters, explains that it took an entire generation of musicians to take the technology and figure out how to take it forward without putting good drummers out of work. He envisages a similar process of misunderstanding and eventual acceptance for AI-based music tools.
You can read the full article at Wired UK (originally published 18 June 2017).