Credit: V&A Museum
Technological innovations could help us achieve immortality and preserve the future of the human race
THE idea of creating a ‘digital you’ that lives on after you die may sound like something straight out of Black Mirror, but the wheels are already in motion — as are developments to ensure the future survival of our race should doomsday descend.
The Eternime app, for example, which is still in testing mode, uses social media posts to build a digital avatar your family and friends can interact with after you’re dead. So far, so creepy, right? The app features in a major new exhibition at London’s Victoria and Albert Museum called The Future Starts Here, where the digital afterlife is just one of the technological innovations that could — in theory — keep us living forever…
The full article appeared in the 18 May 2018 print edition of Metro and can also be viewed in the e-edition.
The full article appears in The Gadget Show Official Magazine (published November 2017). Copies can be bought here
Polaroids are well and truly back. Instant photography has been bubbling along for a few years but a wave of nostalgia inspired by the likes of Netflix drama Stranger Things, coupled with a brand new Polaroid camera, is putting the distinctive square snaps back on the map.
The story of instant photos goes back to 1937, when Edwin Land founded Polaroid. The company popularised instant snaps but went bankrupt in 2001 and scrapped instant film production in 2008 as digital cameras took over.
Shortly afterwards, a brand known as The Impossible Project bought Polaroid’s last factory and film stock. Since acquiring the Polaroid name this year, it has relaunched as Polaroid Originals with a new retro-styled camera, the OneStep 2 (£109.99).
Designed to resemble the original OneStep from 1977, the new camera takes both classic 600 film and Polaroid’s new i-Type film, and has a built-in flash, a self-timer for selfies and a 60-day battery life. But what exactly is it about instant photography that makes it so appealing?
The full article appeared in the 24 November 2017 issue of Metro and can also be viewed in the e-edition.
By AJ Guel – originally posted to Flickr as Fumble, CC BY 2.0
Athletes who repeatedly suffer blows to the head face brain injuries and, in the most extreme cases, death. Now, a new study has identified a biomarker that could be used to diagnose a brain disease that affects athletes with repeated head injuries.
CTE (chronic traumatic encephalopathy), which can currently only be diagnosed after death, is a progressive degenerative brain condition found in athletes who have suffered repeated trauma to the head, including concussions. The condition has a number of behavioural symptoms including aggressiveness, depression and memory loss.
The disease is especially prevalent in American football players, with a recent study from Boston University finding signs of CTE in 99 per cent of brains from deceased American football players…
You can read the full article at Wired UK (originally published 11 October 2017).
Wired UK/photography: Nick Wilson
The full article appears in the Wired UK October 2017 issue and can also be viewed online at Wired UK.
More than 80 per cent of the TV shows people watch on Netflix are discovered through the platform’s recommendation system. That means the majority of what you decide to watch on Netflix is the result of decisions made by a mysterious, black box of an algorithm. Intrigued? Here’s how it works.
Netflix uses machine learning and algorithms to help break viewers’ preconceived notions and find shows that they might not have initially chosen. To do this, it looks at nuanced threads within the content, rather than relying on broad genres to make its predictions. This explains how, for example, one in eight people who watch one of Netflix’s Marvel shows are completely new to comic book-based stuff on Netflix.
To help understand, consider a three-legged stool. “The three legs of this stool would be Netflix members; taggers who understand everything about the content; and our machine learning algorithms that take all of the data and put things together,” says Todd Yellin, Netflix’s vice president of product innovation…
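The stool Yellin describes can be caricatured in a few lines of code. The sketch below is purely illustrative — the titles, tags and scoring are invented, and Netflix’s real system is far more sophisticated — but it shows how the three legs fit together: taggers annotate content with nuanced threads, members supply viewing histories, and an algorithm matches one against the other.

```python
# Toy sketch of the "three-legged stool". Titles, tags and the scoring
# scheme are illustrative inventions, not Netflix's actual system.
from collections import Counter

# Leg 1: taggers annotate each title with nuanced threads, not broad genres
title_tags = {
    "Show A": {"gritty", "ensemble-cast", "slow-burn"},
    "Show B": {"gritty", "antihero", "crime"},
    "Show C": {"feel-good", "romance"},
}

def recommend(watched, title_tags):
    # Leg 2: a member's watch history becomes a tag profile...
    profile = Counter()
    for title in watched:
        profile.update(title_tags[title])
    # Leg 3: ...and the algorithm scores unseen titles by how many
    # profile tags they share, surfacing the best match
    scores = {
        title: sum(profile[tag] for tag in tags)
        for title, tags in title_tags.items()
        if title not in watched
    }
    return max(scores, key=scores.get)

print(recommend(["Show A"], title_tags))  # "Show B" — it shares "gritty"
```

Because matching happens at the level of fine-grained tags rather than broad genres, a viewer can be steered to a title in a category they have never touched — which is how a Marvel show can hook someone with no comic-book history on the platform.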
You can read the full article at Wired UK (originally published 22 August 2017).
The full article appears in All About Space issue 66 (originally published on 26 June 2017).
Bjork performing at Sónar 2017 (Santiago Felipe/Sónar+D)
Music has always been at the cutting edge of technology so it’s no surprise that artificial intelligence and machine learning are pushing its boundaries.
As AIs that can carry out elements of the creative process continue to evolve, should artists be worried about the machines taking over? Probably not, says Douglas Eck, research scientist at Google’s Magenta.
“Musicians and artists are going to grab what works for them and I predict that the music that will be made will be misunderstood by many people,” Eck told WIRED at Sónar+D, a showcase of music, creativity and technology held this week in Barcelona.
At the event, which is twinned with the Sónar dance music festival, Google held an AI demonstration where Eck showed a series of basic, yet impressive musical clips produced using a machine learning model that was able to predict what note should come next.
The Magenta project has been running for just over a year and aims to discover whether machine learning can create “compelling” creative works. “Our research is focused on sequence generation,” Eck says. “We’re always looking to build models that can listen to what musicians are doing. From that we can extend a piece of music that a musician’s created or maybe add a voice.”
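The core idea — learn from note sequences, then extend them — can be shown with a deliberately crude example. Magenta’s models are neural networks; the toy below just counts which note tends to follow which (a bigram model), and the melody is made up, but the shape of the task is the same: given what’s been played so far, predict the next note.

```python
# Minimal next-note prediction via bigram counts. Magenta uses neural
# networks, not this; the melody and note names are invented examples.
from collections import defaultdict, Counter

def train(melody):
    # Count which note follows each note in the training melody
    follows = defaultdict(Counter)
    for current, nxt in zip(melody, melody[1:]):
        follows[current][nxt] += 1
    return follows

def next_note(follows, current):
    # Predict the most frequently observed successor
    return follows[current].most_common(1)[0][0]

melody = ["C", "E", "G", "C", "E", "G", "E", "C"]
model = train(melody)
print(next_note(model, "C"))  # "E" — it follows "C" most often here
```

Repeatedly feeding the model’s own predictions back in is one simple way to “extend a piece of music” in the sense Eck describes, though real systems add far more musical context than a single previous note.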
Just as the drum machine was loathed and feared by many when it first hit the mainstream in the 1970s, AI’s role in the creation of art has sparked similar fears among critics. Eck, who admits that he was initially among the drum machine haters, explains that it took an entire generation of musicians to take the technology and figure out how to take it forward without putting good drummers out of work. He envisages a similar process of misunderstanding and eventual acceptance for AI-based music tools.
You can read the full article at Wired UK (originally published 18 June 2017).
A mural in the newest building at Netflix’s Silicon Valley HQ features characters from its original series (Netflix)
WIRED went behind the scenes at the Californian HQs of Netflix and Dolby for an exclusive peek at how your favourite shows are brought to the screen
Netflix first launched in the UK in 2012 and, along with catch-up services like BBC iPlayer and streaming rivals like Amazon Prime, has completely transformed the way we watch television.
WIRED was invited along to the firm’s recent Netflix Labs Day at its Los Gatos headquarters in the heart of Silicon Valley for the global release of Marvel’s Iron Fist and to hear more about the innovations that brought it to the screen.
The firm is renowned for its ever-expanding range of original series, from political drama House of Cards and 80s sci-fi throwback Stranger Things to 13 Reasons Why — one of its latest offerings, telling the disturbing story of why a teenage girl took her own life. Unlike Amazon, Netflix has ditched the expensive process of producing pilot episodes, opting for a more direct approach.
A simulation of SDR vs HDR output for Marvel’s Iron Fist (Netflix)
“It really starts with a great idea, and a team wanting to bring it to life,” explained Cindy Holland, VP of Original Series at Netflix. “We use data to work out what’s the minimum threshold audience size that we need, in order to justify the economics of a project that we’re thinking about.”
Marvel’s Iron Fist is one of the latest arrivals, with the comic book brand’s global clout helping Netflix conquer countries where it’s not so well known. It’s also the first of Netflix’s Marvel series to be shot using Dolby Vision – the audio and video firm Dolby’s enhanced version of HDR…
You can read the full article at Wired UK (originally published 14 April 2017).
The full article appears in All About Space issue 64 (originally published on 27 April 2017).