Imagine taking a picture just by thinking.
You train a neural interface to recognise the pattern of brain activity fired when you hit the shutter button on a camera, or your phone. Then when you think that thought, the neural interface triggers snapshots from discrete – and discreet – wireless cameras distributed around your body. One in your glasses, one in your shirt button, one in your shoes.
Software in the cloud stitches the images together into a multi-megapixel whole and works out what the likely focus was meant to be, dynamically polishing the output, sharing it to your social streams and storing it for posterity.
This isn’t some wild sci-fi fantasy. It’s a very close reality.
Neural interfaces are already consumer items, available for just a few tens of pounds in gaming systems. Recognising the same brain patterns being repeated should be relatively simple.
At Mobile World Congress last week Rambus showed me a camera the size of a pinhead. It needs no lens, and will cost less than 20p per unit once it is manufactured in volume.
Wireless data standards for short range transmission advance apace. Power requirements at the personal area range are low. And with the demonstrations the Alliance for Wireless Power showed me last week, a wireless charging unit could keep button-sized batteries powered up all day.
Send the images up to the cloud over 4G – it doesn’t have to be instantaneous if you’re not stood there holding your phone and waiting – or Wi-Fi. There’s loads of computing grunt on tap and automatic post-processing is already well developed.
This is real. The question is, do we want it?
People are already uncomfortable with Google Glass, but that stands out a mile. What happens when your smart wearable devices disappear into the fabric of your everyday clothing? It’s a theme to which I keep returning because it is imminent.
It’s up to us to discuss this stuff and set some rules, if we want them.
Amid the hype and bluster of Mobile World Congress it is refreshing to hear someone admit they don’t know the answer. Francisco Jose Jariego Fente is Telefonica Digital’s Industrial Internet of Things Director. The question he willingly accepts he can’t answer is admittedly a tricky one: what is the business model for smart cities?
Telefonica has more evidence than most for what the answer, or answers, might be. Its project in Santander has proven there is little money to be made in the hardware: the city rolled out 12,000 sensors funded by a relatively small €1m from the EU. And the sum of the data collected from those sensors, just 5MB per day, similar to a single photo or MP3 file, suggests there is very little to be made in its carriage or storage.
The biggest challenges, and hence the biggest potential revenues, come in processing and presenting the data in a useful form. This is where Telefonica has focused its efforts and is looking to commercialise the learning from the Santander experiment. IBM too has recognised that this is where the value lies.
But this value only becomes tangible when the rest of the smart city ecosystem is in place. Cities are complicated. They are managed by multiple authorities and commercial parties. They evolve constantly, reacting to the needs of their inhabitants. And those inhabitants themselves, who in many ways represent the city much more than its buildings or infrastructure, have a say in how it develops: any executive control is limited.
Building a smart city on a green field site like South Korea’s Songdo is one thing. But there are huge drivers to smarten all our cities. And that means retrofitting technology, processes and partnerships to an existing, evolved organic environment. One model isn’t going to fit every city. Making it happen will be a process of negotiation, integration, iteration. And there will be lots of different parties involved: political leaders, civil servants, service providers, technology companies, health services, police forces, property owners and most important of all, the citizens themselves.
Brokering a framework that keeps all of these people at least relatively happy, while delivering on the promise of smart cities, is no small task. It will only come through dialogue. But it’s a conversation we need to have. Because the promise of smarter cities is too great to ignore.
In the first instance there are simply lower costs, both financial and environmental. There are lifestyle benefits: less traffic, quicker parking, more efficient public transport. Taking things a step further, there are advantages to planners: recognising a noise problem in one place might inform a change in planning to a new building nearby, perhaps requiring materials that absorb or deflect sound, or the planting of trees as a screen. Ultimately, there is the prospect of properly understanding our cities and the interactions that make them live, so that we can make more informed decisions about their future, in local government, in corporations, and as individuals.
Smart cities have long held promise, but the complexity of the problem they present has slowed their progress. To get things moving, as we need to do, a broad and open conversation between all of the interested parties is required: to agree how the interactions will be managed and, vitally, how the costs and rewards will be divided.
Last night was spent debating Manchester’s future as a home to technology innovation, with representatives from the city leadership, technology, telecoms, law, and finance firms. One issue really stuck with me. Everyone was keen on the idea of promoting the digital industry but there was frustration at the lack of a common voice for this sector.
The problem is this: no-one can speak for the digital industry because there is no digital industry. Not in Manchester. Not anywhere. What is loosely grouped into ‘tech’ or ‘digital’ is actually three or four (possibly even more) very distinct sectors with very, very different needs.
In Manchester I’d classify these as ‘Products’, ‘Services’ and ‘Infrastructure’ but you could probably add ‘Materials’ and ‘Advanced Manufacturing’.
Product companies are proper tech start-ups. Companies that are incubating an idea with great potential to scale. They need an environment to network and meet to start with, so that teams can naturally assemble. Then above all else they need time. Time means the right sort of finance, and low overheads: office space, connectivity etc. Once they reach scale they need access to talent: generally high level talent with specific, technical skill sets but also sales, support, marketing and creatives.
Services companies are largely marketing/digital agencies of one form or another. The skills they require are very different, as much creative and inter-personal as technical. These companies have limited potential for scale: it’s a highly competitive market. Growing means adding people and the cost of managing those people rapidly starts to diminish the focus of the founders. The best hope is reasonable scale, stability, and good margins, and ultimately perhaps a trade sale to a local rival or national network. What they need is opportunities to sell and access to contracts, from the public sector and large local companies.
Infrastructure companies might serve the start-ups, or the agencies, or any other businesses around Manchester. Depending on their particular focus the challenges might be access to power, the cost of laying fibre, or competing with unfairly advantaged national players. They need technical skills but those skills are generally very different to those required by the product or service companies.
Manchester’s agencies have a good representative body. But that body doesn’t speak for start-ups, and it doesn’t speak for the infrastructure players. In fact I don’t know of any single body that claims to speak for those groups and their needs.
If Manchester, and the UK as a whole, is going to have future economic success powered by technology-driven businesses, then those businesses need to be understood for what they are. Not conflated in groups under meaningless terms like ‘the digital industry’, ‘the technology sector’, or ‘the knowledge economy’.
Recently I was a witness in a trial. You don’t need to know the (frustrating, depressing) details, just that I turned up when asked, said what I had seen, and left again.
This is what I want Twitter to do.
In the wake of the appalling abuse hurled at Stan Collymore, this is what I spent a chunk of yesterday explaining to various BBC shows. We do not want an unregulated, foreign social network to be the ultimate arbiter of acceptable behaviour online.
Yes it should have a fair use policy, and yes it should automatically block people in breach of that policy. But more importantly than that it should make it very, very easy for the relevant law enforcement agencies in any country to collect evidence and respond appropriately.
My main concern with the Collymore case, ignoring the evidence it has brought to light of the number of absolute tools out there, is the time it has taken for Twitter to pass evidence to the police: six weeks in one case, according to Collymore.
Now I recognise that there are cases where Twitter should not be sharing evidence of the identity of anonymous users. But not here: these cases are clear-cut racial abuse and threats of violence. There’s no argument for freedom of speech.
Twitter needs a means of rapidly recognising when the sharing of data is justified, and responding. But perhaps the police and Crown Prosecution Service need a better system of responding too.
It’s clear from the case of Caroline Criado-Perez and others that when prosecutions do come, they are often only of a tiny minority of offenders. Is there perhaps a way for the police to automate the collection of data from Twitter (and other social networks) and streamline prosecutions? Rapidly sending out warning letters (emails?) to caution people that their abuse has been noted and a prosecution may follow might help to thin out the stream of invective and limit it to the truly committed idiots.
This might sound a little oppressive and that concerns me too. But think of it like policing a drunken city centre on a Friday night. There will be lots of infringements but the police are unlikely to prosecute all of them. Instead they will dissuade most people from committing serious offences through their presence and a stern word at the right times.
This might not satisfy everyone, but it might help to keep Twitter and other public social networks open for the type of rich debate and sharing of information that so many of us enjoy and value.
WARNING: TECHIE POST. Following my visit to the smart city project in Santander with Telefonica last year, I was inspired to start building my own smart home based on similar technologies. This is partly an exercise of my rusty engineering skills but mostly about learning the realities of smart cities/smart homes through experience. I’ll write up the lessons in a much less techie form, but for those who are interested, I’ll also be documenting the detail here.
In the last episode I got my first sensor up and running on the end of an Ethernet cable thanks to the RESTduino sketch. Now I need to get the data it sends back into my SQL database.
This proved to be incredibly simple. Now note: I am NOT a coder and this stuff is probably seriously ugly. But it works and that’s what I care about right now.
Into the code I had put together for the AlertMe API I added the Pest library, which makes accessing a RESTful interface using PHP dead easy. It’s then a simple task to extract the useful data from the little JSON string that comes back from my light sensor and squirt it into a new table in my SQL database.
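I won’t inflict my actual PHP on you, but the shape of that step is easy to sketch. Here’s the same flow in Python, with SQLite standing in for my database; the payload field names and room code are illustrative stand-ins, not the exact output my sensor produces:

```python
import json
import sqlite3

def store_reading(db, room, reading_json):
    """Parse one JSON reading and insert it into the light table."""
    data = json.loads(reading_json)
    db.execute(
        "INSERT INTO light (ts, room, value) VALUES (datetime('now'), ?, ?)",
        (room, int(data["value"])),
    )
    db.commit()

# Stand-in for my real database, with the table layout described below.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE light (ts TEXT, room TEXT, value INTEGER)")

# A payload similar in shape to what the light sensor sends back
# (field names here are made up for the example).
payload = '{"id": "A0", "value": 512}'
store_reading(db, "lounge", payload)

print(db.execute("SELECT room, value FROM light").fetchall())
```

In the real thing the JSON string comes back from an HTTP GET to the Arduino rather than a hard-coded variable, but the parse-and-insert step is identical.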
It’s worth talking a little bit about the database here. I’m structuring it with a table for each different type of data (power, temperature, light and so on) rather than one for each room. This should mean that I don’t have to create a new table each time I add a new sensor: I can just record values indexed by their time stamp and room code. I’m hoping this should make things simpler.
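To make that concrete, here’s roughly what one of those per-type tables looks like. Column names and types are illustrative; my actual schema may differ slightly:

```sql
-- One table per data type (this one for light), not one per room.
CREATE TABLE light (
    ts    DATETIME    NOT NULL,  -- time stamp of the reading
    room  VARCHAR(16) NOT NULL,  -- room code, e.g. 'lounge'
    value INT         NOT NULL,  -- raw sensor value
    PRIMARY KEY (ts, room)
);

-- Adding a sensor in a new room needs no schema change, just new rows:
-- INSERT INTO light (ts, room, value) VALUES (NOW(), 'hall', 512);
```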
Anyway, after just a few failed attempts (mostly down to my lack of understanding of JSON and arrays) my code was working and happily sticking data from the light sensor into my database.