O'Really?

June 15, 2012

Alan Turing Centenary Conference, 22nd-25th June 2012

The Alan Turing statue at Bletchley Park. Creative Commons licensed picture via Michael Dales on Flickr.

Next weekend, a bunch of very distinguished computer scientists will rock up at the magnificent Manchester Town Hall for the Turing Centenary Conference in order to analyse the development of Computer Science, Artificial Intelligence and Alan Turing’s legacy [1].

There’s a stellar speaker line-up including:

Tickets are not cheap at £450 for four days, but you can sign up for free public lectures by Jack Copeland on Turing: Pioneer of the Information Age and Roger Penrose on the problem of modelling a mathematical mind. Alternatively, if you can spare some time, the conference organisers are looking for volunteers to help out in return for a free conference pass. Contact Vicki Chamberlin for details if you’re interested.

References

  1. Chouard, T. (2012). Turing at 100: Legacy of a universal mind. Nature, 482 (7386), 455. DOI: 10.1038/482455a. See also nature.com/turing

May 23, 2012

Who is the World’s Largest Advertising Agency?

Massive Golf Sale!

The British Monarchy are preparing to exploit new advertising opportunities and boost royal revenue during the 2012 Olympics in London. Photo credit: gokart.co.uk.

Advertising agencies are everywhere, there is no escaping them. But who’s the daddy of the advertising world? The mother of all ad agencies?

According to Wikipedia, WPP is the “world’s largest advertising group by revenues”. This is hogwash. Some of the world’s largest ad agencies are technology companies. For example, in descending order of revenue:

So Google Inc. is currently the world’s largest advertising agency by revenues, followed by WPP and then possibly Facebook. It will be interesting to see if the “best minds” [1,2] on Planet Facebook can catch up with WPP and Google by encouraging its users to click on ads more and buy more stuff in their store.

“The best minds of my generation are thinking about how to make people click on ads. That sucks.” — Jeff Hammerbacher [1]

References

  1. Ashlee Vance (2011) This Tech Bubble Is Different Bloomberg Business Week
  2. Bruce Robinson (1989) How to Get Ahead in Advertising Handmade Films

* Revenue figures from Wikipedia. Can’t really vouch for their accuracy but they look reasonable.

May 11, 2012

Journal Fire: Bonfire of the Vanity Journals?

Fire by John Curley on Flickr, available via Creative Commons license.

When I first heard about Journal Fire, I thought: Great! Someone is going to take all the closed-access scientific journals and make a big bonfire of them! At the top of this bonfire would be the burning effigy of a wicker man, representing the very worst of the vanity journals [1,2].

Unfortunately Journal Fire aren’t burning anything just yet, but what they are doing is something just as interesting. Their web-based application allows you to manage and share your journal club online. I thought I’d give it a whirl because a friend of mine asked me what I thought about a paper on ontologies in biodiversity [3]. Rather than post a brief review here, I’ve posted it over at Journal Fire. Here are some initial thoughts on a quick test drive of their application:

Pros

On the up side Journal Fire:

  • Is a neutral-ish third-party space where anyone can discuss scientific papers.
  • Understands common identifiers (DOI and PMID), which helps tackle the identity crisis.
  • Allows you to post simple anchor links in reviews, but not much else (see below).
  • Does not require the cumbersome syntax used by ResearchBlogging [4], ScienceSeeker and elsewhere.
  • Is integrated with citeulike, for those that use it.
  • Can potentially provide many different reviews of a given paper in one place.
  • Is web-based, so you don’t have to download and install any software, unlike desktop alternatives Mendeley and Utopia docs.

Cons

On the down side Journal Fire:

  • Is yet another piece of social software for scientists. Do we really need more, when we’ve had far too many already?
  • Requires you to sign up for a new account rather than re-using your existing digital identity with Google, Facebook, Twitter etc.
  • Does not seem to have many people on it (yet), despite having been going since at least 2007.
  • Looks a bit stale: the last blog post was published in 2010. Although the software still works fine, it is not clear if it is being actively maintained and developed.
  • Does not allow much formatting in reviews besides simple links; something like Markdown would be good.
  • Does not understand or import arXiv identifiers, at the moment.
  • Is, as far as I can see, a small startup based in Pasadena, California. Like all startups, they might go bust. If this happens, they’ll take your journal club, and all its reviews, down with them.

I think the pros mostly outweigh the cons, so if you like the idea of a third-party hosting your journal club, Journal Fire is worth a trial run.

References

  1. Juan Carlos Lopez (2009). We want your paper! The similarity between high-end restaurants and scientific journals. Spoonful of Medicine, a blog from Nature Medicine
  2. NOTE: Vanity journals should not be confused with the Vanity Press.
  3. Andrew R. Deans, Matthew J. Yoder & James P. Balhoff (2012). Time to change how we describe biodiversity. Trends in Ecology & Evolution, 27 (2), 84. DOI: 10.1016/j.tree.2011.11.007
  4. Shema, H., Bar-Ilan, J., & Thelwall, M. (2012). Research Blogs and the Discussion of Scholarly Information. PLoS ONE, 7 (5). DOI: 10.1371/journal.pone.0035869

May 3, 2012

Need to re-invent the Web (badly)? There’s an App for that!

The App Trap: Why have just one Web App when you can have hundreds of mobile Apps? A selection of popular Android apps from Google Play, also available for iPad and iPhone from the Apple App Store.

I love the convenience of mobile applications but hate the way they re-invent the wheel and are killing the Web. What can be done about it?

I’m in love with the mobile Web

I’ve been smitten with the Web since first venturing out on the information superhighway back in the nineties. This love affair has been taken to a new level with the advent of the mobile Web. As an incurable information junkie, having access to news on the move is great. Using location-based services like Google Maps is fantastic, on foot, bike or in the car. I love nerdily scanning barcodes to read Amazon book reviews while browsing the shelves in bookshops, much to Tim Waterstone’s annoyance. And it can be great to have Wikipedia in your pocket to settle arguments down the pub.

I hate the mobile Web too

But there’s a big problem with all this appy clappy mobile fun: it’s killing the Web through fragmentation, both for producers and consumers of information. Let me explain.

One of the great things about the Web is that there is one app to rule them all; a “killer app” called a Web browser. There are several flavours, but they all basically do the same thing using similar technology: they let you surf the Web. One software application (a browser) gives you access to an almost infinite number of Web applications. Wonderfully simple, wonderfully powerful – we’ve got so used to it we sometimes take it for granted.

Now compare this to the mobile Web, where each page you visit on a mobile suggests that you download an app to read it. Where there used to be just one application, now there are thousands of glorified “me too” Web browser apps, many of which have re-invented the Web, badly.

Consider the applications in the table below and illustrated on the right. They are all accessible from a Web browser on one of the “four screens”: desktop, mobile, tablet and smart-TV:

Native mobile app | Purpose | Web app
Amazon mobile | Online retailer | Amazon.com
BBC News mobile | News and propaganda | news.bbc.co.uk
The Economist mobile | More news and propaganda | economist.com
eBay mobile | Online garage sale | ebay.com
Flickr mobile | Photo sharing | flickr.com
Guardian mobile | Even more news and propaganda | guardian.co.uk
Google Reader mobile | Feed reader | reader.google.com
Google Maps mobile | Maps and navigation | maps.google.com
MetOffice mobile | UK weather | metoffice.gov.uk
PostOffice mobile | Postcode / address finder | royalmail.com/postcode-finder
Google Search mobile | Search engine | google.com
Google Translate mobile | Language translator | translate.google.com
Twitter mobile | Entertaining time-wasting application | twitter.com
Wikipedia mobile | Encyclopædia | en.wikipedia.org/wiki
WordPress mobile | Blogging tool | wordpress.com
YouTube mobile | Videos | youtube.com

As you can see, users are encouraged to download, install, understand and maintain sixteen different apps to enjoy this small part of the mobile Web. And this is just the tip of the iceberg: there are bucket-loads more apps like this in Google Play and the App Store. As a user, you could just use a mobile Web browser on your phone, but you’ll be discouraged from doing so. We’ll return to this later.

Producers and consumers both suffer

As well as being a pain for users, who have to manage hundreds of apps on their phones and tablets, the pain is magnified for producers of data too. Instead of designing, building and maintaining one Web application to work across a range of different screens (a challenging but not impossible task), many have chosen to develop lots of different apps. Take Twitter, for example: in addition to the desktop and Web apps, Twitter currently makes no fewer than five different applications just for tablets and phones:

    1. twitter.com/download/ipad (for iPad)
    2. twitter.com/download/blackberry (for Blackberry)
    3. twitter.com/download/wp7 (for Windows phones)
    4. twitter.com/download/android (for Android)
    5. twitter.com/download/iphone (for iPhones)

So the challenging task of delivering content onto a range of different devices has now been transformed into the almost impossible task of building and managing many different apps. It’s not just Twitter, Inc. that chooses to play this game. Potentially any company or organisation putting data on the mobile Web might consider doing this by employing an army of Android, BlackBerry, iPhone and Windows developers on top of the existing Web developers already on the payroll. That’s good news for software engineers, but bad news for the organisations that have to pay them. Managing all this complexity isn’t cheap.

Not Appy: How do we get out of this mess?

In the rush to get mobile, many seem to have forgotten why the Web is so successful and turned their back on it. We’ve re-invented the wheel and the Web browser. I’m not the first [1] and certainly not the last [2] to notice this. Jonathan Zittrain even predicted it would happen [3,4] with what he calls “tethered devices”. One solution to this problem, as suggested at last month’s International World Wide Web conference in Lyon by some bloke called Tim, is to develop mobile Web apps rather than native mobile apps.

There are lots of examples of this. Sites like trains.im provide train times via a simple Web-based interface, no app required. Many Web sites have two versions, a desktop one and a mobile one. Wikipedia has a mobile site at en.m.wikipedia.org/wiki, Flickr at m.flickr.com, The Economist at m.economist.com, BBC at m.bbc.co.uk/news and so on. But in many cases these sites are poor cousins of the native mobile apps that software developers have focused their efforts on, diluting their work across multiple apps and platforms.
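These desktop/mobile site pairs are usually wired together by sniffing the browser’s User-Agent request header and redirecting. A minimal sketch of the idea in Python (the hostnames and the keyword list here are my illustrative assumptions, not any particular site’s implementation):

```python
# Sketch of user-agent based redirection to a separate mobile site,
# the pattern behind m.flickr.com, m.economist.com and friends.
# Hostnames and keywords below are illustrative assumptions.
MOBILE_KEYWORDS = ("iphone", "android", "blackberry", "windows phone")

def pick_site(user_agent, desktop="example.com", mobile="m.example.com"):
    """Return the hostname a request should be redirected to."""
    ua = user_agent.lower()
    if any(keyword in ua for keyword in MOBILE_KEYWORDS):
        return mobile
    return desktop

print(pick_site("Mozilla/5.0 (iPhone; CPU iPhone OS 5_1 like Mac OS X)"))
# m.example.com
print(pick_site("Mozilla/5.0 (Windows NT 6.1; rv:12.0)"))
# example.com
```

The fragility of this keyword-matching approach is part of the problem: every new device risks being sent a second-class version of the site, which is one reason responsive single-site designs are preferable.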

Maybe it’s too late, maybe I’m suffering from the “suspicious of change” syndrome described by Douglas Adams like this:

  1. everything that’s already in the world when you’re born is just normal;
  2. anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;
  3. anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

The mobile Web makes me suspicious because many apps re-invent the wheel. I’ve argued here that it is against the natural order of the Web, that we’ve waved goodbye to the good old Web [5] and that it’s the beginning of the end. I really hope not: it would be a tragedy to carry on killing the Web, as it’s given us so much and was designed specifically to solve the problems described above. Let’s hope native mobile apps gradually turn out to be alright really.

References

  1. Gary Marshall (2011). Could smartphone apps be taking us back to the days of “best viewed with … ”? Net Magazine
  2. Jason Pontin (2012). Why Publishers Don’t Like Apps: The future of media on mobile devices isn’t with Apps but with the Web Technology Review
  3. Jonathan Zittrain (2007). Saving the internet. Harvard Business Review, 85 (6) PMID: 17580647
  4. Jonathan Zittrain (2009). The Future of the Internet: And How to Stop It Penguin, ISBN:014103159X
  5. Hamish MacKenzie (2012) Web 2.0 Is Over, All Hail the Age of Mobile, Pandodaily

March 15, 2012

Be nice to nerds … you may end up working for them

Thought for the day: be nice to nerds because you might end up working for them.

This sound advice comes from DARPA defector and newly appointed Googler Regina Dugan (see picture below).

Regina Dugan by Steve Jurvetson

What’s that you say? You’re not sure exactly what a nerd is? There are many definitions but the graphic below sums it up better than the Oxford English Dictionary ever could.

Are you a nerd, geek, dork or dweeb?

But beware! Many self-confessed nerds may actually be dorks, dweebs or geeks. It’s a grey area out there in the Venn of Nerdery, not quite as clear cut as the diagram above. To be sure of treating nerds right, you’ll need to be nice to dorks, dweebs and geeks too! See video for details…

[Creative Commons licensed picture of Regina Dugan at TED via Steve Jurvetson]

May 1, 2011

Myopia, Hubris and Amnesia: Three Reactions to Innovation

Arthur C. Clarke (1917-2008)

According to Arthur C. Clarke [1]:

“New ideas pass through three periods:

  1. It can’t be done;
  2. It probably can be done, but it’s not worth doing;
  3. I knew it was a good idea all along.”

These three stages can be summed up as Myopia, Hubris and Amnesia, which sounds a bit like the famous misquote (?) attributed to Mahatma Gandhi:

“First they ignore you,
then they laugh at you,
then they fight you,
then you win.”

We are all surrounded by innovations of various kinds. If Clarke and Gandhi are right, we are either:

  • myopically ignoring them…
  • laughing and fighting them hubristically or
  • amnesiacally approving of the winners

Which one are you?

References

  1. Benford, G. (2008). Obituary: Arthur C. Clarke (1917–2008). Nature, 452 (7187), 546. DOI: 10.1038/452546a

April 28, 2011

Are machines taking over the planet?

Talk of machines taking over the planet is the stuff of science fiction, but if world domination were just a simple numbers game, some machines would already have “taken over” from their human masters.

One machine, the particular brand of computer processor found inside all iPhones and lots of other electronic devices, has been quietly spreading around the globe at a phenomenal rate. Some interesting statistics on just how many of these processors are out there were published in an interview with engineer Steve Furber [1]. Here is an excerpt from the interview:

“Around the end of 2007, the ten-thousand-millionth ARM [Advanced RISC Machine] had been shipped, so there are more ARMs than people on the planet. I believe production is currently running at about 10 million a day. It is projected to rise to about one per person on the planet per year within two or three years”.

Those numbers are completely mind-boggling. As humans, we are outnumbered by just one brand of machine! Of course, they are just lots of “dumb” computer chips with no intelligence. But Furber suspects that:

“there’s more ARM computing power on the planet than everything else ever made put together” [1]

So if you could find a way of using all these processors at once, maybe they’d become magically self-aware in a neural network [2,3,4,5]? Cue ominous Terminator theme tune…
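Furber’s figures are easy to sanity-check with some back-of-the-envelope arithmetic. A quick sketch (the world population figure is my rough assumption, not from the interview):

```python
# Figures quoted in the Furber interview [1]; the world population
# estimate is an illustrative assumption, not from the interview.
arms_shipped_by_2007 = 10_000_000_000  # the "ten-thousand-millionth ARM"
arms_per_day = 10_000_000              # quoted production rate
world_population = 7_000_000_000       # rough 2012 estimate

# ARMs already outnumber people on the planet
print(arms_shipped_by_2007 > world_population)  # True

# Annual production at the quoted rate
arms_per_year = arms_per_day * 365
print(arms_per_year)  # 3650000000

# ...which is about half a chip per person per year, so "about one
# per person on the planet per year" within a few years is plausible
print(round(arms_per_year / world_population, 2))  # 0.52
```

Even without counting anything shipped after 2007, the machines have us beaten on sheer numbers.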

References

  1. Jason Fitzpatrick (2011). An interview with Steve Furber. Communications of the ACM, 54 (5). DOI: 10.1145/1941487.1941501 (since 2007, numbers have risen to 10 billion in 2008 and another one billion in the first quarter of 2011 alone!)
  2. Steve Furber (2011). Biologically-Inspired Massively-Parallel Architectures: A Reconfigurable Neural Modelling Platform Lecture Notes in Computer Science, 6578 (2) DOI: 10.1007/978-3-642-19475-7_2
  3. Steve Furber, & Steve Temple (2008). Studies in Computational Intelligence Computational Intelligence: A Compendium, 115, 763-796 DOI: 10.1007/978-3-540-78293-3_18
  4. An estimated one million ARM processors give you about 1% of the capacity of the human brain; see the details of the Spiking Neural Network Architecture (SpiNNaker) project
  5. James Cameron, et al (1991) Terminator 2: Judgment Day (T2)

[Creative commons licensed picture of Terminator terror by Tasty by cszar]
