By Alex Hoffman | @alexbhoffman
Smartphones have changed nearly every aspect of our lives. They have become the personal assistant and digital companion that keeps us punctual, social, and sane. Still, these devices don’t quite shine until we push them — manually launching apps and inputting information.
Pandora demands an artist, Urbanspoon requires a cuisine, and Maps wants a destination.
These minor, but collectively time-consuming requirements signal that our devices have yet to integrate our physical settings with our personal interests.
For instance, when we’re at home on the couch, we’re typically in the mood for relaxing things — be it a good TV show or book. And when we’re working out at the gym, we often seek out motivational music. These contexts are easy to generalize, but extremely hard to personalize. The right show, book, or playlist can vary infinitely depending on our taste, personality, and culture.
Many of today’s smartphones, though, house over ten sensors that gather all kinds of information about us. The accelerometer measures our speed, the GPS our location, and the radio our connections. So too, a microphone hears our sounds, the camera sees our sights, a gyroscope feels our motion, and so on.
By coupling this sensory information with third-party data, our devices are beginning to usher in a new culture driven by context, wherein experiences and recommendations can be automatically catered to us. Instead of manually tapping to set our Android’s alarm at bedtime, it will be able to infer from our Google Calendar appointments and Google Maps traffic data just how long we need to commute to make our first meeting on time, and wake us accordingly.
Similarly, in time, our iPhone’s accelerometer will sense an increase in speed as we leave our driveway, link to our social graph on Facebook to determine which artists we like, and infer the best Pandora station to start playing us before we join Monday morning traffic.
Context culture will change the way we live, work, and play.
Freaky, But Inevitable
Our devices know us intimately — our habits, interests, connections, and desires. And Facebook’s Open Graph, Amazon’s storefront, and Google’s search know us, too. But until now they couldn’t translate their knowledge about us into value without our help.
Understandably, this frightens many of us who are concerned about our privacy. This technology is always on and aware; it’s keeping tabs on where we are, who we’re with, and what we’re buying, among other things. Robert Scoble, a prominent blogger, notes that this level of open access into our lives is downright “over the freaky line” for many people.
What happens when our data is pried from the cloud, lost in translation, or used maliciously? We’ve already witnessed how certain contextual apps can go awry, with location-based dating apps like Grindr and Skout leading to cases of rape.
Serious concerns like these have accompanied most major technological shifts, though; and while the complete eradication of risk is impossible, we’ve been successful in creating protections that make widespread adoption relatively safe — e.g., police programs that work to prevent sexual abuse originating in online chat rooms. And the reality is that the very features that make contextual apps freaky also happen to make them valuable to us.
The Coming Wave
Gimbal, a new developer platform, recently enabled the creation of context-aware apps for Android phones and tablets. Its standout capabilities are geofencing, image recognition, and interest sensing. These features may soon be understood as the protons, electrons, and neutrons of context-aware apps, and they’re just the beginning.
Geofencing involves creating a virtual perimeter around a physical space. When a device crosses one of these boundaries, an automatic event is triggered — a push notification is sent, an app launches, or a check-in is recorded.
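At its core, that boundary test is simple geometry. The sketch below is hypothetical — Gimbal’s actual SDK is not shown — but it illustrates the underlying idea: check whether a device’s coordinates fall inside a circular virtual perimeter (the coordinates and fence radius are invented for illustration).

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(device, fence_center, radius_m):
    """True if the device's (lat, lon) lies within the fence radius."""
    return haversine_m(*device, *fence_center) <= radius_m

# A 100-meter fence around a hypothetical gym
gym = (40.7484, -73.9857)
if inside_geofence((40.7485, -73.9855), gym, 100):
    print("Entered gym geofence: launch the workout mix")
```

In practice the platform does this monitoring continuously in the background and fires the registered event on entry or exit; the app only supplies the fence and the action.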
Just as many new cars unlock when the driver nears with a smart-key, relevant and contextual experiences will unlock for us as we go about our daily lives wielding our smartphones.
So if TheFuture.Fm launched an app built on Gimbal, it could geofence all of the local gyms and set a DJ-curated workout mix to launch every time a user enters. Or Live Nation could geofence all of its venues, so that walking through the doors triggers a notification for a drink discount or an upcoming concert promotion informed by your Facebook, Bandsintown, or SuperGlued profile.
Image recognition will likely displace the QR-code. Instead of scanning an abstract barcode, we can simply take a picture of a real place, person, or object. The real-life image is recognized by a Gimbal-based app, which then triggers a specific event.
For example, an artist could project a promotional image on the backdrop at their show that, when captured, automatically loads their mobile app, adds their latest album to our online music collection, or follows them on Twitter. Apps like Songza could use image recognition to serve us an outdoorsy playlist when we snap a photo of a hiking trail.
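Once a photo has been classified, triggering the right event reduces to a lookup. The sketch below assumes a recognizer (Gimbal’s real API is not shown) has already turned a captured image into a label string; the labels and actions are invented for illustration.

```python
# Hypothetical label-to-action dispatch for image-triggered events.

def follow_artist_on_twitter():
    return "followed artist"

def queue_outdoor_playlist():
    return "playing outdoorsy mix"

# Each recognized image label maps to the event it should trigger.
ACTIONS = {
    "tour_poster": follow_artist_on_twitter,
    "hiking_trail": queue_outdoor_playlist,
}

def handle_photo(label):
    """Run the action registered for a recognized image label, if any."""
    action = ACTIONS.get(label)
    return action() if action else "no action registered"

print(handle_photo("hiking_trail"))  # → playing outdoorsy mix
```

The hard part, of course, is the recognizer itself; the dispatch layer is what app developers would actually wire their features into.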
Image recognition gets particularly interesting with products like Google Glasses on the horizon, because a digital world will overlay our physical world. In the next five years, a single glance at a festival stage could start filming the event and load a digital schedule right in our glasses — complete with profiles and links for each band that we can browse hands free.
Lastly, Gimbal’s interest sensing feature uses data collected from our behaviors and activities to create a profile of who we are and what we do. This interest graph serves an important purpose; it helps our devices and apps personalize to us. If we’re into politics and punk rock our devices will know this and cater news items, playlists, and more accordingly.
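A toy version of that idea: tally tags from observed activities into a profile, then score candidate content against it. This is a minimal sketch, not Gimbal’s actual method, and the activities, tags, and items are invented for illustration.

```python
from collections import Counter

# Observed activities, each tagged with interests they signal.
activities = [
    ("listened", ["punk", "rock"]),
    ("read", ["politics"]),
    ("listened", ["punk"]),
]

# The interest profile is just a weighted tag count.
profile = Counter(tag for _, tags in activities for tag in tags)

def score(item_tags):
    """Score an item by how strongly its tags overlap the profile."""
    return sum(profile[t] for t in item_tags)

items = {
    "punk show downtown": ["punk", "concert"],
    "celebrity gossip": ["gossip"],
}

# Recommend the item that best matches the profile.
best = max(items, key=lambda name: score(items[name]))
print(best)  # → punk show downtown
```

Real interest graphs weight far richer signals (location, time, social connections), but the principle is the same: behavior becomes a profile, and the profile ranks everything else.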
This feature in particular underscores the potential context-aware technology has to revolutionize our lives. It’s the feature that transforms our devices from machines we passively use to extensions that truly know and understand us as individuals — narrowing down the infinity of choices to the particular book, show, or song that suits us.
Making Sense of the Millions
Nowadays, we have plenty of access to music, but not enough guidance.
“Songs used to be discrete products from artists to fans; now they’re becoming more like temporary tattoos,” writes reporter Eliot Van Buskirk of Evolver.FM. “Music is finally everywhere.” Anyone can play anything in seconds through a myriad of online music services, he says, but “it’s not just the song that matters; it’s the context.”
While the demand for music is greater than ever, the burden of choice has begun to weigh heavily on listeners — especially mainstream listeners who are still adapting to the world of discovery and engagement beyond broadcast radio and a few concerts a year.
If integrated properly, context-aware platforms like Gimbal may bring the music industry its most revolutionary breakthrough since the Internet: a personal record store geek that confidently guides us through life with a unique and evolving soundtrack. This geek will compel us to discover new bands, attend more shows, buy more merch, and generally remember the magic of fandom that, for most of us, disappeared with the death of vinyl and the CD.
With the rise of context culture, our temporary musical tattoos may become more permanent and profitable. And the developments in technology that once brought the music industry to its knees could enliven and perhaps even empower it for years to come.
(Photo Credit: Flickr)
Alex Hoffman is a senior writer at sidewinder.fm and principal at Subversive Inc., a management and consultancy group. Hoffman previously founded the non-profit DJ mentoring program, Scratching For Success, and was the Director of Artist Services at Grooveshark.