The mobile landscape and Apple’s iOS are evolving at the speed of light.
One of the best ways to keep up with this ever-changing ecosystem is the VTM iOS Developer Conference, where you can attend talks by domain experts and meet many fellow developers. It’s a great way to keep your skills sharp and find inspiration.
A couple of us attended the conference last weekend in Boston (Nov 12-13). We enjoyed an interesting variety of both technical and philosophical discussions about new features in iOS 5, development strategies, and the future of mobile.
This year marked Jeff’s third time speaking at a VTM conference. He gave a talk about enriching the user experience of mobile apps by going beyond the two-dimensional interface of the touchscreen. Jeff discussed how to use the many sensors available on smartphones (e.g. gyroscope, accelerometer, compass) as complementary inputs.
Staying true to the premise of his talk, Jeff built a sample app for the audience step by step, with a brand new UI for “time travel”: his app set the “time” by physically rotating the device, just like the arms of an analog clock. (Check out his slides here.)
Another highlight for me was the closing address, “Beyond the Gold Rush”, by Brent Simmons. He talked about the past, present, and future of mobile. Brent predicted that the next 5 years will be even more magical and revolutionary.
He challenged the audience to realize this future by “solving the hard problems,” because mainstream users have now embraced mobile devices and are ready to go way beyond the fart and flashlight apps.
We accept the challenge. Let’s get to it!
Overall the VTM conference was a great way to spend a beautiful New England weekend with comrades.
Next month a couple of us are headed to the VTM iPhone Developers Conference in Seattle! (April 9-10) It should be a great conference covering the latest trends, and with Erica Sadun as the technical chair, it won’t shy away from getting technical.
I’m on the speaker roster for the second time at a VTM conference. Last fall, I spoke about CoreMotion and AR (see my slides here). This time, I’ll be talking about Computer Vision and Augmented Reality. I will cover how pervasive cameras are becoming (now on the iPad!), and how they really aren’t just about photos and video anymore. Just as phones are less and less about phone calls and more about powerful apps, cameras are slowly becoming less about snapshots and more about interacting with reality. I talked about this a little over a year ago in an O’Reilly Where 2.0 talk, so it will be fun to see how far we’ve come and how far we still have to go before we all have Iron Man vision in our smartphones and tablets.
In addition to talking about trends and hopefully inspiring some ideas for the future, I’ll share some code that is applicable when dealing with the camera which will be useful in augmented reality apps and any kind of realtime video processing. I’ll cover some important aspects of using AVFoundation and OpenGL.
If you’re not registered for the conference, there’s still time to sign up, and you can get $150 off by using the special code SEAMAL6. (That’s like getting the early-bird discount even though you’re clearly a late bird!)
PS: We’re hiring top notch iPhone/iPad engineers. Get in touch if you’re interested, or say hello at the conference if you’re attending.
If you have the latest iPhone OS (4.2+) and an iPhone 4 or iPod touch 4th gen, check out the new gyro-enabled panorama viewing we just launched!
Quick update: We always get a little annoyed when technical terms are misused, so we’re the first to admit that this new feature isn’t really Augmented Reality [also pointed out by Gizmodo, DownloadSquad], but we thought AR evoked the type of controls we’ve enabled, so we titled this post accordingly. But to be accurate, it’s true 3DOF gyroscope navigation of a simulated camera view — and it’s pretty cool! And if you want to see some actual AR work we’ve done, check this out.
- Live now for every panorama uploaded so far.
- Requires iOS 4.2+
- Allows you to physically spin around as if you were actually at the scene of the panorama.
- Doesn’t currently auto-align with compass north, but you can use the touch controls to orient with north.
Let us know what you think and how we can improve it!
For the final installment of Off-limits Panoramas, we head down into the New York subway system to answer “Why does speed matter?” when you’re capturing a panorama. This time, our opposition isn’t the park rangers, or even the security guards.
No, this time around, it’s a 350-ton, fast-approaching subway train.
And by the way, we launched 360 Panorama 3.0 today!
This past Monday, I posted an introduction to Off-limits Panoramas (Part 1) — where we basically break the rules and take a panorama (using 360 Panorama v 3.0 — coming December 1 to the iTunes App Store) in a place where it’s not exactly allowed. We do this all in the name of answering one key question: Why does speed matter when it comes to capturing panoramas?
Today we’ve got another exciting exploit in NYC, this time in the Met Life building, considered one of the most secure buildings in New York City.
This time around, Vikas sticks his neck out to capture the interior of this historic building’s main floor plaza:
Here’s the actual panorama.
Off-limits Panoramas Part 3, the Finale: Coming Wednesday.
360 Panorama is fast. In fact, it’s the fastest panorama capture app — period.
But why does speed matter?
Well, last week in New York City, Vikas and I decided to get our hands dirty and show off one reason why speed matters when you’re capturing a panorama. Get ready, because we’re going where no panorama has gone before — we’re going off-limits, starting with the picturesque Sheep Meadow in NYC’s Central Park.
Without further ado, here’s the first of a three-part segment of off-limits panoramas, where we go successively more off-limits:
You can see the actual pano right here.
Off-limits Panoramas Part 2: Coming later this week.
Since the sale of RedLaser to eBay, people have been asking what’s next for Occipital. While RedLaser has been in the limelight, the team has been hard at work on our next-generation computer vision application.
Today we’ve launched 360 Panorama (app store: $2.99): realtime panorama creation for your iPhone 4 or 3GS. Gone is the need to stitch together a series of still photographs. With 360, simply pan your phone in any direction and watch as our computer vision system builds a panorama in realtime. Capturing a panorama on your phone has never been easier.
360 Panorama is our first major release since RedLaser, and we’ve surmounted many challenges to make it happen. We’re proud to say there’s simply no other app like it.
Here’s a short video preview of 360 Panorama in action:
Video updated 12/14/10 to feature v 3.0!
Tech Note: The first release of 360 Panorama is optimized to make panorama capture extremely fast and easy so that you can share panoramas very rapidly. As we improve the precision of generated panoramas, we’ll also add options for higher-resolution outputs. The current resolution of a 360-degree panorama is 4096 pixels wide, updated from the original 2048 (which was already wider than most monitors).
Occipital has grown up a little bit since August 2009. We survived 2009 by running on fumes and building a shiny stage-1 rocket booster called RedLaser. Back then, we were happy that RedLaser had been installed on 95,000 iPhones. RedLaser has now been installed on over 2 million iPhones. The growth was pure word of mouth (see Vikas’s November 27, 2009 Newsweek article about it). And as we recently reflected with UX Magazine, we learned that user experience was paramount to success.
2010. We spent the early months of 2010 supercharging RedLaser. We added data feeds, local results, and new barcode formats, and we licensed RedLaser’s technology to 70 companies including Target. Apple featured RedLaser in a TV ad that premiered in primetime during Lost, scanning a bright red Gaggia espresso machine.
It was around this time that we realized RedLaser had outgrown our basement office. Occipital is fundamentally a computer vision technology company, but we had transitioned to spending most of our time fueling a large-scale mobile commerce tool. We started laying the groundwork to scale RedLaser up and away from our core engineering focus. Around that same time, we started kicking around ideas with a little company called eBay that just happened to specialize in large-scale mobile commerce tools. Which brings us to today’s announcement.
Today we’re announcing that RedLaser has been acquired by eBay, Inc. We are confident that eBay is a truly better home for RedLaser than Occipital. Not only will RedLaser continue to thrive (now free for the first time on the App Store), but we’re also excited to report that the RedLaser SDK and all of the companies it supports will continue and expand under eBay. If you’re wondering, Occipital remains a freestanding company and we will not be moving over to eBay. eBay has an entirely new team running RedLaser.
Tomorrow. We’re really just getting started. Remember, we were just on the crawler with our partially-completed rocket. We no longer have our stage-1 rocket booster, but we have something even better in the works – a stellar engineering team:
Robert Grant, who holds a master’s in Computer Science from the University of Michigan, joined the team on February 18 as Occipital’s first Computer Vision Engineer. Since then, Rob has been leading development of Occipital’s next major release, our first step toward a human-computer interface that blends seamlessly with human vision, which will be Occipital’s primary focus for the foreseeable future.
Rémi Chaignon, currently based in Paris, starts remotely in July as our first Augmented Vision Engineer and will join us in Boulder this October. Rémi worked at the University of Teesside on the fundamentals of an augmented reality game engine dubbed GEAR.
Shaun Werkhoven, the most recent to accept our invitation to join the team, has a PhD in Computer Vision from the University of Newcastle. Shaun, who resides in Sydney until his move to Boulder next month, studied interest points as applied to object recognition and 3D reconstruction for his thesis. As a Computer Vision Engineer, Shaun will play a crucial role in applying his research to the problems we’re solving.
Without a doubt, this is the most exhilarating time in company history. We’re looking forward to the formal launch of Occipital now that we’re refocused on what we do best as a company: computer vision. And we can’t wait to watch as eBay accelerates and improves RedLaser with a new, dedicated team.
If you haven’t already downloaded RedLaser – it’s free today.
I love photography. And I like the fact that I can take digital photos on a device that fits in my pocket and is always with me. However, I hate the fact that many of the pictures I take come out blurry just because there is a little motion in the scene or because I didn’t have the camera perfectly still for the shot.
I’m tired of guessing when everyone is standing still enough to get a good picture. I’m tired of working so hard to keep the camera still when I take a picture. Most importantly, I’m tired of missing the truly special moments just because I can’t get the camera to stay still at the same time as everyone in the scene. There has to be a better way!
When I found what appeared to be the solution, I was ecstatic: ClearCam’s quickshot mode. A camera that took several shots and then just kept the best one – the one where the camera was still AND motion in the scene was minimal. Brilliant!
Then disappointment set in when I realized it wasn’t in a condition to be released through the App Store. So, long story short, I called up Jeff and Vikas and said, “What’s up with that? Don’t you realize what you’ve developed is the perfect solution to this problem?” Of course they knew that, but they were way too busy with the success of RedLaser to put energy into it. Kudos for that, but seriously, I was missing way too many moments with this mobile camera that had everything going for it but the ability to reliably capture moments.
With nowhere else to turn, I reluctantly dusted off my Xcode development environment, formed a partnership with Occipital, and off we went. In the process, we’ve taken ClearCam’s quickshot mode to the next level by virtually eliminating the time required between shots (just in case there’s more than one moment to capture, or you’re just not sure which moment you’ll need).
And finally, for the first time ever, I find myself trusting my iPhone camera. Seriously trusting it. For me, it has become one of the only apps I use on a daily basis. I hope it changes mobile photography for others as much as it has for me. It’s been fun guys, thanks.
Now if only there were an app that would help me improve those once in a lifetime shots I keep coming across without my DSLR. There has to be a better way…
If you were watching Lost, NCIS, Parenthood, or Melrose Place on Tuesday 4/6, you might have noticed this commercial featuring RedLaser:
ProductWiki, a collaborative product review and information site, was also featured in the commercial. Check out the extension they created to search ProductWiki using RedLaser here (using our Custom Apps feature).
We just recently set up a display that shows scans happening in realtime at Occipital HQ. Here is what happened after the commercial aired: