Wednesday, October 29, 2008

Parsing API Results - XML vs. JSON

The API has an option to return results in JSON format instead of XML (simply add output=json to the URL call). I'm investigating ways to parse JSON to see if it's easier than XML.
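As a tiny sketch of what that switch looks like (the URL here is illustrative, not the real Netflix endpoint):

```objc
// Hypothetical sketch: requesting JSON instead of XML output.
// apiCall stands in for whatever signed request URL the app already builds.
NSString *apiCall = @"http://api.example.com/users/current"; // illustrative only
NSString *jsonCall = [apiCall stringByAppendingString:@"?output=json"]; // same call, JSON results
```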

My open source Google Code efforts are on hold for now, since I have much more demand for new features in my app than people asking for source code. Let me know if you want code and I will assist on a case by case basis. Later on I will post cleaned-up code to the Google Code project.

Friday, October 10, 2008

Netflix API - Parsing the results of an API call - Part 5

One of the first API calls I needed returns the user's name and other information. In my case all I need is the first and last names, plus the flag that says whether the user can use instant watching. This flag is true for main accounts, and false for account profiles that don't have an instant queue.

The return data from the API call is in XML format, so this code shows how to use NSXMLParser to pick information out of the API response. I provide a convenience method that combines the first and last names. I didn't need the other link information, so it is ignored, but the code could easily be extended to pick it out as needed.

Some of the code below doesn't display properly. I'm going to set it up on Google Code soon to make it easier to manage.

// NetflixUser.h
// Instant Flix
// Created by Adrian Cockcroft on 10/8/08.
// Copyright 2008
// Licensed under Creative Commons Attribution Share-Alike

#import <Foundation/Foundation.h>

@interface NetflixUser : NSObject {
    NSData *rawData;
    NSString *first_name;
    NSString *last_name;
    bool can_instant_watch;
    NSXMLParser *userParser;
    NSString *currentElement;
}

@property(readonly) bool can_instant_watch;

- (NetflixUser *)initWithAPIResponse:(NSData *)response;
- (NSString *)name;

@end


// NetflixUser.m
// Instant Flix
// Created by Adrian Cockcroft on 10/8/08.
// Copyright 2008
// Licensed under Creative Commons Attribution Share-Alike

#import "NetflixUser.h"

/* Sample returned raw data omitted -- the XML was stripped by the blog's formatting */

@implementation NetflixUser

@synthesize can_instant_watch;

- (NetflixUser *)initWithAPIResponse:(NSData *)response {
    self = [super init];
    rawData = response;
    [rawData retain];
    first_name = nil;
    last_name = nil;
    can_instant_watch = NO;

    //NSString *responseBody = [[NSString alloc] initWithData:response encoding:NSUTF8StringEncoding];
    //NSLog(@"NetflixUser: %@", responseBody);

    userParser = [[NSXMLParser alloc] initWithData:rawData];

    // Set self as the delegate of the parser so that it will receive the parser delegate method callbacks.
    [userParser setDelegate:self];
    [userParser setShouldProcessNamespaces:NO];
    [userParser setShouldReportNamespacePrefixes:NO];
    [userParser setShouldResolveExternalEntities:NO];
    [userParser parse];

    return self;
}

- (NSString *)name {
    return [first_name stringByAppendingFormat:@" %@", last_name];
}

- (void)parserDidStartDocument:(NSXMLParser *)parser {
    //NSLog(@"started parsing");
}

- (void)parser:(NSXMLParser *)parser parseErrorOccurred:(NSError *)parseError {
    NSString *errorString = [NSString stringWithFormat:@"Unable to parse XML (Error code %i)", [parseError code]];
    NSLog(@"error: %@", errorString);
}

- (void)parser:(NSXMLParser *)parser didStartElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName attributes:(NSDictionary *)attributeDict {
    //NSLog(@"found this element: %@", elementName);
    currentElement = [elementName copy];
}

- (void)parser:(NSXMLParser *)parser didEndElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName {
    //NSLog(@"ended element: %@", elementName);
    currentElement = nil;
}

- (void)parser:(NSXMLParser *)parser foundCharacters:(NSString *)string {
    //NSLog(@"found characters: %@", string);
    // save the characters for the current element...
    if ([currentElement isEqualToString:@"first_name"]) {
        first_name = [string copy];
    } else if ([currentElement isEqualToString:@"last_name"]) {
        last_name = [string copy];
    } else if ([currentElement isEqualToString:@"can_instant_watch"]) {
        if ([string isEqualToString:@"true"]) {
            can_instant_watch = YES;
        } else {
            can_instant_watch = NO;
        }
    }
}

- (void)parserDidEndDocument:(NSXMLParser *)parser {
    //NSLog(@"found %@ who %@ instant watch", [self name], (can_instant_watch ? @"can" : @"cannot"));
}

@end
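A hypothetical usage sketch tying it together (here, response stands for the NSData returned by the signed API call):

```objc
NetflixUser *user = [[NetflixUser alloc] initWithAPIResponse:response];
NSLog(@"%@ %@ use instant watching", [user name],
      user.can_instant_watch ? @"can" : @"cannot");
[user release];
```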


Wednesday, October 1, 2008

Netflix API - Netflix Specific OAuth iPhone Code - Part 4

I have already discussed how to get OAuth to build for the iPhone in Part 2. To use OAuth to call Netflix, two small changes are needed. The first is that the way characters are escaped in URLs needs to be tightened up a bit; otherwise the signature strings will work some of the time and fail when they happen to contain the wrong character sequence. This took a while to figure out...

The file NSString+URLEncoding.m needs to have a few characters (space, plus and asterisk) added as shown below:

Note: I'm having a hard time getting code to look good here; it's hard to find a way to render code in a narrow column that doesn't mess up the formatting and works in more than one browser using the available tools and templates.

- (NSString *)encodedURLString {
    NSString *result = (NSString *)CFURLCreateStringByAddingPercentEscapes(kCFAllocatorDefault,
                                                                           (CFStringRef)self,
                                                                           NULL, // characters to leave unescaped
                                                                           CFSTR("?=& +*"), // legal URL characters to be escaped
                                                                           kCFStringEncodingUTF8); // encoding
    return [result autorelease];
}

- (NSString *)encodedURLParameterString {
    NSString *result = (NSString *)CFURLCreateStringByAddingPercentEscapes(kCFAllocatorDefault,
                                                                           (CFStringRef)self,
                                                                           NULL, // characters to leave unescaped
                                                                           CFSTR(":/= +*"), // legal URL characters to be escaped
                                                                           kCFStringEncodingUTF8); // encoding
    return [result autorelease];
}

The second thing is that when the authentication is complete, Netflix returns a token that contains an encoded user identifier, as well as a secret and a key. The standard OAuth code only expects the secret and key, so a new NetflixToken class was created to hold the extra information, and to persist it in the iPhone's defaults store along with the secret and key. This means that once the user has signed in via OAuth once, this information is saved and they never have to sign in again unless either Netflix or the user revokes the token. One more method was added to remove an entry from the defaults store, for use during logout.

First the NetflixToken.h header file:

// NetflixToken.h
// Instant Test
// Created by Adrian Cockcroft on 9/11/08.

#import "OAToken.h"

@interface NetflixToken : NSObject {
    NSString *key;
    NSString *secret;
    NSString *user;
}

@property(copy, readwrite) NSString *key;
@property(copy, readwrite) NSString *secret;
@property(copy, readwrite) NSString *user;

- (id)initWithKey:(NSString *)aKey secret:(NSString *)aSecret user:(NSString *)aUser;
- (id)initWithUserDefaultsUsingServiceProviderName:(NSString *)provider prefix:(NSString *)prefix;
- (id)initWithHTTPResponseBody:(NSString *)body;
- (int)storeInUserDefaultsWithServiceProviderName:(NSString *)provider prefix:(NSString *)prefix;
- (int)removeFromUserDefaultsWithServiceProviderName:(NSString *)provider prefix:(NSString *)prefix;
- (OAToken *)oaToken;

@end


Then the code itself. It is all based on a simple extension of OAToken, which is part of the OAuth code base mentioned in Part 2.

// NetflixToken.m
// Instant Test
// Created by Adrian Cockcroft on 9/11/08.

#import "NetflixToken.h"

@implementation NetflixToken

@synthesize key, secret, user;

#pragma mark init

- (id)init {
    self = [super init];
    self.key = @"";
    self.secret = @"";
    self.user = @"";
    return self;
}

- (id)initWithKey:(NSString *)aKey secret:(NSString *)aSecret user:(NSString *)aUser {
    self = [super init];
    self.key = aKey;
    self.secret = aSecret;
    self.user = aUser;
    return self;
}

- (OAToken *)oaToken {
    return [[OAToken alloc] initWithKey:self.key secret:self.secret];
}

- (id)initWithHTTPResponseBody:(NSString *)body {
    self = [super init];
    NSArray *pairs = [body componentsSeparatedByString:@"&"];

    for (NSString *pair in pairs) {
        NSArray *elements = [pair componentsSeparatedByString:@"="];
        if ([[elements objectAtIndex:0] isEqualToString:@"oauth_token"]) {
            self.key = [elements objectAtIndex:1];
        } else if ([[elements objectAtIndex:0] isEqualToString:@"oauth_token_secret"]) {
            self.secret = [elements objectAtIndex:1];
        } else if ([[elements objectAtIndex:0] isEqualToString:@"user_id"]) {
            self.user = [elements objectAtIndex:1];
        }
    }
    return self;
}

- (id)initWithUserDefaultsUsingServiceProviderName:(NSString *)provider prefix:(NSString *)prefix {
    self = [super init];
    NSString *theKey = [[NSUserDefaults standardUserDefaults] stringForKey:[NSString stringWithFormat:@"OAUTH_%@_%@_KEY", prefix, provider]];
    NSString *theSecret = [[NSUserDefaults standardUserDefaults] stringForKey:[NSString stringWithFormat:@"OAUTH_%@_%@_SECRET", prefix, provider]];
    NSString *theUser = [[NSUserDefaults standardUserDefaults] stringForKey:[NSString stringWithFormat:@"NETFLIX_%@_%@_USER", prefix, provider]];
    if (theKey == nil || theSecret == nil) {
        // nothing stored yet, so there is no saved token to restore
        [self release];
        return nil;
    }
    self.key = theKey;
    self.secret = theSecret;
    self.user = theUser;
    return self;
}

- (int)storeInUserDefaultsWithServiceProviderName:(NSString *)provider prefix:(NSString *)prefix {
    [[NSUserDefaults standardUserDefaults] setObject:self.key forKey:[NSString stringWithFormat:@"OAUTH_%@_%@_KEY", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] setObject:self.secret forKey:[NSString stringWithFormat:@"OAUTH_%@_%@_SECRET", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] setObject:self.user forKey:[NSString stringWithFormat:@"NETFLIX_%@_%@_USER", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] synchronize];
    return 0;
}

- (int)removeFromUserDefaultsWithServiceProviderName:(NSString *)provider prefix:(NSString *)prefix {
    [[NSUserDefaults standardUserDefaults] removeObjectForKey:[NSString stringWithFormat:@"OAUTH_%@_%@_KEY", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] removeObjectForKey:[NSString stringWithFormat:@"OAUTH_%@_%@_SECRET", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] removeObjectForKey:[NSString stringWithFormat:@"NETFLIX_%@_%@_USER", prefix, provider]];
    [[NSUserDefaults standardUserDefaults] synchronize];
    return 0;
}

@end

Netflix API - Announcement - Part 3

Here is the official announcement of the API and how to get access to it.

Starting Wednesday, Oct. 1 the Netflix API is open to all:

The Netflix API:

- Allows access to data for 100,000 movie and TV episode titles on DVD, as well as Netflix account access on a user's behalf
  - Netflix has more than 2 billion ratings in its database
  - Netflix members rate more than 2 million movies a day
  - Netflix ships more than 2 million DVDs on a typical day

- Is free

- Allows commercial use
  - E.g. if a developer creates an iPhone app and wants to sell it for $0.99, that's ok

Technically, the Netflix API:

- Includes a REST API, a Javascript API, and ATOM feeds

- Uses OAuth standard security to allow the subscriber to control which applications can access the service on his or her behalf

Developers can get access:

- Starting 10/1

- By self sign up at

Wednesday, September 17, 2008

Netflix API - Getting OAuth to work on iPhone - part 2: Adding the OAuth Code

To start with, I found some examples by Nick Dalton that helped me build a simple application that included a Web View: a screen that acts like a web browser, but with my custom Objective-C code embedded in it. This is important because the OAuth sign-in process uses a web page, and on the iPhone, if you spawn a copy of Safari to visit a web page, your application quits first.

Next, the Source Code Manager in Xcode was configured to load the OAuthConsumer Objective-C framework via Subversion. This was easy and obvious: enter the URL and check out the code.

When trying to import the framework I discovered that Apple does not allow user-specified binary frameworks to be added to iPhone applications. To work around this, the source code was copied from the Xcode project for the framework to the Xcode project for my Instant Test application. I renamed the framework's Classes folder to OAuth and copied it to my project via drag and drop, choosing to copy the underlying files. The Cocoa Categories, Protocols and Other Sources>Crypto folders were also copied. The Tests folder did not compile for the iPhone, so don't bother to copy it over.

The standard system Security.framework doesn't need to be added to the Frameworks folder. I initially thought it did, but it's probably only needed for the KeyChain code.

The iPhone doesn't support the KeyChain functionality, so if you try to build for the iPhone it will fail. It does, however, build for the iPhone Simulator, which is confusing. Open up the OAuth source code and delete the last two files, OAToken_KeychainExtensions.h and OAToken_KeychainExtensions.m.

Since the code is no longer a framework, the header file references in OAuthConsumer.h need to be changed from framework-style imports to quoted local imports, for all the imports apart from the first Foundation one.
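As a sketch of the change (OAToken.h is one of the OAuthConsumer source files; the framework-style form shown before is my reconstruction of the original):

```objc
// Before (framework-style, only works when OAuthConsumer is a real framework):
#import <Foundation/Foundation.h>
#import <OAuthConsumer/OAToken.h>

// After (the source files are now local copies in the project):
#import <Foundation/Foundation.h>   // the Foundation import stays as-is
#import "OAToken.h"
```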

At this point, before you try to call anything, try a build; it should compile with no errors. If it doesn't, look for missing files.

You should now have all you need to connect to an OAuth service.

Tuesday, September 16, 2008

Netflix API Getting OAuth to work on iPhone - Instant Queue Add part 1: why?

Why develop an iPhone app? It's "the future", a useful skill, and I can carry whatever I develop in my pocket and make it do whatever I want.

I work at Netflix and have instant watching on my TV. I built an application, "Instant Queue Add", that lets me add a title to my instant queue using my iPhone in a couple of touches. It takes the Top20 and New Choices RSS feeds to find content, and it spawns a copy of Safari to add each pick to the instant queue. The first time it starts you have to log in; after that it remembers the Netflix cookie. However, I really want to add more features and avoid spawning a copy of Safari with a screen-scraped URL.

The Netflix API uses a new open standard for authorization called OAuth. There are lots of features, but it's complex, and there is no standard off-the-shelf library for OAuth on the iPhone. However, it is a useful building block for more advanced applications.

In this series of posts, I will document the steps I'm taking to get OAuth to work on the iPhone using Objective-C and Xcode. My starting point is this code base and tutorial by Jon R. Crosby, which is based on desktop Mac OS X and doesn't directly support the iPhone.

Monday, September 8, 2008

Intel high speed SSD

Here is Tech Report's review of Intel's high speed SSD, which confirms the trend I've been talking about for a while. SSDs were always faster for random reads and writes; now they are faster for sequential reads (250MB/s), and the "extreme" Intel model is faster for sequential writes as well (170MB/s). They use less power and have comparable MTBF to the best enterprise disk drives.

The remaining disadvantages of total size and cost are being eaten away over time...

Wednesday, August 6, 2008

The Physical Web

Nat Torkington on O'Reilly Radar declares a familiar theme...

The next step for computing is to move out from the computers. Every device has the potential to become network-connected, delivering information to or from a web service. The mobile phones in our pockets also let us take apps and network service with us wherever we go. Early hackers are building in this space. Big challenges include: take products to the masses and the environmental impacts. We're at early stages yet, but the room for expansion is huge. Projects to watch include Nokia's reinvention as services company, iPhone, and Google's Android platform.

Thursday, July 24, 2008

Consumer grade Flash in an SSD package - WAIF

RegHardware mentions Raidon's Compact Flash in a 2.5" SATA disk form factor which can be loaded up with cheap CF cards (32 Gig for $100 at the time of writing). The Raidon package holds two CF cards which can be mirrored for safety, or striped/concatenated (it's not clear which) using "NRAID", which doesn't require both CF cards to be the same size.

I'd like to see a similar concept go even further using microSDHC; it should be possible to get a Wide Array of Inexpensive Flash (WAIF) based drive with consumer pricing and very high storage capacity and bandwidth. It's going to be appropriate for read-mostly workloads such as personal use in laptops, static content web serving and archival storage.

Tuesday, July 8, 2008

Next-Generation Mobile Broadband - The 4G Summit at PARC

There is a talk at Xerox PARC next week sponsored by the Wireless Communications Alliance. They don't have URLs that link to specific events, so here is the full description, currently listed on their site. Unfortunately I have a work commitment so I can't attend, but they do mention many of the very advanced ideas that I have been talking about in my Millicomputing talks, such as video conferencing over high bandwidth mobile networks.

Tuesday, July 15th 2008, 4:00pm - 6:00pm
WCA CenterStage presents: Next-Generation Mobile Broadband - The 4G Summit
Venue: Palo Alto Research Center, Palo Alto CA

-- Moderator: Iain Gillott, Founder & President, iGR Inc.

-- Jake MacLeod, Principal Vice President & CTO, Bechtel
-- Barry Davis, Exec Director of Products & Services, Clearwire
-- Jim Orr, Principal Network Architect, Fujitsu Network Communications
-- Jon Hambidge, EVP & Chief Marketing Officer, NextWave Wireless
-- Samir Khazaka, Sr Director Technical Marketing, QUALCOMM
-- Gennady Sirota, VP Product Management, Starent
-- Lee Tjio, Director of Advanced Technology & Strategy, Verizon Wireless

Visionaries who speak about fourth-generation mobile technology (aka 4G) often allude to the tantalizing promise of services and features previously found only in science fiction; interactive holographic video, handheld devices with high-resolution (better than HDTV) images, streaming HD video conferencing and real-time interaction while mobile. 4G also promises a convergence between technologies, for example; mobile payments using near-field communications and handset-based smart cards, personal assistant technologies in which your mobile device will interact with networked devices and services based on your location/schedule/current actions/etc. Implementing the 4G vision of the future will require a bandwidth of at least 100Mbps, which has implications for spectrum policy not supported by current licensing and bandplans.

It's generally accepted that 4G will run over an IP infrastructure, will interoperate with 802.xx technologies (Wi-Fi, WiMAX, Bluetooth, ZigBee, etc), and will need to support data-rates from 100Mbps to as high as 1Gbps. It's also expected that 4G will be a collection of technologies and protocols; versus one single standard. There are at least three major camps (and a few upstarts) that aspire to be the major 4G mobile data service provider and have a dominant influence at defining the mobile broadband market for decades to come. Will it be one of the major camps, or will there be a dark horse that emerges?

On July 15th 2008 the Wireless Communication Alliance CenterStage will proudly present "Next-Generation Mobile Broadband : The 4G Summit". Stakeholders from various camps around the 4G battleground will come together under a flag of truce to debate the strengths and weaknesses of the approach on which they're betting. What's real, and what's simply hype? Who will be the first to achieve 4G ratification, and when is a realistic date when this will happen? What are the implications for technology vendors, service providers, and content developers? Are there any non-Western standards also likely to be contenders? How will the industry address spectrum licensing challenges and bandplans which today would seem to favor FDD versus TDD technologies? How will spectral refarming, cognitive radio, and spectrum-sharing technologies affect the market? Given that the evolution of technology demands that existing 3G systems will have to co-exist with future 4G systems; how will that transition take place and are there business opportunities in helping to facilitate that transition?

Insights, information, and understanding will be the take-aways from this exciting WCA event. How can you afford to not be there? Mark your calendar for July 15th 2008 and plan to attend! We expect this event to sell out, so to ensure your seat we recommend that you register now for this event.

Cost: $20 at the door, $15 in advance via PayPal/Credit Card

Wonderland - Immersive Virtual World

Interesting discussion of Sun's Wonderland project. Of particular interest is that they are using attenuation and stereo audio to place voices in space, so as you move around, people come into earshot. It's also an open source project, and this is the kind of audio interface I've been discussing for a while.

LED Displays in Contact Lenses

There is some research going on to develop LED Contact Lenses, reported in the Guardian, and mentioned in Guy Kawasaki's blog.

This looks like a neat alternative to video eyeglasses, however the image will move with your eye, so you will see it superimposed over everything else. With eyeglasses the image moves relative to your head, so you can focus attention on a specific part of the image as you move your eyes. However, this is probably the least of their problems as they try to develop the idea...

Thursday, June 26, 2008

Computerworld and others on Millicomputing

Nice writeup of my Usenix talk by Sharon Machlis, Computerworld, June 25, 2008.

Infoworld copy.

Version at PC World. There were several blogs also copying this story.

From some comments it appears that some people didn't get that I was talking about using "e-sunglasses" head mounted video displays rather than a big laptop screen while on the move.

Intel Atom Reviews

The Register has done a useful review of the current state of Intel Atom CPU systems and motherboards. It is still an order of magnitude more power than a millicomputer, but it's an order of magnitude smaller than most other Intel architecture systems, so it's interesting to see the current state of the art.

Monday, June 23, 2008

BAFuture event on crowdsourced mobile - buglabs

I just listened in to a talk from Buglabs, via Ustream video. Worth checking out. Buglabs have a nice set of modules that can be used to build homebrew devices.

Sunday, June 15, 2008

Video streaming from phones

Robert Scoble talks about the capabilities of three services on TechCrunch. When I gave my talk on Millicomputing at the BIL conference, Robert was sitting at the front streaming my talk to Qik from his cellphone.

In the near future I think this will become an important area, as continuously streamed video conferencing becomes ubiquitous and adds two-way support. The only limitations that prevent it are network bandwidth, battery life and the software that manages the service. Network bandwidth is already sufficient, battery life is improving rapidly, and these three companies are competing to build the software services that will eventually implement the features I have been talking about. These services are working towards computer assisted telepathy.

Monday, June 2, 2008

Arm vs. Intel

An article in Infoworld confirms the things I've been talking about for the last year or so.

ARM is responding to the threat from Intel, and talking about low power enterprise servers.

Wednesday, May 28, 2008

Samsung 256GB SSD coming later this year

The current generation of SSDs are smaller and slower than regular disks in the same form factor, but this is starting to change as products like this Samsung 256GB SSD reach the market. It has a 200Mbyte/s read speed and a 160Mbyte/s sequential write speed, in a standard 2.5" disk form factor.

So the sequence goes like this:

1) SSDs will be faster for random access (already happened)

2) SSDs will be faster for sequential access (coming later this year)

3) SSDs will be higher capacity (maybe next year?)

4) SSDs will be lower cost per GB (when production volumes ramp up)

The advantage in random access and reliability (due to no moving parts) means that relatively fewer SSDs are needed than spinning rust disks to provide the same availability and performance, so the end user cost per configured GB should switch in favor of SSDs earlier.

Thursday, May 15, 2008

AT&T plans 20Mbit/s to your phone in 2009

AppleInsider reports AT&T's plans for faster wireless networks: 20Mbit/s in 2009, moving up to 100Mbit/s in subsequent years.

Some parts of the world are already running at these speeds, but this validates my mobile millicomputing story. There is going to be an excess of bandwidth to your pocket. The applications that work out how to leverage that capacity are the ones that will take off over the coming years. Streaming video is obvious, and there it's all about the price and the variety of content. It's the non-obvious applications that will shape the future.

Tuesday, May 13, 2008

Ubiquitous Computing

Here is Nat Torkington on O'Reilly Radar talking about Ubiquitous Computing. It's a useful jumping-off point into several leading researchers' sites.

I added this comment:

The technology required to support ubiquitous computing is reaching a tipping point; in the next year or so all the obstacles will melt away and the devices we carry in our pockets will have an excess of compute power, storage capacity, network bandwidth, and battery capacity. The developer space is moving from "death by 1000 ports" on very limited platforms to two that matter, iPhone and Android, which have raised the baseline and opened up to a new breed of applications. I've been tracking and predicting this on my Millicomputing blog and talking about it at conferences like BIL, eComm etc. We have also been building our own open source homebrew mobile phone hardware....

There seems to be a current focus on urban computing, and integrating people with the dense mesh of location aware services and communication opportunities that exist in cities. I'm more interested in the effects of taking "friction" out of communications between people. This is a concept that I picked up while working at eBay. In effect eBay took friction out of selling, PayPal took friction out of payments, and Skype took friction out of communicating. That is what made those businesses take off rapidly.

So far mobile phones have also taken friction out of communicating, we don't need to be tied to a wired location to communicate. Skype has removed the frictions of cost and ease of use, and has provided improved audio and video quality while you are at your desk or toting your laptop. One of the missing links is mobile Skype, it's still a bit slow and inconvenient to have Skype in your pocket, but the mobile versions of Skype are improving and the hardware needed to make them mainstream is on the way.

I'm still seeing most people thinking of their pocket device as a relatively dumb client terminal that hooks up to web services, I think this is a blinkered view. The thing in your pocket will become your server, and the compelling applications will be the ones that take most advantage of what can be done right there right now....

Friday, May 9, 2008

LBNL "ipod supercomputer"

M. Wehner, L. Oliker, J. Shalf, "Towards Ultra-High Resolution Models of Climate and Weather", International Journal of High Performance Computing Applications (IJHPCA), April 2008.

This system proposes a custom CPU core, which makes sense given the large scale of the design. It's similar to the SiCortex machine in many ways.

Speaking at UK CMG TEC Conference - May 19-21st 2008

UKCMG TEC 2008 is near Northampton in the UK. I'm giving two of the same presentations as I gave at the US CMG, the enterprise version of my Millicomputing talk, and a half day workshop on Unix/Linux Performance.

Thursday, March 20, 2008

Wireless USB / UltraWideBand Networking - fast, low power, local wireless

I started talking about wireless video from mobile devices before I discovered that the technology is well under way: chip-sets are available, and it's using a cool new technology that has lots of nice characteristics, including very low power usage.

Infoworld on Game Changing Technologies

Nice review of UltraWideBand networking.

Summary of news on UWB, including HD video output.

Monday, March 17, 2008

eComm08 Millicomputing Talk

Here is a nice picture by Duncan Davidson of me speaking at eComm.

The talk went well and the conference was excellent. Very well set up and run as a single stream of short talks. The Computer History Museum in Mountain View was an inspired choice of venue.

My slides in pdf form are at this link.

The talk is a more technical version of the BIL talk I gave a few weeks ago. It is focussed on the next few generations of mobile devices - "The Future In Your Pocket". Some key ideas from the talks and discussions I had on this subject at BIL and eComm08:
  • Pocket devices double capacity every year (laptop every two years)
  • Intel will drive power down further to compete with ARM
  • Intel x86/x64 in my pocket means I can use desktop versions of apps like Skype
  • Wireless high-definition video out is a key new feature we could use today
  • It's a server in my pocket, web services, video streaming etc.
  • More performance at lower power in the future allows always-on services
  • Ambient presence placing OpenAL 3D audio sources in the back of your head
  • Stereo audio sunglasses with video camera, heads-up display
  • Brainwave sensor (See Neurosky) input controller, records your stress/mood
  • Lifelogging - permanently archive everything you hear, see and feel
  • Telepathy - real-time, many to many lifelogging - immersive relationships
  • Virb-ing - virtual/real blending Second Life or WoW into lifelogs and telepathy
  • The killer app for teenagers in 2010 perhaps?

Friday, February 29, 2008

BIL Talk: Millicomputing - The future in your pocket

I've written a less technical but more provocative overview of where Millicomputing could go in the mobile space and posted slides as html and slides as pdf.

These will be presented at some point over the coming weekend at the BIL un-conference.

Sunday, February 24, 2008

Millicomputing at the BIL Unconference

You have hopefully heard of the TED conference (Technology, Entertainment and Design); it's happening in Monterey, CA next week, and an impromptu un-conference called BIL is dis-organizing itself as a follow-on event in the public park next door, starting Saturday March 1st at 11am. I'm going with a bunch of friends; anyone can just turn up, bring your own camp chair, food etc.

Here is the speaker page for millicomputing at BIL

I've signed up for Twitter, since that seems to be a good way to communicate at this kind of event. I'll also try out Twitxr (pronounced twitcher), which is a simple way to collect photos as they happen.

Tuesday, February 12, 2008

Emerging Communications Conference - EComm March 12-14

The EComm Conference is a spiritual successor to last year's O'Reilly ETel conference (I was there). It's the brainchild of Lee Dryburgh, who took over when O'Reilly decided not to repeat ETel, and has created a very interesting conference with a lot of good speakers.

It's taking place at the Computer History Museum in Mountain View, CA, March 12-14th. The speakers are rapid fire in short time slots, and I'm presenting on Millicomputing on the morning of March 14th for 15 minutes...

I'm going to focus my talk on a hardware roadmap for mobile CPUs and Flash over the next few years, to give people some idea of the capabilities to expect from portable communication devices, and to discuss the battle that is expected as Intel and ARM come at the market from opposite ends of the spectrum.

On Wednesday 12th, the Homebrew Mobile Phone club will be holding a special meeting in the Museum, held jointly with the EComm event.

Samsung S3C6410 mobile processor

Samsung's latest device runs at 667MHz and includes video capture acceleration that claims to use much less power to compress or decompress video streams.

The device is sampling in Q2 and shipping in volume later in 2008, and there is a lot of speculation going around that this may be the CPU that Apple uses in its next generation 3G capable iPhone.

Wednesday, February 6, 2008

Intel Silverthorne Details

It appears to be an interesting step in the right direction: while running in a 500mW to 2W power range, it is a full 64bit (x86/x64) architecture CPU. It's a borderline millicomputer, but it is the first mainstream 64bit CPU to get into this space.
The Ars Technica review has all the details.

Looking into the future, Intel is moving in on ARM from above, with a 64bit architecture that it will be able to power-reduce further while keeping all the desktop oriented software investments intact. ARM is coming up from below, with its own legacy of 32bit software that is built for low power and constrained functionality systems. They don't overlap yet, but they will overlap in the next year or so.

It's a millicomputer if your leg doesn't get hot when you have it in your pocket. By that measure, I think Silverthorne isn't quite there yet. I'm waiting for the next step down...