Project Able
For the past eight years, an idea has been rattling around in the back of my head. It concerns the future of computing—the core properties, the shape it should take, the experiences it should enable.
I've tried to shake it, but it keeps coming back.
At Amazon, my work on Echo and Alexa planted the seeds of the idea. Back then, it was limited to wearables; the idea was to physically sense a world-behind-the-world. I actually made a little progress on it, as the horribly shot video below shows.
My work at Oracle shaped the way I think about platforms—how the layers above are enabled by the layers below, and how Identity forms the backbone of any platform.
At Meta, my work in both Ads and Developer Platform kept pushing it to the front of my mind, shaping my thoughts on the role of privacy in tomorrow's platforms, and on the specific shape the next generation of computing experiences might take given broader market forces.
My work at AudioEye has further shaped those thoughts, making me wonder what the "ultimate accessible experience" is, and how it can be enabled by default for all people.
But I haven't yet been able to draw a box around it all. I haven't been able to pull it together into a singular vision of the future that the average person (or even the average engineer) might understand or get excited about.
After some thought, I've decided it's time to put pen to paper, to finally try.
I'm calling this effort Project Able, for reasons I'll explain in a future update. I don't have much more to share for now. But I needed to publish something publicly, immediately, to help hold myself accountable.
Here we go.