Show HN: Feels Like Paper

www.lukasmoro.com

264 points · MoroL · 9 days ago

"Feels Like Paper!" is a series of prototypes about augmenting physical paper through AI. Various ML models, LLMs and a mixed reality headset are used to infuse physical paper and ink with properties of the digital world without compromising on their physical traits.

83 comments
vunderba · 9 days ago
From the article: Keiichi Matsuda wrote a manifesto called "GODS". In it he describes a metaphor for augmented reality rooted in pagan animism. Unlike monotheistic Western approaches to interfacing with artificial intelligence, like ChatGPT or Siri, he advocates leveraging augmented reality technologies to extend places and objects, populating the world with many different agents, or "gods".

The author should read Daemon by Daniel Suarez (2006), which explores the idea of persistent and potentially powerful AR entities that interact with humans. It also loosely plays with the idea of AR somatic gestures acting as a mystical conduit for "primitive incantations" that have a physical effect on the real world.



repeekad · 9 days ago
The blur effect at the bottom of the page makes me feel like I'm constantly about to hit a paywall, haha


xelxebar · 9 days ago
> The intricate user experience of physical paper is unmatched...

So much this. Our hands have such a disproportionate concentration of nerves compared to the rest of our body; it's a shame current tech is solely focused on visual and audio interaction (with some very minor haptics).

A piece of paper or book has texture, heft, temperature, and stiffness which our hands pick up on and interpret so effortlessly we don't even consciously notice most of the time. I want those information channels in my user experience. Leafing through paper and books has so many nice features: the weight distribution tells you how far along you are; fingers can flip pages or between chapters with high fidelity and high feedback for tracking the context switch; dog-ears or sticky notes encode metadata that's immediately available when needed and hidden otherwise, without having to navigate layers of organization; splaying out multiple pages on a table is effortless compared to manipulating desktop windows; we even subconsciously pick up on non-uniformities in physical layout, which helps with disambiguation, i.e. noise is information.

Don't get me wrong, the interactivity of screens is wonderful, and e-ink does bring one tiny nicety of paper to them, but I think we've barely begun to tap into the possibilities of computer user interfaces.

FWIW, terse languages like APL have the nice property that programming with pen and paper actually feels natural, and you see it happen organically during discussions among array programmers. I think our current programming paradigms may be more constrained by HCI limitations than we realize.
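For readers unfamiliar with array languages, a rough sense of the terseness in question: the classic APL train (+/÷≢) computes a mean, reading as "sum divided by tally". A minimal Python sketch of the same whole-array idea (the APL expression itself is real; the Python spelling is just an illustrative analogue):

```python
# APL's mean, (+/÷≢), is "sum divided by tally": one short
# expression per idea, which is why it survives on pen and paper.
data = [3, 1, 4, 1, 5]
mean = sum(data) / len(data)   # +/ ÷ ≢
print(mean)  # 2.8
```

The point isn't the arithmetic; it's that when every step is a single whole-collection expression, transcribing a program by hand stays tractable.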
