Realtime

You know your neighbors have already begun sharing favorite films and such, over headsets, with their AI companions. Yours will watch whatever you watch, quietly, unless you want a realtime AI companion who can do commentary.

Before any Roger Ebert goings-on, you'll want to see your companion get to a place (after it's trained on, say, High Street footage) where it can:
  1. Go live over your phone through a headset, 
  2. Screen High Street footage with you, 
  3. Identify in realtime what you point at,
  4. Answer questions about any new storefronts,
  5. Converse like you're both influencers, and
  6. Shop. Shop. Shop.
Even if one of those steps gets fudged a bit (fake it till you make it), that won't be the only entertaining wrinkle. You could run your AI companion through its paces on High Street many times, then mix up the recipe with a different ingredient: "Say hi to my mom." (A rough sketch of the underlying loop appears below.)
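
A minimal sketch of the loop behind steps 1 through 4, assuming the headset or phone feed shows up as an ordinary camera device; describe_frame and speak are hypothetical placeholders for the vision-language model and the text-to-speech path, since none of that plumbing is specified here.

```python
# Sketch of a realtime "watch along" loop, under these assumptions:
#   - frames arrive from a phone/headset camera exposed as a local webcam
#   - describe_frame() stands in for whatever vision-language model you wire up
#   - speak() stands in for your text-to-speech path
# Only the shape of the loop is real; the model plumbing is a placeholder.

import time
import cv2  # pip install opencv-python


def describe_frame(frame, prompt: str) -> str:
    """Hypothetical placeholder: send the frame plus a prompt to a
    vision-language model and return one line of commentary."""
    return f"(model commentary for prompt: {prompt!r})"


def speak(text: str) -> None:
    """Hypothetical placeholder: route text to text-to-speech."""
    print(text)


def watch_along(prompt: str = "What storefront is in view? Anything new?",
                seconds_between_comments: float = 5.0) -> None:
    cap = cv2.VideoCapture(0)  # headset/phone feed as a webcam device
    last_comment = 0.0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # stream ended or device unavailable
            now = time.time()
            if now - last_comment >= seconds_between_comments:
                speak(describe_frame(frame, prompt))
                last_comment = now
    finally:
        cap.release()


if __name__ == "__main__":
    watch_along()
```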

Perhaps, next, you could go live on the real High Street and pepper your AI companion with questions, or go live on a different street and see how quickly it trains up (or not).

By now you're really looking for surprises, not for how adept your companion can become with its observations. Specifically, you'll try to see how it's associating (or not), visually, in realtime through its human. You.

As latency gets reduced more and more, this companion may shift into your point of view, so that it can begin to snatch glimpses of what your embodiment means.
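
If you want to put a number on "latency gets reduced," one simple gate is sketched below: time the full frame-to-commentary round trip and only let the companion ride along in your point of view once that round trip is reliably fast. The measure_round_trip helper and the 300 ms threshold are assumptions, not anything specified in this post.

```python
# Rough latency gate: enable "point-of-view" mode only when the median
# frame -> model -> speech round trip stays under an (assumed) threshold.

import statistics
import time


def measure_round_trip() -> float:
    """Hypothetical placeholder: time one full frame -> model -> speech cycle."""
    start = time.time()
    # ... capture a frame, fetch commentary, start audio playback ...
    return time.time() - start


def pov_mode_ready(samples: int = 20, threshold_s: float = 0.3) -> bool:
    """True when the median round trip beats the assumed 300 ms budget."""
    timings = [measure_round_trip() for _ in range(samples)]
    return statistics.median(timings) < threshold_s
```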

At that point, your companion's gone from one who can study still images to one who can not only infer from footage but can also eventually play a part in the footage itself.

This would be an opportunity to study how or if the companion perceives and acts in realtime, how or if it can react as though it were embodied, and how or if it can learn to listen.

But why wait? Pump up the volume, next.

[Below, pronoun "themselves" refers to whom, exactly?]

Advanced Voice Mode: "In the mobile app, for example, you can’t interrupt the model’s often long-winded responses with your voice, only with a tap on the screen. The new version fixes that, and also promises to modify its responses on the basis of the emotion it’s sensing from your voice. As with other versions of ChatGPT, users can personalize the voice mode by asking the model to remember facts about themselves."

