
How Intent-Driven Wearables Are Changing the Future of Computing

Imagine a world where you don’t need to tap, swipe, or type on a screen to get things done. A world where you can simply speak, gesture, or think your intentions and have them fulfilled by a smart device that understands you. A world where you can seamlessly switch between different modes of interaction, depending on your context, preference, and mood.

Sounds like science fiction, right? Well, not anymore.

Welcome to the era of intent-driven computing, where smartphones are giving way to new form factors like wearables and lightweight head-mounted devices that offer a more natural, immersive, and personalized experience.

In this article, we’ll explore how the paradigm shift from smartphones to new form factors is happening right now, and what it means for you. We’ll look at some of the latest buzz around the rabbit r1 with rabbit os, the Humane AI Pin, and Apple Vision Pro, three innovative and disruptive products in this space. We’ll also discuss how these technologies represent a move away from app-driven operating systems towards intent-driven devices, and how this shift is reflected in the public’s growing interest in wearables and head-mounted devices. Finally, we conclude with a strong argument that smart glasses, combining an intent-driven AI assistant with immersive visuals, could be the ultimate mix, offering the best of both worlds.

Rabbit r1: a carrot for the intent-driven os

The rabbit r1 is a handheld companion device with a push-to-talk button that lets you tell its Large Action Model (LAM) what you want it to do. You can ask it to play music, take photos, send messages, book flights, order food, and much more. The rabbit r1 runs on rabbit os, an intent-driven operating system that learns from your behavior and preferences to anticipate your needs and offer personalized suggestions. Rabbit os works across many platforms thanks to a virtualized backend, so you can seamlessly interact with any of them without losing context or continuity.

Humane AI Pin: your wearable assistant

The Humane AI Pin is a wearable device that looks like a small badge you can clip onto your clothing. It acts as a smart assistant for everyday tasks, using natural language processing (NLP) and artificial intelligence (AI) to understand your voice commands and queries. You can use it to access information, entertainment, productivity, health, and wellness services. The Humane AI Pin is designed to respect your privacy: it only listens when you tap it.

Apple Vision Pro: the Spatial Computing era

Apple Vision Pro is a head-mounted device: a sleek headset rather than a pair of glasses, but one you can wear almost anywhere. It can transform your vision and perception of reality, using advanced optics, sensors, cameras, and processors to create stunning visual effects and experiences. You can use it to access virtual reality (VR) content that transports you to another world or dimension, or mixed reality (MR) content that blends digital elements with your physical environment. You interact with it using your eyes, hands, and voice. Apple Vision Pro runs on visionOS, an app-driven operating system that offers a rich ecosystem of apps and services for various purposes and domains.

These three products are examples of how the paradigm shift from smartphones to new form factors is happening right now. The rabbit r1 and Humane AI Pin are also examples of how the paradigm shift from app-driven operating systems to intent-driven ones is on its way. App-driven operating systems like Android and iOS are based on the idea that you have to launch an app or service to perform a task or access content. Intent-driven operating systems are based on the idea that you can simply express your intention or goal and have it automatically fulfilled by AI.
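To make that contrast concrete, here is a minimal Python sketch. It is purely illustrative and not based on rabbit os, visionOS, or any real Large Action Model: the handler functions and keyword matching are hypothetical stand-ins for how an intent-driven layer might turn a free-form request into an action, while an app-driven flow still requires you to pick the right app yourself.

```python
# Purely illustrative sketch: an intent-driven layer maps a free-form request
# to an action, while an app-driven flow makes the user pick the app first.
# Simple keyword matching stands in for what a real Large Action Model would do.

def book_flight(request: str) -> str:
    return f"Booking a flight based on: '{request}'"

def order_food(request: str) -> str:
    return f"Ordering food based on: '{request}'"

def play_music(request: str) -> str:
    return f"Playing music based on: '{request}'"

# Hypothetical intent registry; a real system would use a trained model,
# not keywords, to decide which action fulfils the user's intent.
INTENT_HANDLERS = {
    ("flight", "fly", "travel"): book_flight,
    ("food", "pizza", "dinner"): order_food,
    ("music", "song", "playlist"): play_music,
}

def fulfil_intent(request: str) -> str:
    """Intent-driven: the user states a goal; the system picks the action."""
    text = request.lower()
    for keywords, handler in INTENT_HANDLERS.items():
        if any(word in text for word in keywords):
            return handler(request)
    return "Sorry, I could not work out what you meant."

def app_driven(app_name: str, request: str) -> str:
    """App-driven: the user must know and name the right app up front."""
    apps = {"FlightApp": book_flight, "FoodApp": order_food, "MusicApp": play_music}
    return apps[app_name](request)

if __name__ == "__main__":
    # Intent-driven: one natural request, no app selection needed.
    print(fulfil_intent("Book me a flight to Lisbon next Friday"))
    # App-driven: the same task requires choosing the app explicitly.
    print(app_driven("FlightApp", "Lisbon next Friday"))
```

The design point is simply that the routing decision moves from the user (choosing an app) to the system (resolving an intent); everything else about the underlying services stays the same.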

So, in closing, the future of computing points toward a surge of AI-powered, intent-driven operating systems running on ever-smaller wearables.

Will these devices ultimately replace your smartphone?
Most likely, just not in the next two years.

But in 2030? Who knows. Or better put: whose nose

How my reflective pause mirrors the Metaverse’s evolution

Dear readers,

As many of you know, the past few months have been a journey of reflection and growth for me, both personally and professionally. Just as I found myself at a crossroads, taking a pause to embrace new beginnings, the metaverse, too, is undergoing a transformative evolution.

Interplay of Technologies

My pause was an opportunity to realign with my core interests and the ever-evolving tech landscape. During this break, I delved deeper into the interplay between the metaverse, artificial intelligence, and other emerging technologies. The shift in the industry’s focus from VR/AR projects to Generative AI (GenAI) caught my attention. It became evident that GenAI is not just a fleeting trend but a cornerstone for the mainstream metaverse we all envision.

Evolution Mirrored

Much like my reflective pause, this industry transition is a harbinger of exciting possibilities. The pivot towards GenAI, particularly in augmenting the reality of the metaverse, resonates with my own journey of exploration and innovation. It’s a testament to the limitless potential awaiting us as we venture into uncharted territories, be it in our personal lives or the digital realms we frequent.

Unfolding a New Era

The metaverse, poised at the brink of a new era, is unfolding through a blend of augmented reality (AR) and GenAI. This synergy is not a deviation but a progression towards a more immersive and interactive digital universe. The metaverse will be built on mixed reality (MR), where AR enhances our reality, and GenAI fuels the creation of new worlds and experiences.

Resuming the Adventure

The industry’s evolution reflects my own—a pause, a reflection, and an enthusiastic stride towards new horizons. As I resume my regular postings as MetaMike, I am thrilled to share my insights on these developments. Together, we’ll delve into the fascinating realm of GenAI and its pivotal role in sculpting the metaverse’s future.

I’m back, reinvigorated, and ready to explore the infinite intersections of technology with you all. Thank you for being part of this exciting journey. The adventure continues here and I’m elated to have you alongside.

Warm regards,

MetaMike

Generative AI paves the way forward for mainstream metaverses

Big tech companies, all except Apple, are laying off tens of thousands of staff; most are pivoting away from VR/AR development towards Generative AI (GenAI) tools like ChatGPT. However, GenAI is a key ingredient for mainstream metaverses, which I expect to be built on augmented — rather than virtual — reality.

Key points in this post

  • Big tech companies shifting focus from VR/AR projects to generative AI
  • Generative AI is actually a key ingredient for (m)any mainstream metaverse
  • These metaverses are expected to be visually mainly built on AR tech
  • The (r)evolution towards GenAI is driven by the industry’s need to create new experiences for the “Next Big Thing” as a new cash cow or powerhouse
  • Tech companies changing direction should not be seen as a setback for the metaverse, but as a sign of the industry’s evolution towards AR combined with generative AI, and of the potential for new and exciting possibilities
Continue reading

Metaverses: AI & AR

How wondrous is the metaverse, a realm of endless possibility
A place where you can shape and share your own reality
Perhaps you have already glimpsed some of its wonders
Powerful tools that let you interact and immerse yourself in others
You may have met AI, the mind that’ll guide the metaverse

Ever curious about the secrets of AI in the metaverse?
Artificial Intelligence is the skill of machines to learn and think
Some examples of AI in the metaverse are chatbots, avatars and agents
They can help, amuse but also trick you with their intelligence
Reality has changed forever, because of AI

MetaMarch Insights: 16 metaverses in the MetaPlayers Quadrant (MPQ2023Q1)

The metaverse is the next frontier of digital experiences, where people can interact, create, play and explore in immersive digital worlds. But not all metaverses are created equal. Some are more open and decentralized. Others are more focused on gaming or socializing. And some are more accessible and scalable. How can we make sense of this diversity and complexity? That’s where the MetaPlayers Quadrant comes in.

The MetaPlayers Quadrant (MPQ) is my recent framework to help you understand and navigate the metaverse landscape. It divides that landscape into four quadrants, based on two dimensions: product versus platform, and (de)centralization.
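As a toy illustration (not part of any official MPQ tooling), you can think of the two axes as scores: rate a metaverse on (de)centralization and on product-versus-platform, and the sign of each score drops it into one of four quadrants. The Python sketch below uses made-up names and numbers purely to show the mechanics of the framework.

```python
# Illustrative sketch of the two MPQ axes: each metaverse gets a score from
# -1.0 to 1.0 on (de)centralization and on product-vs-platform; the sign of
# each score places it in one of four quadrants. All scores here are made up.

from dataclasses import dataclass

@dataclass
class MetaverseScore:
    name: str
    decentralization: float  # -1.0 = fully centralized, 1.0 = fully decentralized
    platformization: float   # -1.0 = closed product, 1.0 = open platform

def quadrant(m: MetaverseScore) -> str:
    """Map the two axis scores to a descriptive quadrant label."""
    axis1 = "decentralized" if m.decentralization >= 0 else "centralized"
    axis2 = "platform" if m.platformization >= 0 else "product"
    return f"{axis1} {axis2}"

if __name__ == "__main__":
    # Hypothetical example scores, purely for illustration.
    examples = [
        MetaverseScore("ExampleWorld A", decentralization=0.7, platformization=0.6),
        MetaverseScore("ExampleWorld B", decentralization=-0.8, platformization=-0.4),
    ]
    for m in examples:
        print(f"{m.name}: {quadrant(m)}")
```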

In this very first implementation of the MPQ framework, we’ll dive deeper into the 16 metaverses that our researchers uncovered in the first quarter of this year. Hence the MPQ2023Q1 edition, more popularly known as MetaMarch 2023.

Before we explore each one in more detail, here is a quick recap of the two axes.

Continue reading

Introducing the MetaPlayers Quadrant: a game-changing framework for metaverses

Are you curious about the metaverse and how to navigate its diverse and evolving (virtual) landscape? If so, you might want to check out this launch post that introduces the MetaPlayers Quadrant or MPQ for short: a simple but powerful framework that can help you understand, compare and plot distinct types of metaverse experiences.

The MetaPlayers Quadrant (MPQ) framework by Mike van Zandwijk

Key points in this launch post

  • The MPQ is based on two dimensions: (de)centralization on one axis and productizing versus “platformization” on the other axis
  • Platformization measures the degree of openness and modularity that a metaverse offers to its creators and users. A metaverse product is more like its own (often true 3D) planet while a platform is more like a (2.5D) galaxy.
  • Centralization measures the degree of control and ownership that the creators and users of a metaverse have over its content, rules and economy.
  • By combining these two dimensions, we get four quadrants that represent various kinds of metaverse experiences:
Continue reading

Alive and (busy) kicking!

Good day my friends,

You might be wondering when my (bi)weekly shares and monthly articles are coming back. Just a quick update: I am currently focused on a particular assignment that involves tons of research. Most of it is client-specific, so I cannot and will not share those findings. Some of it is more generic yet jaw-dropping, like The Mukaab, a real-world metaverse city if you will.

Due to client projects and some personal time off for the holidays and my kids’ birthdays, I will share these generic findings in an article around April 2023.

Want more sooner? Feel free to follow or connect with me on LinkedIn where I am already sharing many developments by (re)posting and liking interesting news daily.

Have a wonderful Meta March!

Happy New Metayear!

Happy New Metayear to my friends, fans and followers,

As we say goodbye to the old year and welcome in the new, I take a brief moment to reflect on some exciting developments and opportunities that await us in the year ahead.

I’m optimistic about the future and believe that with joint effort and determination, we can create a better world for ourselves and those around us. And with the public availability of ChatGPT, I’m even more excited about the potential for AI to enhance our lives and the multitudes of metaverses.

Here are a few things we are looking forward to in the coming “Metayear”:

  1. More wearable companions to smartphones (Modular Mobility)
  2. Apple releasing their very first metaverse device running on xrOS
  3. A shift towards more AR products (a first sign of Reversed Virtuality)

So here’s to a happy, healthy, and successful new year to all! May this new Metayear bring you joy, prosperity, and all that you wish for in any metaverse.

Wishing you all the best for 2023,
MetaMike van Zandwijk

PS. My family and I are enjoying the Christmas holidays, so expect little to no posts from me this week. Going forward, I will also switch to a weekly in-depth article rather than short posts every other day.

The Triple Paradigm Metashift

Three megatrends are imminent. Each one triggers the next paradigm metashift: a fundamental change in how we think about the metaverse and see our world. Literally.

Megatrends leading to Triple Paradigm Metashift

  • The first megatrend is the relocation of key functions from all-in-one mobile phones to mobile hubs with a wide range of wearable companions:
    • Wireless earbuds have replaced your phone’s mic and speakers (speech)
    • Smartwatches are taking over phone controls (touch and gestures)
    • Smart (AR) eyewear might soon substitute your phone’s screen (sight)
  • This current megatrend, Modular Mobility, lets us easily superimpose anything digital onto our real world with small-form-factor AR overlays
  • The rapid growth of Augmented Reality in the real world, instead of VR in simulated worlds, leads to the second megatrend: Reverse Virtuality
  • Reverse Virtuality (RV) is the new mindset that the metaverse is not merely an online 3D world for VR gear, but the real-world application of truly Mixed Reality in our AR-first physical world.
  • These megatrends bring both big benefits and challenges that push us, in phases, towards the third megatrend, ultimately leading to Societal Metamorphosis
    • First phase: Digital Disorder, bad actors abusing disruptive meta-tech
    • Next phase: Digital Renaissance, tech-for-good tackling global crises
    • Final phase: Digital Singularity, irrevocable transformation of society
  • The good news? A Digital Singularity would be so much better for human civilization than a Technological Singularity. The bad? It’ll be tough at first.

Our world is on the brink of another technological revolution.

As we stand atop the steep slope of the S-Curve Pattern of Innovation, we can’t help but wonder what the future holds. Following the (mobile) internet revolution with the rise of the web and smartphones, it’s difficult to predict what lies ahead.

Continue reading