Starting with VR in L&D: a great guide by Peter Faulhaber

Embark on a journey through the landscape of Virtual Reality in Learning and Development (L&D) with Peter Faulhaber’s practical guide. It’s a repository of expert insights, carefully crafted to guide educators and trainers through the nuances of VR, ensuring a transformative learning experience.

Read Peter’s full guide (in Dutch), or in English via machine translation

Thanks, Peter, for your 📣 shout-out to the RelaXR podcast hosted by Tim and me.

An even bigger thank you for being our biggest fan!

Key Insights

My key takeaways from Peter’s guide:

  1. Virtual Reality (VR) in L&D: Peter wrote a practical guide to getting started with VR in Learning and Development, explaining how to create safe, practical 3D learning environments with Virtual Reality.
  2. 3D-model development: He covers how to obtain and modify 3D models, and how to enrich them with interactive elements to enhance the learning experience.
  3. Vendor selection: He explores the choice between specialized VR vendors and full-service training agencies, providing insight into different types of 3D glasses and their hardware specifications.
  4. Implementation and benefits of VR: The guide covers the key considerations for implementing VR in training, as well as the educational and financial benefits of VR-based learning.
  5. Experience VR: Peter emphasizes the importance of experiencing VR firsthand before forming an opinion about it.
  6. XR developments: He touches on developments in the XR market and on the intersection of VR and AI, fuelled by the wealth of insights in learning data.
  7. Involving stakeholders: Peter highlights the importance of involving stakeholders in the journey of using VR for L&D.
  8. Conclusion: Peter concludes that VR is a powerful learning tool with proven added value.

The Rise of Smartglasses with Augmented Reality Lenses

In the ever-evolving landscape of technology, augmented reality (AR) smartglasses represent a significant leap forward, merging the digital and physical worlds in ways previously confined to science fiction. With the advent of multimodal artificial intelligence (AI), these devices are not just a fleeting trend but a transformative tool poised to redefine our daily lives.

This article explores why smartglasses with AR lenses, bolstered by multimodal AI, are heralding a new era of technological integration.

My conviction that smartglasses are the optimal form factor for human-machine interaction was reignited by Google’s teaser of Project Astra.

Continue reading

How Intent-Driven Wearables Are Changing the Future of Computing

Imagine a world where you don’t need to tap, swipe, or type on a screen to get things done. A world where you can simply speak, gesture, or think your intentions and have them fulfilled by a smart device that understands you. A world where you can seamlessly switch between different modes of interaction, depending on your context, preference, and mood.

Sounds like science fiction, right? Well, not anymore.

Welcome to the era of intent-driven computing, where smartphones are giving way to new form factors like wearables and lightweight head-mounted devices that offer a more natural, immersive, and personalized experience.

In this article, we’ll explore how the paradigm shift from smartphones to new form factors is happening right now, and what it means for you. We look at some of the latest buzz around the rabbit r1 with rabbit os, the Humane AI Pin, and Apple Vision Pro, three innovative and disruptive products in this space. We also discuss how these technologies represent a move away from app-driven operating systems towards intent-driven devices, and how this shift is reflected in the public’s growing interest in wearables and head-mounted devices. Finally, we conclude with a strong argument that smart glasses could be the ultimate combination, offering the best of both worlds.

Rabbit r1: a carrot for the intent-driven os

The rabbit r1 is a handheld companion device running rabbit os, with a push-to-talk button you press to tell its Large Action Model (LAM) what you want done. You can ask it to play music, take photos, send messages, book flights, order food, and much more. Rabbit os is an intent-driven operating system that learns from your behavior and preferences to anticipate your needs and offer personalized suggestions. It works with many different platforms thanks to a virtualized backend, so you can seamlessly interact with any of them without losing context or continuity.

Humane AI Pin: your wearable assistant

The Humane AI Pin is a wearable device that looks like a small badge you can clip onto your clothing. It is a smart assistant that can help you with many everyday tasks. It uses natural language processing (NLP) and artificial intelligence (AI) to understand your voice commands and queries. You can use it to access information, entertainment, productivity, health, and wellness services. The Humane AI Pin is designed to respect your privacy, so it only listens when you tap it.

Apple Vision Pro: Spatial Computing era

Apple Vision Pro is a sleek head-mounted device that you can wear almost anywhere. It is a revolutionary device that can transform your vision and perception of reality. It uses advanced optics, sensors, cameras, and processors to create stunning visual effects and experiences. You can use it to access virtual reality (VR) content that transports you to another world or dimension, or mixed reality (MR) content that blends digital elements with your physical environment. You can interact with it using voice, touchpad, or hand gestures. Apple Vision Pro runs on visionOS, an app-driven operating system that offers a rich ecosystem of apps and services for various purposes and domains.

These three products are examples of how the paradigm shift from smartphones to new form factors is happening right now. The rabbit r1 and Humane AI Pin are also examples of how the paradigm shift from app-driven operating systems to intent-driven ones is on its way. App-driven operating systems like Android and iOS are based on the idea that you have to launch an app or service to perform a task or access content. Intent-driven operating systems are based on the idea that you can simply express your intention or goal and have it automatically fulfilled by AI.
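
To make the contrast concrete, here is a minimal, hypothetical sketch in Python. It is not how rabbit os, the Humane AI Pin, or visionOS actually work under the hood; the Intent class, the service names, and the routing logic below are all illustrative assumptions.

```python
# Hypothetical sketch: app-driven vs. intent-driven interaction.
# None of these names come from rabbit os, Humane, or visionOS.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Intent:
    goal: str               # e.g. "order dinner", "book a flight"
    slots: Dict[str, str]   # extracted details such as cuisine or destination


def app_driven(user_choice: str, apps: Dict[str, Callable[[], str]]) -> str:
    # App-driven model: the user must know and launch the right app themselves.
    return apps[user_choice]()


def intent_driven(intent: Intent, services: Dict[str, Callable[[Intent], str]]) -> str:
    # Intent-driven model: the system resolves which service fulfils the stated goal.
    handler = services.get(intent.goal, lambda i: f"No service found for '{i.goal}'")
    return handler(intent)


if __name__ == "__main__":
    services = {
        "order dinner": lambda i: f"Ordering {i.slots.get('cuisine', 'food')}...",
        "book a flight": lambda i: f"Booking a flight to {i.slots.get('to', 'somewhere')}...",
    }
    print(intent_driven(Intent("order dinner", {"cuisine": "sushi"}), services))
    print(intent_driven(Intent("walk my dog", {}), services))
```

The point is the inversion of responsibility: in the app-driven model you do the routing; in the intent-driven model the AI does.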

So in closing, the future of computing points towards AI-powered, intent-driven operating systems running on ever-smaller wearables.

Will it ultimately replace your smartphone?
Most likely, just not in the next two years.

But in 2030? Who knows. Or better put: whose nose?

How my reflective pause mirrors the Metaverse’s evolution

Dear readers,

As many of you know, the past few months have been a journey of reflection and growth for me, both personally and professionally. Just as I found myself at a crossroads, taking a pause to embrace new beginnings, the metaverse, too, is undergoing a transformative evolution.

Interplay of Technologies

My pause was an opportunity to realign with my core interests and the ever-evolving tech landscape. During this break, I delved deeper into the interplay between the metaverse, artificial intelligence, and other emerging technologies. The shift in the industry’s focus from VR/AR projects to Generative AI (GenAI) caught my attention. It became evident that GenAI is not just a fleeting trend but a cornerstone for the mainstream metaverse we all envision.

Evolution Mirrored

Much like my reflective pause, this industry transition is a harbinger of exciting possibilities. The pivot towards GenAI, particularly in augmenting the reality of the metaverse, resonates with my own journey of exploration and innovation. It’s a testament to the limitless potential awaiting us as we venture into uncharted territories, be it in our personal lives or the digital realms we frequent.

Unfolding a New Era

The metaverse, poised at the brink of a new era, is unfolding through a blend of augmented reality (AR) and GenAI. This synergy is not a deviation but a progression towards a more immersive and interactive digital universe. The metaverse will be built on mixed reality (MR), where AR enhances our reality, and GenAI fuels the creation of new worlds and experiences.

Resuming the Adventure

The industry’s evolution reflects my own—a pause, a reflection, and an enthusiastic stride towards new horizons. As I resume my regular postings as MetaMike, I am thrilled to share my insights on these developments. Together, we’ll delve into the fascinating realm of GenAI and its pivotal role in sculpting the metaverse’s future.

I’m back, reinvigorated, and ready to explore the infinite intersections of technology with you all. Thank you for being part of this exciting journey. The adventure continues here and I’m elated to have you alongside.

Warm regards,

MetaMike

Generative AI paves the way for mainstream metaverses

All big tech companies except Apple are laying off tens of thousands of staff; most are pivoting away from VR/AR development towards Generative AI (GenAI) such as ChatGPT. However, GenAI is a key ingredient for mainstream metaverses, which I expect to be built on augmented rather than virtual reality.

Key points in this post

  • Big tech companies shifting focus from VR/AR projects to generative AI
  • Generative AI is actually a key ingredient for (m)any mainstream metaverse
  • These metaverses are expected to be visually mainly built on AR tech
  • The (r)evolution towards GenAI is driven by the industry’s need to create new experiences for the “Next Big Thing” as its next cash cow or powerhouse
  • Tech companies changing direction should not be seen as a setback for the metaverse, but as a sign of the industry’s evolution towards AR combined with generative AI, opening up new and exciting possibilities
Continue reading

Metaverses: AI & AR

How wondrous is the metaverse, a realm of endless possibility
A place where you can shape and share your own reality
Perhaps you have already glimpsed some of its wonders
Powerful tools that let you interact and immerse yourself in others
You may have met AI, the mind that’ll guide the metaverse

Ever curious about the secrets of AI in the metaverse?
Artificial Intelligence is the skill of machines to learn and think
Some examples of AI in the metaverse are chatbots, avatars and agents
They can help, amuse but also trick you with their intelligence
Reality has changed forever, because of AI

MetaMarch Insights: 16 metaverses in the MetaPlayers Quadrant (MPQ2023Q1)

The metaverse is the next frontier of digital experiences, where people can interact, create, play and explore in immersive digital worlds. But not all metaverses are created equal. Some are more open and decentralized. Others are more focused on gaming or socializing. And some are more accessible and scalable. How can we make sense of this diversity and complexity? That’s where the MetaPlayers Quadrant comes in.

The MetaPlayers Quadrant (MPQ) is my recent framework to help you understand and navigate the metaverse landscape. It divides metaverses into four categories based on two dimensions: product versus platform, and (de)centralization.
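
For readers who like to see the idea in code, here is a minimal sketch, assuming each metaverse is scored on the two axes on a scale from -1 to +1. The scale, the example names, and the generic quadrant labels are my illustrative assumptions, not part of the published MPQ definitions.

```python
# Hypothetical sketch: placing metaverses on the two MPQ axes.
from dataclasses import dataclass


@dataclass
class Metaverse:
    name: str
    platformization: float   # -1.0 = pure product, +1.0 = open platform (assumed scale)
    decentralization: float  # -1.0 = fully centralized, +1.0 = fully decentralized


def quadrant(m: Metaverse) -> str:
    # The sign of each score decides which half of each axis the metaverse falls in.
    horizontal = "platform" if m.platformization >= 0 else "product"
    vertical = "decentralized" if m.decentralization >= 0 else "centralized"
    return f"{vertical} {horizontal}"


if __name__ == "__main__":
    examples = [
        Metaverse("Example World A", 0.7, -0.4),   # open platform run by one company
        Metaverse("Example World B", -0.5, 0.8),   # self-contained product, community-owned
    ]
    for m in examples:
        print(f"{m.name}: {quadrant(m)}")
```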

In this very first implementation of the MPQ framework, we’ll dive deeper into the 16 metaverses that our researchers uncovered in the first quarter of this year. Hence the MPQ-2023Q1 edition, more popularly known as MetaMarch 2023.

Before we explore each one in more detail, a quick recap about the two axes.

Continue reading

Introducing the MetaPlayers Quadrant: a game-changing framework for metaverses

Are you curious about the metaverse and how to navigate its diverse and evolving (virtual) landscape? If so, you might want to check out this launch post that introduces the MetaPlayers Quadrant or MPQ for short: a simple but powerful framework that can help you understand, compare and plot distinct types of metaverse experiences.

The MetaPlayers Quadrant (MPQ) framework by Mike van Zandwijk

Key points in this launch post

  • The MPQ is based on two dimensions: (de)centralization on one axis and productizing versus “platformization” on the other axis
  • Platformization measures the degree of openness and modularity that a metaverse offers to its creators and users. A metaverse product is more like its own (often true 3D) planet while a platform is more like a (2.5D) galaxy.
  • Centralization measures the degree of control and ownership that the creators and users of a metaverse have over its content, rules and economy.
  • By combining these two dimensions, there are four quadrants that represent various kinds of metaverse experiences:
Continue reading

Alive and (busy) kicking!

Good day my friends,

You might be wondering when my (bi)weekly shares and monthly articles are coming again. Just a quick update that I am currently focused on a particular assignment which involves tons of research. Most of it is client-specific, so I cannot and will not share those findings. Some of it is more generic yet jaw-dropping, like The Mukaab, a real-world Metaverse City if you will.

Due to client projects and personal time off for a holiday and my kids’ birthdays, I will share these generic findings in an article around April 2023.

Want more sooner? Feel free to follow or connect with me on LinkedIn where I am already sharing many developments by (re)posting and liking interesting news daily.

Have a wonderful Meta March!