Every year I like to write a detailed report about all the major news from Oculus/Facebook/Meta Connect. This year I want to do something a bit different: instead of listing all the announcements, I want to give you a detailed commentary on the event from a technical strategy standpoint. This is part of what I do for a living: in companies, I'm not only a developer or a lead of developers, but I also work on technical roadmaps for products. So in this article, I want to tell you the major lessons to take away from Meta Connect 2024 and that you should take into consideration for the future of your career or your business. At the end of the article, I will also provide some links with a list of all the announcements made there, so you can be sure not to miss anything from this major event.
Orion shows us the future of AR
The most important moment of Meta Connect 2024, the one everyone is talking about, was the reveal of the Orion AR glasses. Zuckerberg showed these prototype AR glasses that look like bulky standard glasses and provide see-through augmented reality with a quality that is not available in any other commercial AR product on the market. This news created a lot of buzz and also a lot of discussion in tech circles: some people said it's just smoke, while others are excited and say that AR is finally coming. As usual, the truth lies somewhere in the middle.
The good thing about Orion is that it finally shows true AR. For the first time, a major company has shown AR glasses that look ok-ish to wear in public and have good technical features (good FOV, hand tracking, etc…). It's a landmark moment for our space: after so much dreaming about what AR glasses could one day be, we finally have a peek at a true first version of them. It's like starting to see the light at the end of the tunnel. Someone compared this to an iPhone moment… I don't think the comparison is correct, because the iPhone was a product, while this is a prototype. But maybe we can compare it to the moment Palmer Luckey launched the Kickstarter of the Oculus DK1: it's the start of something new. It's an exciting moment for the whole ecosystem, and I can feel that something has changed in our space.
I think the comparison with the early Oculus DK1 prototype holds in many ways. This is a prototype with a lot of future potential and can be the beginning of a new technological course (please, let's not find a new name for this… I've had enough after VR, metaverse, spatial computing…). In other ways it doesn't hold: this is the product of a big corporation, for instance, and it also required a lot of money and time to be developed. Anyway, as Bernard Yee (from Windup Minds) pointed out to me on LinkedIn, if we consider this the DK1 of AR glasses and draw a parallel with VR, where going from the DK1 to the beginning of mainstream distribution with the Quest 2 took almost 10 years, we can imagine that mainstream AR glasses may still need another 10 years to happen.
I know, it's a stretched comparison: Meta is bigger than Oculus was, the ecosystem now is more mature, AI can speed up all developments, etc… But it's still a realistic timeframe: Michael Abrash talked a few years ago about lightweight standalone AR glasses for everyday usage arriving around 2035-2040. And the rumors talk about Meta releasing its first consumer AR glasses around 2027. Zuckerberg highlighted that the next device will be directed at consumers and not be a devkit, but he also said that he expects the cost to be between a laptop and a smartphone… so maybe we can imagine a price of around $1200. At that price, and with the limited content available, I doubt that many people will buy it. This leads us to a timeline for consumer adoption that extends beyond 2030.
Orion has been very good at showing us what AR will look like in a few years. It's a great thing to be excited about, but the practical consequences for all of us in the short term are close to zero. I honestly think it shouldn't change much in the roadmap that any tech-savvy XR person had already sketched before this announcement. But on the other hand, the good news is that it probably put AR on the radar of some people who were not thinking about it, which leads us to the next point.
Orion’s positive indirect effects
Orion is a very expensive device: manufacturing one costs something in the range of $10,000-20,000, not to mention all the yield problems you may have with its experimental components. And from the early tests, it also seems to have its share of issues: some people who tried it reported bugs and crashes. It simply cannot go on sale. That's why Meta's detractors said that Orion is just smoke and mirrors, a show for the investors and the press. Some people compared it to the Half Dome prototypes that were shown but never came to the market (RIP).
I think these people do not consider one small huge detail: Orion was worn by Mark Zuckerberg on the stage of the event. You may love him or hate him, but Mark Zuckerberg is one of the most powerful people in the technology world at this moment. And he went on stage and basically said: “You see, our mission is AR, we invested lots of money in it, we reached a first amazing result, and we will launch a product about it soon-ish. Ah, and see how cool I look with these things on my face”. How can you not say that this is a super important event?
If you are still not convinced, remember what happened a couple of years ago when the same person said “You know what? I like the metaverse, so I will rename my company to Meta”. The whole tech world was shaken, and we XR people suddenly became rock stars. We had so many job requests in that period that we could spend money on blackjack and hookers and still pay our bills. What Zuckerberg does and says matters, period. He never dedicated much attention to the Half Dome prototypes, which were more a way to show that Meta was doing R&D, but he proudly wore Orion, and that marks a stark difference between the two.
I'm pretty sure that Meta's announcement of Orion will have ripple effects in our ecosystem, not as big as those of the Meta renaming, but still noticeable. Meta is showing that it is relentlessly investing in XR, notwithstanding the huge losses and the criticism. And now it has shown something tangible about the “metaverse”, something people (and especially investors) can understand. It will give people in the technology world the impression we are really getting there. A few days before Meta Connect, Snap showed the same attitude: the 5th edition of the Spectacles is still an expensive devkit, but Snap keeps evolving them… and again, Snap's CEO put them on stage as well. Apple has already committed to doing AR and made clear that the endgame is AR glasses. To me it's clear that at this point AR is inevitable… it will take time, but it's coming. And many people are probably realising this: more investors will start considering XR again. Major companies will put AR back in their roadmaps. And so on. They will understand it's a long-term bet, so don't expect a huge influx of money tomorrow, but at least this technology will be back on their horizon of things to consider for the future.
Furthermore, the showcase of such a device by Meta has surely created some anxiety among its competitors: Apple, Google, etc… may act cool on the outside, but on the inside they are surely trying to get information about Orion and working hard to build something better. This announcement will probably speed up or modify the internal AR glasses roadmaps of some big tech companies.
So the reveal of Orion will not have short-term direct effects, but I’m sure it will create medium-term ripple effects.
A positive moment for XR
Step by step, XR is rising again from its last fall. I see optimism slowly growing in the ecosystem, and new opportunities slowly arriving.
Yesterday I was reading an article from Bloomberg that highlights how much Meta's stock value has risen in the last two years. The article was titled “Zuckerberg's metaverse gamble pays off with $201 billion fortune”, which is in stark contrast with all the previous articles mocking him and the crazy amounts Meta is spending on XR. Thanks to its futuristic investments in XR and AI, together with the layoffs of the “year of efficiency”, Meta's stock value is so high that Zuck is now the 4th richest person in the world. Bloomberg is an esteemed publication in business circles, so I guess this news will have ripple effects, too.
The perception of XR keeps improving, and in fact, as I've already reported a few times, the famous VC fund a16z is now actively investing in XR companies. Be careful not to be too optimistic, though: the situation in our space is still quite cold, and finding investors is still difficult. But I see it slowly getting warmer, and all these recent announcements are surely going to help raise the temperature. So if you are looking for investments for your XR product, keep doing it, and also add some references to Meta Connect in your pitch. But always remember the long timeframes for the adoption of XR we talked about above.
Smartglasses are on the rise
If AR glasses like Orion are the moonshot, what are the products that are more accessible now? The answer is smartglasses. Ray-Ban Meta is the gadget of the moment, and I've heard rumors of a couple million units sold. The reason for this success is that the device… well, it's first of all a pair of sunglasses and then a device… is cool to wear. And it offers only a few features, but it delivers them well. Everyone I know who has a pair of Ray-Ban Meta is very happy with them, and this is a sign of the potential of the product. Meta is now also selling them with transition lenses, so people can use them not only outdoors, but also indoors. The plan is to make people keep these glasses on all the time, a bit like people do with Apple AirPods.
Ray-Ban Meta was not the first pair of smartglasses out there, but it is the first device that is truly consumer-oriented, fashionable, useful, and also distributed everywhere thanks to the distribution network of EssilorLuxottica. Meta made a genius move in partnering with Luxottica, and in fact, the partnership has been extended until 2030. This means we should expect more good, fashionable smartglasses for the next 6 years: considering the usual life cycle of these devices, this may mean 2-3 versions of the product. 2030 is also a time when Meta may be on its 2nd iteration of consumer AR glasses, and I wonder if Luxottica may already be part of their design. Maybe in 2030, we may have fashionable AR glasses, who knows. Zuckerberg said in his interview with Alex Heath that he's a great believer in Luxottica, so I wouldn't find it strange if this partnership extended even beyond 2030.
Other companies have understood the potential of smartglasses, and in fact, Google tried to partner with Luxottica as well, and Apple is rumored to be working on smartglasses, too. It's a sector I'm following with much interest, because it's something that can become popular in the short term, maybe a few years. I do not expect them to be “popular” in the sense of billions of units, but dozens of millions, yes. I expect that in a few years, it won't be strange to see people in the street wearing Ray-Ban or Apple smartglasses.
I think a defining moment will be when we can develop applications for these smartglasses. For now, they are purely consumer devices, with only Meta providing features for them. But if they become really popular, it would make sense for Meta to create an ecosystem around them and provide an SDK, maybe to be integrated into standard Android apps. You could develop your Android app and add extra features if the user has Meta smartglasses. That is the moment I'm waiting for, because I think this may represent an opportunity for all of us developers, a new and interesting market to enter.
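To make the idea more concrete, here is a purely hypothetical sketch (in Kotlin, since that is what Meta's mobile tooling is built around) of what “extra features if the user has Meta smartglasses” could look like from an app developer's perspective. No such SDK exists today, so every class and method name below is invented for illustration only:

```kotlin
// Purely hypothetical sketch: no Meta smartglasses SDK exists at the time of writing.
// "GlassesCompanion" and its methods are invented names, just to illustrate the idea of a
// standard Android app that lights up extra features when a pair of glasses is connected.

interface GlassesCompanion {
    fun isPaired(): Boolean
    fun speak(text: String)                          // read a message aloud in the glasses' speakers
    fun captureImage(onResult: (ByteArray) -> Unit)  // grab a photo from the glasses' camera
}

class RecipeViewModel(private val glasses: GlassesCompanion?) {

    // The app works normally on any phone...
    fun showRecipe(steps: List<String>) {
        steps.forEach { println(it) }

        // ...but if hypothetical smartglasses are paired, it adds a hands-free mode.
        if (glasses?.isPaired() == true) {
            glasses.speak("Recipe loaded. Say 'next step' when you are ready.")
        }
    }
}
```

The point is not the specific API shape, but that the glasses would become just another optional capability an Android app can detect and use, a bit like apps already do with smartwatches.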
Smartglasses and the races for AR and AI
Smartglasses are also a very important vehicle for the current races towards mainstream AR and AI that the big tech companies are taking part in.
Both implications are easy to grasp. If smartglasses get more popular, the people wearing them will more easily transition to AR glasses. These people will already be used to having glasses on their heads, they will already understand their utility, and they will probably also have come to terms with the concerns of wearing cameras on their faces. So the more people wear smartglasses, the easier the adoption of AR glasses will be. Of course, the brand that sells the most smartglasses will have an advantage, because its customers may have developed loyalty, but the market is so early that this is not a guarantee.
Regarding AI, the fact that these smartglasses are all now connected to AI will make a big difference. Meta announced LLAMA 3.2, a multimodal AI model: it can work not only with text, but also with voice, images, and videos. People using Meta glasses will get used to using Meta AI for everything they do, because they have an assistant always available. They do not even have to take the smartphone out of their pocket: they just have to say “Hey Meta” and they can ask any question about what they have in front of them. I'm pretty sure that people using smartglasses will use AI quite a lot, because it is just too handy. The advantage for Meta is twofold: first of all, many more people will use its Meta AI product. Then, many people using its product means a lot of new material to train its multimodal AI. Ray-Ban Meta can now read text and QR codes, it can remember things… this means that the AI you use will train itself on the texts you read, the images you see, and so on. Meta has the great opportunity of training LLAMA and Meta AI on real data from people actually using it every day. If Meta's and Apple's AR glasses came out on the same day 3 years from now, Meta's would have a much better AI assistant, because it would have been trained with 3 years of real data from people wearing glasses on their faces. This may also mean that Meta AI APIs may work better for XR use cases in applications made by developers for glasses. This is a huge advantage, in my opinion.
Quest 3S and the increase in VR sales
Quest 3S (I use the uppercase S to make my friend David Heaney happy) was the first announcement of Meta Connect, but far from the most important. It still had its space, though, and Zuckerberg presented it as a very important product in Meta's lineup.
Meta Quest 3S is a headset that mixes the horsepower and the mixed reality capabilities of the Quest 3 with the visuals of the Quest 2. It costs only $299, and it lets people play a good catalog of games. This device will probably be quite successful: the Holiday season is close, and people will want to buy it as a Christmas gift, especially because amazing games like Batman and Alien are coming soon for it. Kids will love it: it is a new device to have fun, socialize, and play Roblox.
In my opinion, Quest 3S may sell some tens of millions of units over its lifetime. If I had to make a guess, I would say maybe 20-30M like the Quest 2, over the next 2-3 years. This would further increase the number of people in VR, with a clear benefit for all the companies selling content on the Meta Horizon (a.k.a. Quest) Store. I don't think it's going to break the market by selling 100M units, but it is part of that step-by-step advancement of XR I was telling you about before. I'm very happy it launched because it will bring more people in, and this will bring overall more positivity around XR. Talking about business, the Quest 3S is the headset that lets you increase the market numbers for the next two years in the slides of your startup's pitch deck.
Mixed Reality and Spatial Computing overtake Virtual Reality
I keep talking about VR, but the trend at Meta, and in the market in general, has clearly shifted from VR towards MR and AR. And also from pure gaming to a more general “spatial computing”.
Zuck announced the Quest 3S by saying it is the most affordable mixed reality headset, and then added that Quest 3 and 3S are now the “best family of mixed reality devices, period”. It was clearly a way to fire shots at Apple, but it shows that he is now talking more about MR than VR. The reason is clear: while MR is still immature, it is one of the roads, together with smartglasses, that lead to AR glasses. And AR glasses are the endgame for tech companies, because they are the devices that will substitute the smartphone.
MR is the big trend, and Apple helped a lot in putting the attention on it. Don't be fooled by the usual attitude of seeing everything in black or white: VR is here to stay, too, but its products will probably have a smaller scale. Even after we transitioned to standalone VR, PCVR still exists and has a strong community, too. But the numbers are different: if there are 2M PCVR users on Steam, there are maybe 10M on Quest. MR will have a similar destiny: maybe in 5 years, 50M people will use MR software and 10M will use VR software every month. Maybe the headset will be the same, but the content with MR features will be used more because it is less isolating and more natural.
And most of this content won't be about gaming. Because the reality is that the vast majority of us don't spend the biggest part of our day playing games: we work, we communicate with people, we create stuff, etc… Again, Apple has been a pioneer here, because it shifted the attention of XR headsets heavily towards non-gaming uses. While I hate the hype around the term “Spatial Computing”, it has an important implication for me: it makes clear that the headset is not a toy, it is a new form of computing, of interacting with technology in general. Meta was selling the Quest just as a console, but this was a very limiting vision. Now Meta has understood this, and in fact this year the Meta Gaming Showcase was not held and the company dedicated just a few minutes to announcing new games for Quest at Connect. Most of the time they talked about socializing, mirroring the content of your computer, watching YouTube, watching Amazon videos, etc…
Quest 1 was a VR gaming console. Quest 3/3S is an MR computing device where you can also play games. The transition is not complete yet, and most titles on the Horizon Store are still games, but the fact that Meta now has an accelerator specifically dedicated to non-gaming content is very telling about its long-term goals.
There will always be a need for games and other escapist experiences: PC and mobile stores are full of games, too. So if you are making a VR game, don't worry, there will still be a market for your product. But my advice for you all is to start thinking in new terms: can you think about non-gaming experiences for XR headsets, too? And also, can you think about something in AR/MR and not only in VR? And if you are making a VR application, can it have an MR mode? These are all underdeveloped markets: while there are very successful VR games, the number of good MR applications is very limited, so it's an open market where you could find success. An open market in which all the major corporations are desperately looking for good content. Think about it.
New opportunities and new problems with 2D apps
Meta is trying in every possible way to attract content to its store. And given its attention to Spatial Computing, it is now interested not only in VR apps but also in 2D apps that users can put in their spaces for some kind of utility. Meta does not have an existing 2D store like Apple or Google, so it also has to create an ecosystem of 2D apps from scratch.
This is the reason why it just opened up the store to 2D app content. And it launched its Spatial SDK, which allows existing Android app developers to use their traditional tools (e.g. Android Studio) and languages (e.g. Kotlin) to develop apps that have some sort of 3D or spatial content. Basically, the Horizon Store is now open to 3D immersive apps, 2D apps, and also all the shades of gray in between these two extremes. Meta is heavily investing in this, and in fact, it has just started a Meta Horizon Start program for mobile developers. Previously, Start was a program just to support teams working on XR apps, while now it is opening up to mobile developers as well.
I think this is an opportunity for Android developers to enter a new market. Sure, mobile has bigger numbers, but if a port to MR does not cost much money, it would be a way to enter a market with smaller numbers, but also smaller competition. If I had a mobile app that could make sense on Quest, I would give it some thought.
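To give an idea of how cheap a first step could be: since the Horizon Store now accepts standard Android apps, a minimal “port” might simply detect that the same APK is running on a headset and adapt its layout, before even touching the Spatial SDK. This is only a sketch under my own assumptions: the manufacturer/model strings used below are my guess and should be verified against Meta's official documentation.

```kotlin
import android.os.Build

// A minimal sketch of the "cheap port" idea: ship the same Android APK and adapt the UI
// when it runs on a headset. The manufacturer/model strings below are an assumption on
// my part -- verify them against Meta's documentation before relying on them.
object DeviceEnvironment {

    fun isProbablyQuestHeadset(): Boolean {
        val manufacturer = Build.MANUFACTURER.lowercase()
        val model = Build.MODEL.lowercase()
        return manufacturer.contains("oculus") ||
               manufacturer.contains("meta") ||
               model.contains("quest")
    }
}

// Usage: pick a layout variant (e.g. a wider panel, bigger targets for hand tracking)
// when the app detects it is running in the headset.
fun chooseLayoutVariant(): String =
    if (DeviceEnvironment.isProbablyQuestHeadset()) "panel_large" else "phone_default"
```

Only after this kind of low-effort adaptation proves there is an audience would it make sense to invest in proper spatial features with the Spatial SDK.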
As a user, though, this concerns me a bit: having both 2D and 3D apps on the store may be very confusing, and I would love to have a clever way to separate them. A very well-made search and recommendation engine would help with that; otherwise, the risk is that when I'm looking for a cool immersive game to play, I get the suggestion to play Candy Crush. The problem is that Meta is not proving very good at this: now that App Lab has been merged into the main store, people are being given recommendations for shovelware. I think this is an issue to be solved as soon as possible.
This is also a problem for me as a VR developer: now I know that I don't only have to compete with XR content, but also with 2D content. This is something that can reduce my revenues. So at the end of the day, this decision brings both pros and cons.
If you think about it, the availability of 2D apps in XR is also an interesting step towards AR glasses. For instance, at first you use Instagram on your mobile phone, then you use it on your AR glasses connected to your phone, and then just on your AR glasses. If you have both your 2D and your 3D apps on your glasses, in the end, you don't need your phone anymore.
Passthrough APIs
Finally, Meta promised us developers access to the camera images early next year. This is great for two reasons.
The first is that Meta listened to us: it is proof that if a decision is not heavily strategic for Meta (like the hated Facebook login), the company can revert it if the community strongly asks for it. I was asking for camera access desperately every time I could, and, like me, many other people were pushing for it. Together we won, and that's proof of the power of the community.
The second reason is that, with camera access, we can finally create applications that are truly mixed reality. We can use computer vision and artificial intelligence algorithms on the images coming from the cameras in front of the user to understand the reality around them and make the application react to it accordingly. One example of why this can be useful is a prototype of an interior design assistant I made: it can look at what you have in the room around you and suggest how to improve it with new (or different) pieces of furniture.
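Since the camera API has only been promised and is not released yet, here is just a hedged sketch of the kind of pipeline it would enable: grab a passthrough frame, run a vision model on it, and let the MR app react. The frame source is a placeholder, and I use Google's ML Kit object detection purely as one illustrative option for the analysis step (it is not necessarily what my prototype or Meta's future API uses).

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.defaults.ObjectDetectorOptions

// Sketch of the loop camera access would enable: take a passthrough frame, run a vision
// model on it, and let the MR app react to what is actually in the room. The frame source
// is a placeholder (the passthrough camera API is not released yet); ML Kit object
// detection is just one illustrative choice for the analysis step.

private val detector = ObjectDetection.getClient(
    ObjectDetectorOptions.Builder()
        .setDetectorMode(ObjectDetectorOptions.SINGLE_IMAGE_MODE)
        .enableClassification()
        .build()
)

fun analyzePassthroughFrame(frame: Bitmap, onObjectsFound: (List<String>) -> Unit) {
    val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
    detector.process(image)
        .addOnSuccessListener { objects ->
            // e.g. ["chair", "table", ...] -> an interior-design assistant could then
            // suggest furniture that matches what it sees.
            val labels = objects.flatMap { it.labels }.map { it.text }
            onObjectsFound(labels)
        }
        .addOnFailureListener { e ->
            // In a real app: log and skip the frame rather than crashing the experience.
            e.printStackTrace()
        }
}
```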
Camera access will make true MR happen and will also create more bridges between MR and AI. There are new opportunities opening up for developers in this sense, too. But of course, it also creates some fears about privacy…
Privacy Concerns
Having glasses connected to AI the whole day is great: I can imagine using them myself every day. I could be at a store and ask my glasses if a certain product is cheaper somewhere else; I could look at myself in the mirror and ask if I look cool; I could ask for help when writing a greeting card and I want some hints on how to make it more original; and I could, of course, ask them to remind me where I put my keys, or my car, or anything else I usually forget. But this also means that I'm sending all my data to Meta.
As I've said, Meta would love me to keep the glasses on all the time. What if I ask these glasses to do something related to what I have on my desk while my credit card is also on it? How is this data used to train Meta AI? And even more, since Meta is an advertising company, how is all this data going to be used to create a model of my person and my behavior? This model is very useful for a personal assistant that helps me, but it is also what can be used to show me personalized ads. And in the wrong hands, as Louis Rosenberg said, it can also be used to manipulate me.
With great power comes great responsibility. While on the technical side I'm amazed by the Ray-Ban Meta glasses, on the privacy side I'm very scared. Meta has been bad at managing personal data in the past. Now Zuckerberg is trying to give the company a new image, hoping to fix this reputation issue. This is noticeable also in the different look Zuck gives himself: he appears more human in his latest appearances. But still, I will trust Meta only if we don't see scary reports about user data after smartglasses have become popular. The next 2-3 years will be very important in this sense. And in any case, since Meta is an ad company, I'm sure some of my personal data will be sold to the highest bidder in one way or another, because that's the business of the company. That's why I'm very cautious about it.
Apple's big stance on privacy will pay off in the glasses era. Meta, Google, Snap, and Bytedance are all ad companies, while Apple is primarily a hardware+software company, so it can care about our personal data much more. People will trust wearing Apple glasses much more than wearing Meta glasses… it remains to be seen what the long-term effects of this difference in reputation will be.
Of course, I'm just talking about the privacy concerns related to the manufacturer. What developers do with the images they get from the cameras is another big can of worms… Users had better be educated about how to protect their privacy.
Zuck underplaying the role of small creators
If there is one thing I truly didn't like about what The Zuck did recently, it is that he downplayed the importance of small creators twice during his interview with Alex Heath. Zuck, if you are reading this article, you have to know I'm angry with you about this. (And then I have to ask you: if you are the 4th richest man in the world… why the hell do you spend your time reading my articles and not on blackjack and hookers?)
The first time he downplayed creators was when he spoke about AI. He said that, basically, he doesn't care if small creators don't want their data to be used to train AI. The contribution of every single creator to the final model (and so to its answers) is so little that if someone objects to being part of the training, that's fine with him. Plus, of course, since the contribution is so minimal, Meta has no interest in paying for the provided data. Meta may envision partnering with some important sources of content, but not with single small creators. Well, I would like to ask Zuckerberg: if he doesn't think our contribution is that relevant, why does he still scrape our data? Let's remove the data of all of us small creators from your LLAMA models and train them only on the websites of the 2-3 big players you want to pay: let's see what a great model comes out. Our data as individuals is not relevant, but all our data together is very important. I remember a friend of mine sending me a picture of him asking ChatGPT-4 about “Oculus Passthrough” and ChatGPT answering him with data from my blog and suggesting he go read it (and I'm not even kidding… see the picture below). So it seems to me that we small creators have some importance after all…
The Zuck made the same mistake a second time when speaking about the Orion glasses. When Alex Heath asked if this was a devkit, he answered that Meta has enough internal resources and enough close partners that it doesn't need to distribute Orion as a dev kit to build its ecosystem. That's very shortsighted. The most successful VR games out there are Gorilla Tag and Beat Saber, two games made by indie studios no one knew before. A lot of the content that Meta paid its partners millions to develop had just OK sales instead. I think this says it all. It's true that Meta can build some foundations of the software for its AR glasses alone, but it had better let other small creators play with it if it truly wants to make a product people and developers want. If it wants to create a rich ecosystem, it is more probable that some indie creator will find the right mechanic to make a fun game on Orion than an internal team at Meta. So, Zuck, if you are reading this, give me and the other bois of the XR community an Orion devkit. I bought that potato headset called the Quest Pro, so I guess I deserve a reward for that, don't I?
I think this shows the attitude that Meta already displayed in the past with the creation of App Lab: it doesn't take us small creators seriously; it is not a company that cares about creating a vibrant community. And I think it's losing a lot of opportunities by doing so. If you are a small creator, expect some opportunities offered by Meta (e.g. the Start program), but do not expect a huge helping hand.
The role of Italy
I was surprised to hear Mark Zuckerberg talk about Luxottica in such an enthusiastic way during his interview with The Verge. He said that, in his opinion, Luxottica will become a huge tech company, and since it is based in Italy, Italy may have an important role in the design and manufacturing of XR glasses.
As an Italian, this was pretty exciting to hear: our role in the current XR market is basically nonexistent, and hearing that we may have a central role when it comes to glasses is pretty cool. I wonder what implications this may have for the future of my country and what opportunities it will create. For sure, the growing attention to fashion in wearables may give more power to places with expertise in fashion, like Italy or France. (And by the way, CEO of Luxottica, if you are reading this too, call me so we can talk about collaborations on your yacht.)
Other considerations
Some minor considerations:
- With the discontinuation of the Quest Pro, there is no Meta headset with embedded eye tracking anymore. This tells me that the time for eye tracking, which a few years ago seemed a very important feature, has not come yet. It is probably not as relevant for VR as we used to think. So content using eye tracking still has to wait before being distributed to a big market;
- Meta launched Hyperscape, letting people enter environments scanned with a phone. I already said in my previous article that I love this feature, and I think it prepares us for a time when each of us will scan our own room and then let other people enter it. But there is another implication of this demo: the environments shown there are cloud-rendered. Meta is using its Avalanche service (which was leaked years ago) to cloud-stream the environment you are looking at. This is the first time I have seen a major company using cloud rendering for a consumer app, and it may not be the last. I wonder if the time for cloud rendering, at least in beta form, is starting to come. Judging from the many people having issues with the app, cloud rendering is not ready for prime time yet, but it is taking its first steps;
- Meta announced a new version of its Avatars system. Meta Avatars are constantly evolving to become more and more realistic. The endgame is, of course, the hyper-realistic Codec Avatars that Meta has teased multiple times. It's good that Meta is investing resources in giving people the opportunity to express themselves in the best way possible. And now Meta is also opening up a marketplace of accessories for the avatars: this is another opportunity for independent creators to earn a few bucks. But my question is: are avatars really that relevant if they belong to a single platform? We need a system for digital identity that works in every app I am in, with every headset I am using. If Meta really wants to make its Avatars useful, they should be cross-platform like Ready Player Me, in my opinion.
References
I promised you some useful references for the news discussed above. Here are some great links to learn everything that was announced at Meta Connect 2024:
Some other articles about Orion that are worth a look:
And that's it for today: I hope this article has been useful in making you think at a deeper level about the news announced at Connect, and I also hope it can be useful for your business (if you have a company) or your career (if you are an individual). If you like my way of analyzing the XR market and it may be useful for you, feel free to contact me to talk about possible collaborations (sorry for the commercial). Of course, I would also love to hear your opinion about all of this, so please write your take on Connect in the comments section below or reach out to me on social media.
And whoever you are, I wish you the best career in XR possible!
(Header image from a Meta event)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I’ll be very happy because I’ll earn a small commission on your purchase. You can find my boring full disclosure here.
This article was originally published on skarredghost.com