AR

Apple AR Glasses - The Future of Reality!

AR is the future and I’m so excited for it! 

Apple is currently the leader in AR Tracking. We know that they’re working on their brand new AR Glasses, which we’ve covered in a number of videos, but this video is quite a bit different, because I’ve sort of tried the Apple AR Glasses myself. Well, not the actual unit, as no one outside of Apple really has access to those, but I have tried two devices which are essentially early prototypes of what the Apple AR Glasses will deliver in terms of functionality.

So, without any further ado, get those snacks ready and let’s jump into some AR.


DreamGlass 4K

The first item that I have tried is called the DreamGlass 4K. This thing hasn’t fully launched just yet; it’s on Indiegogo. DreamGlass hasn’t sponsored this in any way, but they did send this over for us to check out. After checking it out, I honestly don’t recommend it, but it does have some interesting features that give us an early taste of what the Apple Glasses could do.

This is basically a Head-Mounted Display, very similar to the Google Glass, where you had a tiny Display in a corner of your field-of-view. That tiny Display would show you some info such as the Time, Date, Weather, Notifications and stuff like that. The first difference here is that the Display on these is Full-Screen, rather than being a tiny window. Secondly, these Glasses are actually tethered to a Remote, which I’ll get to in just a second, meaning that they’re more of a home-experience device rather than something that you would take on the street or use in public.

The way they work is that they have two tiny Displays, one in each of the Lenses, which your eyes then perceive as a single Display. It’s fairly high-res, at least according to DreamGlass. They claim that this is a 4K Panel, but I’m not so sure about that, as the content that I was viewing on it was mostly 1080p. However, I can’t deny that the Display inside of this is pretty decent. You connect the Headset to a Remote, which runs Android. It has Bluetooth, WiFi and all of that, which means you can put movies and a few Apps on it, then preview them on the Headset itself. The reason why this isn’t AR is that the Display itself is fixed; it doesn’t move. If you move your head, it will move as well, just like the Google Glass did.

The DreamGlass 4K is an early prototype with some flaws to iron out, but it has some promising features.

Now, what’s cool about this is that the Remote itself also has an HDMI Port, meaning that you can connect this to a Console and play Console Games on your Headset without anyone even noticing. You can connect your Smartphone and play some movies directly, or even stream content wirelessly; that’s doable as well. Plus, you can connect a Controller and basically have a gaming set-up on the go. However, my favourite feature is that you can also connect it to a Laptop and get a full Desktop UI, which you can then control with your Mouse & Keyboard, and when you remove the Glasses, the virtual set-up is all gone.

So, that’s what I really like about these: the fact that they give us a glimpse of what we could do with true AR Glasses in the future. If you’ve ever played ‘Horizon Zero Dawn’, humans had these Chips that they would attach to their heads, and they were able to see AR Elements in the real world; that’s how they were all using their computers. There were no Monitors, you would just sit at a desk, turn on your AR Chip or Headset and your computer would appear right in front of you. The DreamGlass is essentially an early look at that.

Unfortunately, it does have a lot of issues. For example, even though DreamGlass says that this is a 4K Display, on their Specs sheet they only list 2.5K, which I’m assuming is for each eye. But anyway, once I connected my Laptop to the Headset, the Laptop only saw it as a 1080p Display, rather than a 4K Display. Not only that, but there was a significant amount of ghosting and lag when using the Mouse. So, the Response Time on this is extremely slow, making it unsuitable for almost anything other than watching movies, really. Speaking of that, we tried manually putting some videos and Apps onto the Headset, but our computers just couldn’t see it. Not even a USB stick would work through that single USB-A Port. You can activate 3D Mode using a Button on the Remote, which actually works surprisingly well. That’s cool, but realistically, I just cannot recommend this to anyone.

It costs $400 or £300, as much as a Console, but it’s nothing more than a Monitor that’s strapped to your head, which also needs another device to work properly. The DreamGlass isn’t really worth it, but at least we got a good idea of how the Apple Glasses could be used to interact with a virtual Mac.


Oculus Quest

Now, the second device that I have is a bit more special: the Oculus Quest. It came out last year, and we’ve even made a very detailed Review Video, which you can check out on the channel.

Now, as most of you might know, the Oculus Quest is a VR Headset, meaning that you’re fully immersed into the experience and you cannot see the real world at all. Something quite unique about the Oculus Quest is that it is a fully wireless VR experience. Unlike the Smartphone VR systems, where you just slide in your Phone and can only look around up, down, left or right, the Quest has full six degrees of freedom. This means that you can even walk around your room and the Quest will track that. It’s pretty much like the high-end VR systems, like the HTC Vive, except it’s fully wireless and it does all the tracking on its own, without the need to place large tracking Sensors in your room. In order to achieve this, it uses four Cameras, one in each corner, that track the environment in 3D.

A recent update for the Oculus Quest really ups its AR game, with features that Apple’s own device could implement.

But what has this got to do with the Apple AR Glasses, which are surely AR and not VR? Well, Oculus has actually released a recent update which allows you to also use the Quest as an AR Headset. So, remember those Cameras that I mentioned, which are used to map the environment in 3D? Well, with this recent update, you can actually replace the standard background in the Menus with the video feed from the Cameras. Even though this feed is just black and white, it perfectly matches the environment in real life, and all the Menu Elements now appear on top of your real world. This is fully tracked, which is really what the Apple AR Glasses will be all about.

The only difference will be that Apple’s Glasses will be smaller and made so that they resemble actual Glasses. Plus, the AR Elements would be overlaid on top of your real-world view, rather than a Camera recording your real-world view and then overlaying the AR Elements on top of that video feed. Everything will be just as sharp and just as colourful as in real life.

Now, this would be even cooler if you could just use your hands to control the UI. Well, it turns out you actually can! The Oculus Quest has full Hand-Tracking Support, meaning that I can just use my hands to navigate the UI. There are also a few Gestures that I can do to simulate a click or even move the UI around. I have to say, if you do have an Oculus Quest, you have to try this for yourself.

Firstly, you need to make sure that you have the latest updates installed, then go into ‘Settings’, ‘Experimental Features’ and turn on the new Beta HomeScreen UI. Once you’ve done that, the Headset will reboot. After that, you can go into ‘Settings’ again, select ‘Virtual Environments’, select ‘Passthrough’ and there you go, an early look at AR Glasses.


Apple AR Glasses

Ok, so now that I have talked about two pieces of tech that give us an idea of how Apple’s AR Glasses will work, how would they be any different from these two Headsets?

First of all, Apple is designing their AR Glasses to look like regular Glasses. We have heard some rumors that Apple is also working on a VR/AR Headset, which is said to look similar to the Oculus Quest, and that one might launch earlier. But the main AR Glasses that we’re all waiting for would just look like a regular pair of Glasses.

The use of the LiDAR Sensor on the iPad Pro & iPhone 12 Pros may be a test-run for something much bigger.

The second difference, like I mentioned before, is that the AR Elements would be displayed on the actual Lenses of the Glasses, rather than being displayed on a video-stream, like they are on the Oculus Quest. You will get to see the AR Elements projected right onto the real world, with the elements themselves being in 8K Resolution, according to the info that we’ve seen so far. Also, instead of using actual Cameras to map the environment, like the Oculus Quest does, Apple is said to be using the same LiDAR Modules that they’re currently using in the 2020 iPad Pro (this is also what the iPhone 12 Pros will be using).

Not only that, but Jon Prosser said that this is Apple’s plan: implement the LiDAR Module on their iPads and iPhones first, so that users get to use it for a bit, then take all the AR data gathered from those devices and improve the AR experience for the Glasses. It seems like the only reason the LiDAR Modules are there is to build up the AR experience for a future product, Apple’s AR Glasses.

In terms of interacting with the UI Elements, Siri will indeed be present. Just like with the AirPods and pretty much all of Apple’s devices today, you’ll be able to ask Siri different questions and request certain actions. The interesting part is touch interaction. We’ve seen some reports that Apple will have a Touch Panel built into the Frame. That may be the case for things such as Volume Control, but I think the main interaction will be done in a very similar way to how it works on the Oculus Quest, which is by using your hands in the air, flicking and scrolling through the UI.

In fact, a recent Patent Application found by ‘AppleInsider’ shows that Apple is looking into having interactive AR Elements appear on a real-world surface. What this means is that you would be able to have Buttons, Menus and even Apps on a flat surface in the real world, and interact with them just by pressing the virtual Buttons. This would be sort of like having a virtual iPad sitting on a desk that you can actually interact with.
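
To make that idea a bit more concrete, here is a minimal, hypothetical sketch of the “virtual controls on a real surface” concept using today’s ARKit raycasting on an iPhone (this is not the Glasses’ actual API, and the helper name is made up): a 2D tap is projected onto a real-world plane that ARKit has already detected, and a virtual “button” gets anchored at that spot.

```swift
import ARKit
import SceneKit

// Hypothetical helper: place a virtual "button" on the real surface the user taps.
// Assumes `sceneView` is an ARSCNView already running a world-tracking session
// with horizontal plane detection enabled.
func placeVirtualButton(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Project the 2D tap onto a detected real-world plane (a desk, a table, the floor).
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let hit = sceneView.session.raycast(query).first else { return }

    // A simple flat rectangle stands in for the interactive "button".
    let button = SCNNode(geometry: SCNPlane(width: 0.08, height: 0.03))

    // Position it where the ray met the surface and lay it flat on that surface.
    let position = hit.worldTransform.columns.3
    button.simdPosition = SIMD3(position.x, position.y, position.z)
    button.eulerAngles.x = -.pi / 2

    sceneView.scene.rootNode.addChildNode(button)
}
```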

In fact, one MacRumors user, ‘AngerDanger’, claims that Apple could potentially use Thermal Imaging to detect the hotspots left by finger-taps on surfaces, which could then be registered as Touch Input. Not only that, but if you’ve been following Apple’s advancements in AR with ARKit, you know that they’ve managed to pull off things that were previously considered impossible. They managed to pull off AR Tracking with just a single Camera Lens. Normally, you would need at least two Cameras, so you can compare the differences between the two views and reconstruct a 3D shape. Apple managed to pull this off with just one Camera, by using the data from the built-in Gyroscope and Accelerometer. Not only that, but Apple has also managed to add real-time shadows onto virtual AR objects. What this means is that, if you have a virtual object in the real world and a light pointed in one direction, that virtual object will blend into the scene perfectly, with the light casting a shadow on its other side. Apple has also managed to pull this off with reflections: if you have a glossy virtual object, objects from the real world will actually cast reflections onto it.

The bulkier headset that Mike Rockwell advocated for would need external hardware, something that Jony Ive and Tim Cook did not side with.

Apple’s AR Glasses are said to be announced by the end of 2021, with a full release in 2022. This could of course be delayed, depending on how the project is going for Apple. Speaking of that, a report from Mark Gurman in June 2020 claims that there have been some serious internal debates between Jony Ive and Mike Rockwell, who leads Apple’s AR and VR development, on how this Headset should work.

Jony wanted a fully wireless Headset that was thin and sleek: essentially, the Glasses. Mike, on the other hand, wanted something revolutionary in terms of Graphics and experience, and because of that, the Headset ended up being bulky and needing an External Box, which would process all the data and then stream it wirelessly to the Headset. This Headset would’ve been an AR and VR Headset, again very similar to the Oculus Quest.

Tim Cook apparently sided with Jony on a Headset that doesn’t take people out of the real world, but instead adds Elements onto the real world in order to improve it, which I fully agree with. I’m personally team Jony here, but do let me know what you guys think. Would you prefer that Apple develops the AR Glasses, or an AR/VR Headset like the Oculus Quest instead?

Apple AR Glasses (2020) - The FUTURE!


Some of you might know that Apple is working on a pair of AR glasses. In case you don’t know what those are, I’ll explain everything you need to know in this article, alongside why these glasses have the potential to revolutionise the tech industry entirely, just like the iPhone did back in 2007.

I’m personally extremely excited about these, and you will most certainly be as well by the end of this.

Grab some popcorn and here’s the real future of the tech industry!


Ok, so for those of you who don’t know what AR is, AR means augmented reality. Unlike VR, or virtual reality, where you put a giant headset on and you’re transported into a fully virtual world, AR keeps you in the real world but adds certain virtual elements to it.

We’ve recently started seeing AR being used in smartphones, like the iPhone 11 Pro Max. Apple released ARKit, which is pretty much an API. In simple English, it’s a tool that allows developers to easily make AR apps that automatically take advantage of all the sensors that the iPhone has, such as the accelerometer, the gyroscope, the dual and now triple lens camera module and so on.
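
Just to show how little code that actually involves, here is a rough, minimal sketch of an ARKit session (the `sceneView` name is just a placeholder, not anything from Apple’s sample code): ARKit does the sensor fusion between the camera, gyroscope and accelerometer behind the scenes, and the developer mostly just configures a session and runs it.

```swift
import ARKit

// Minimal ARKit setup: ARKit fuses the camera, gyroscope and accelerometer
// behind the scenes; the app only configures a session and runs it.
let sceneView = ARSCNView(frame: .zero)            // renders the camera feed plus virtual content

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]       // ask ARKit to detect flat surfaces (floors, desks)

sceneView.session.run(configuration)               // starts camera capture and motion tracking
```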

IKEA’s Place app allows you to see how different items of furniture would fit within your home thanks to AR

So we started seeing apps such as IKEA’s Place app, which allows you to place furniture from their store inside your real-world view. That app works incredibly well: the tracking is spot on, it never lost tracking for me, the furniture even has real-time shadows, and you can get very close to the items and see all the details, so I was extremely impressed with what this app can do. I recently moved into a new apartment which is still empty, so I’ll be using this app to buy some furniture in the upcoming weeks, and so far it’s been great!

And Apple has put so many resources into ARKit that they’ve got to a point where we have real-time shadows on objects and real-time reflections, where light from the real world gets reflected onto the virtual objects, which is just nuts. But what’s probably even crazier is that with ARKit 3, virtual objects can detect the presence of a real person and, for example, circle around them.
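
Both of those effects map onto ARKit options that developers can already switch on today. As a hedged sketch (assuming a standard ARSCNView setup, and checking device support before enabling the people feature), it looks something like this:

```swift
import ARKit

let sceneView = ARSCNView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

// Environment texturing lets ARKit build lighting probes from the camera feed,
// so glossy virtual objects pick up reflections of the real world around them.
configuration.environmentTexturing = .automatic

// People occlusion (ARKit 3 and newer): virtual objects can appear behind a
// real person in the frame instead of always being drawn on top of them.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

sceneView.session.run(configuration)
```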

While I do see VR as more suitable for games and entertainment in general, I see AR as suited more towards productivity. 

And this is why having this AR experience all the time, without having to use your smartphone would be a game-changer. This is where Apple’s upcoming AR headset comes into play.

And the questions you all probably have regarding this headset are:

1. How will it work?

2. What would it look like? 

3. How much will it cost? 

1. HOW WILL IT WORK?

So let’s start off with how it will work.

Ok, so there’s multiple ways that you can build such a headset. Usually VR headsets are big, bulky and they connect to your PC via an actual cable. 

There are a few examples, such as the Oculus Quest, which are completely wireless, and that is where the future of VR is heading. When it comes to Apple, their AR headset will indeed be fully wireless as well.

One of the patents that Apple has filed for regarding their AR Glasses (Source: Patently Apple)

There’s been quite a number of patents that Apple has applied for, all showing a very thin and light device that looks pretty much like a regular pair of glasses. CNET even released a pretty big report back in 2018, with some details that they got from inside sources, on how Apple’s AR glasses would work.

The project is apparently called T288 internally, and even back in 2018 it was still aimed at a 2020 release, just like more recent reports have all pointed towards.

CNET did detail that Apple is actually planning on making this a fully wireless device. So instead of Apple building the processor and everything else inside the headset, these components would be built into a different device, which would render the entire scene and transmit the video stream to the glasses. This means the glasses would just act as the display, while a separate device does the actual rendering. Otherwise, the glasses would have to be very thick in order to accommodate the large battery required to drive such a powerful processor. In case you’re wondering, the glasses would also house multiple cameras for tracking the environment, very similar to the Oculus Quest’s cameras; it’s just that the processing would be done outside of the headset.

Now, CNET did also mention that Apple will have a separate box that would be processing all the data from the headset. However, Ming-Chi Kuo, who has been pretty much the most reliable source in terms of Apple leaks, reported earlier this year (via MacRumors) that the separate processing box would actually be the iPhone, and that the AR Glasses would be “marketed as an iPhone accessory”. So this would be similar to the Apple Watch or the AirPods: essentially a new product category that goes hand in hand with the iPhone.

Another report from Bloomberg, which we got back in 2017, pointed towards the exact same thing. It suggested that Apple was working on an AR headset that would release in 2020, and that it would eventually even end up replacing the iPhone. In the first few years it will need the iPhone for processing power, but once the chips get even more power efficient, and we hopefully get new battery technology, that will no longer be required.

Now that’s all well and good, but what will you be able to do with the Apple Glasses?

Well, essentially you would be able to see everything from your iPhone directly in front of your eyes. Things such as your messages, your Instagram feed and your emails would be viewable at all times, right in front of your eyes. It would not take up your entire field of view; instead, a small window overlay would display all of that data, and it’s very likely that you would be able to reposition that window, resize it and so on. The AR Glasses themselves will have Siri integration, so most of the interaction will be done by voice and by some touch panels which are said to be embedded into the frame itself.

So that’s pretty cool but what else will it be able to do?

Google Maps now supports AR where your route will be shown in real life (Source: Darrell as a Service)

Well, the use cases that I would personally love to see would be integration with Maps, so that you can see arrows and real-time directions on the street itself, right in front of your eyes. Google Maps recently had an update with AR integration, so now you can just lift your phone and get real-time directions overlaid on the real world, showing exactly which way to go. This is just on smartphones at the moment, but imagine having it on a pair of AR Glasses; that would be incredible!

But the most useful use case, for me at least, would be having loads of displays anywhere I am. Imagine having three big curved monitors surrounding you, or even a room full of displays, or just a gigantic display right on your work-desk, or even floating above it, that you wouldn’t be able to have normally.

Speaking of displays, one of the current issues with AR and VR today is the pixel density of the display panels. The lower the resolution of the display, the lower the pixel density will be, and the more individual pixels you will actually see when you put the headset on, which creates a very blurry and grainy effect.

The Oculus Quest, for example, which is by far the best VR headset that I have used, has a resolution of 1440x1600 per eye, which is pretty high; it’s actually on par with the HTC Vive Pro. But you see, even with such a high-resolution display, I can still easily see the pixels, and the image quality just isn’t realistic at all. We’re still years away from extremely high resolution displays on VR and AR headsets.

However, CNET did mention in their report that Apple would be using 8K displays in their headset! To give some context, VR headsets today have close to a 2K display per eye, and Apple wants to use an 8K display per eye! That’s insane! With a resolution like that per eye, you should be able to see a perfectly clear image, without any pixelation at all, or at least none that’s really noticeable.

But can the iPhone drive two 8K displays whilst calculating everything required for the AR tracking in real time? Well, consider that the Apple A10X processor inside the Apple TV 4K can already handle full 4K output, and even 4K games (albeit mobile ones); that the Apple A13 processor in this year’s iPhone 11s is roughly twice as powerful as that; and that the Apple A14 chip coming out next year will be the first to be based on a 5nm process and should smoke the A13. With all of that in mind, it’s looking pretty likely that the A14 will indeed be able to push dual 8K output for the Apple AR Glasses.
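
To put rough numbers on that claim (these are plain resolution figures, not confirmed Apple specs), here is the back-of-the-envelope pixel maths:

```swift
// Back-of-the-envelope pixel counts (illustrative only, not confirmed specs).
let pixels4K = 3_840 * 2_160        // ≈ 8.3 million pixels: what the Apple TV 4K outputs
let pixels8K = 7_680 * 4_320        // ≈ 33.2 million pixels for a single 8K panel
let dual8K   = 2 * pixels8K         // ≈ 66.4 million pixels across both eyes

// Dual 8K is roughly eight times the pixel count of a single 4K stream, which is
// the gap in raw throughput that a future chip (and any wireless link) would need to cover.
print(Double(dual8K) / Double(pixels4K))   // ≈ 8.0
```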

2. WHAT WOULD IT LOOK LIKE?

Ok, so that was a fairly long section since how it would work is the most interesting and important part of the headset, but now let’s see what it would look like.

Well, considering Apple’s patents and the fact that the glasses themselves will only be displaying a stream from the iPhone rather than doing the processing themselves, they should look very similar to traditional glasses.

Apple does already have two wearable devices, the Apple Watch and the AirPods, and they both look good. They have an elegant look to them; however, we can all agree that they also have a weird/unique look. For example, the Apple Watch is rectangular, and you can immediately tell when someone’s wearing an Apple Watch compared to a regular watch, or any other smartwatch for that matter. The same goes for the AirPods: they have this very weird and distinct look to them, and you can always tell when someone’s wearing them. So I do believe that the Apple Glasses will have a similar look. They will still look like glasses, but they will have Apple’s unique take on them, with a more unusual design.

The ZONEofTECH Concept for Apple’s AR Glasses

In our concept, we actually made it in a realistic way, with a fairly thick frame that houses the battery and the chips for capturing the 3D data and sending it over to the iPhone. We have three cameras on the front and then two more on each side, for spatial awareness. We have wireless charging coils because, let’s be real, Apple will very likely use a similar charging system to what they already use in the Apple Watch and the AirPods, so wireless charging. The speakers would be bone-conduction speakers, so the grills that you see there are actually just for the design and for airflow. This is of course our take on it, but I wouldn’t be surprised if Apple does something similar. Like I said, a pair of glasses that looks a bit more unique than the rest, just like the AirPods and the Apple Watch.

3. HOW MUCH WILL IT COST?

Now, this is actually a bit of a tricky one because, you see, in order for Apple to make the best possible AR Glasses, they would need to charge loads. This is a truly futuristic device that won’t be cheap to make by any means. But at the same time, Apple simply cannot price this at, say, $3000, because who would really buy it in that case?

The rumoured price is around $1000-$1500 at this point, which I still think is very high and I don’t see many people buying this considering that it is not a phone, but instead an accessory that you also need an iPhone for.

What I think Apple should do is make the best Apple Glasses that they can, even if it costs them $3000 per unit, and then price it very low, at $300 or so, around the same price as an Apple Watch. If they did this they would sell loads, and then they could make their money back through software purchases. That’s what I would do.

But whatever price Apple decides to sell them for, we know that they are definitely happening. We have multiple reports from Ming-Chi Kuo, DigiTimes and Bloomberg, and even actual code found in iOS 13, all pointing towards Apple actively working on the AR Glasses. Apple has even applied for a patent for adjusting the opacity of the display, so that AR objects are more or less visible against the real world, apparently.

However, something that I find to be even more interesting is a recent report from DigiTimes claiming that Apple has partnered with Valve to develop the AR headset. Valve has just released their own headset, the Valve Index, which is pretty much the highest-end VR headset that you can buy, and the HTC Vive and the Vive Pro were both made in partnership with Valve. So Valve has been in the VR industry for quite a few years now, it’s definitely been one of the pioneers of VR, and Apple partnering with them is just some amazing news!

The article published on Monday explains that Apple has partnered with Valve (Source: MacRumors)

So there you have it, all the latest we know about the upcoming AR Glasses. I am personally really excited to see how these turn out, but we will have to wait until next year at least. Let me know what you guys think in the comments.