Random Comments

New iPad Minis with pencil support and stuff: iPad mini - Apple

I presumed this line was dead.

Yeah, I'm surprised they updated those. Not surprised they are trying to sell more pencils, though.

Apple's number one problem right now is that they need an entirely new product. They've just been iterating on what they have for quite a few years, and those iterations are getting smaller and smaller. I present to the court, exhibit A: Animoji.

It's not surprising, though, because nobody else is coming out with new products either. Apple has always waited for other people to invent some great idea with a bad implementation (the laptop, the MP3 player, the smartphone, the tablet, the smartwatch) and then taken it out of the suck-zone. They need someone else to make a new product that sucks so they can make a good one and take all the credit again.

In the meantime I think they should just go for the TV thing. It's low-hanging fruit, but still good. Get an incredible OLED display, embed an Apple TV right inside of it, and you're good to go. If they really want, they can put some other stuff in there. Why not a router, a Wi-Fi access point, and even a Time Machine backup target all in the same TV? Most people have cable boxes and routers near their TV. Clean everyone's entertainment center up, and save us from the awful smart TVs with one that we can actually trust.

They are miniaturizing audio hardware and their systems-on-a-chip as fast as possible, as the Apple Watch, the AirPods, and the W1 chip show.

They are also plugging away hard at augmented reality technology and image processing. In their next phone they are going for three cameras in a triangle for better room scanning and depth perception.

Put these together, and I'm sure Apple will be the first to make non-sucking AR glasses or other face-wearable tech.

VR is one thing they are not doing that they could copy and improve to the point where people actually buy it. I could totally see Apple being the first to release VR glasses that are actually as small as glasses.

AR glasses are still primarily a power/heat problem. Same with VR.

VR glasses as small as glasses are still not physically feasible, for optics, power, and heat reasons. I don't think Apple can get ahead of the materials science limitations faster than other players in the space. They could, however, make better use of the next generation of hardware and make the user experience better given these very real limitations.

The glasses you want aren't coming for a long while yet. The VR glasses that exist today are one step away from a healthy consumer marketplace beyond a core PC gaming audience. VR will be mass-market before you get the physical form factor you are so hilariously focused on.

As for AR… I've yet to see a glasses-based implementation that's even plausible for a core audience. The hardware is too far away. Screen/camera-based AR on phones and tablets will blossom before we have even early-adopter AR glasses.

Google Glass wasn't AR. It was at best second-screen. It wasn't even a HUD, since it commanded full separate attention and wasn't meaningfully overlaid on one's view.

Crazy idea Apple would never do.

Make an AR/VR set that does something minimal on its own. But if you have more iPhones (e.g., have friends), allow them to be set up around a room to provide processing power and a sensor net for the headset. Make it easy to have a "VR party" as long as at least some of your friends have iPhones and one of you gets the head rig.

The reason Apple could possibly do this, but no one else can, is that they're the best at making something complex like this "just work."

My point about "other face wearables" wasn't so much about AR overlaid across your vision. I'm thinking more like a pair of glasses, or a device attachable to a pair of glasses, with a camera that is aware of what you are looking at… and I mean aware of it before you get your phone out of your pocket and hold it up in front of you.

If you could just tap the frame of your glasses, then get your phone out, the screen, before you even unlock it, would tell you all about the thing you just looked at: who it is, how big it is, how far away it is, or it has already picked out the phone number or looked something up on Google for you.

It's like how the Watch is always keeping tabs on your heartbeat and activity, and you can look at that data in way more detail on your phone later, even if you didn't have your phone on you at the time (I usually go running without my phone). This isn't the obvious job for a "watch", but with the added sensors it can do it.

And AirPods are always just listening in on what's going on, ready to reply to a question.

The Watch is for movement and health, and has a small display for written information. AirPods are for sound, both playing audio and talking back to you.

The face-wearable would be the vision-input device, which wouldn't rely on you having a phone with you at that moment.

What's extra funny is that Google had this, both in Glass and in their phones, for a long time. Take a photo, and it gives you all the info it can glean from the image. It kept a history of all these analyses. I would go back and look at what it said about images I'd taken over the course of a day, and it was very cool.

I'd literally just take a photo (with the phone or with Glass; it didn't matter) and then open my phone to see the details at any point after.

It started getting buggy, and eventually just stopped working entirely. Then they discontinued it.

But to your point, yes, if you drop the idea of having a screen on one's face, face tech is an open field of very real possibilities achievable with current-day tech.

Right. Google Glass had this kind of thing. Even the Snapchat Spectacles had cameras in them to take photos or video for sharing on your phone (or maybe without even taking your phone out of your pocket?).

But this conversation is about who will get this to the point of being actually usable, useful, stylish, and fashionable all at the same time. I see no company other than Apple that might get even close to that in the next few years.

They nailed it with the AirPods on their first attempt, and I'm sure AirPods version 2 will be out soon. The Apple Watch took three attempts to really get good.

Will the face-wearable be a hit out of the gate, or take a while to ramp up? It seems like a lot of the tech is already in place with these other tiny products (and the phone as an anchor to hold it all together), so I'm guessing Apple can wait as long as it takes to get it right on the first try (like the AirPods), as there is no pressure from other companies or products just yet.

Apple is WAY ahead if you watch this:

https://youtu.be/P_FwfSR1xBc

Can you be a bit more specific than "here is a one-hour video"?


In short, they are pushing for mixed reality via iPhone and iPad, focusing on experiences that require you to move around in physical space.

This looks exclusively marketed to real estate agents.

Yikes, no thanks.

An interesting reaction.

Apple's security model is that its devices are always listening, not their servers. This is different from Amazon and Google, who have dumber devices and do the processing on remote servers. That is what allowed Amazon to accidentally send everything an Echo had heard in someone's house to a random person. Not a good thing.

But Apple normally does things on-device. So having a device always see or hear what is going on in your home is more like a local closed-circuit camera setup, not an access point for a data-mining company to snoop into your home and keep track of which adverts you are watching on TV.

It also means that some of Apple's services aren't as good as Google's and Amazon's, which is a trade-off you may judge better or worse.

But I think you are taking the wrong message from me using this as an example. My watch keeps track of my health and fitness, but that data is never exposed to anyone except me. Apple doesn't get to see that data on their servers, and it isn't used to advertise to me nor to impact my health insurance in negative ways. I'd never trust Google or Amazon to treat my health data the same way.

Why not let a company with a good privacy record, and a business model to match, help me out in similar ways in other parts of my life?


I just thought of another example. When I'm somewhere in the city for a while, and I leave and walk to my car, I often get a notification like "14 minutes to drive home" from the Apple Maps app. My phone has learned that if I drive somewhere, park, hang out, then arrive back at the location of my car, it should tell me about traffic and show the route home.

This is all happening on my phone, though. On my device. It is passively aware of my location and tries to be as helpful as it can. Nothing is being sent to servers for tracking and advertising purposes. The map data and route finding that do involve Apple's servers are requested anonymously and in sections.
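The trigger for that kind of notification can run entirely on-device. Here is a toy Python sketch of a geofence check against a remembered parking spot; the names, the 50-metre radius, and the overall shape are my own assumptions for illustration, not anything Apple has published:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class Fix:
    """A single GPS fix: latitude and longitude in decimal degrees."""
    lat: float
    lon: float


def haversine_m(a: Fix, b: Fix) -> float:
    """Great-circle distance between two fixes, in metres (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(h))


def near_parked_car(current: Fix, parked: Fix, radius_m: float = 50.0) -> bool:
    """All local: am I back within radius_m of where the car was left?

    A real implementation would then kick off a (separate, anonymised)
    route-time lookup; this sketch only decides whether to bother.
    """
    return haversine_m(current, parked) <= radius_m
```

For example, a fix one ten-thousandth of a degree away (roughly a dozen metres) counts as "back at the car", while a fix across town does not. The point of the sketch is that nothing here needs to leave the phone.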

I paid Apple more money for my phone to make sure they have money to make good maps and mapping services. Google needs my data, and to show me adverts, to pay for that. My car cost €80 extra to have a map too, and has a data connection for traffic routing. Of these three options, I like paying up front and knowing my data isn't being used for non-map-improving reasons.

Why not something similar with sound and vision?


Funnily enough, since I wrote the above, Apple has released exactly what I predicted: second-generation AirPods that are always listening for "Hey Siri", so you don't need to tap them to initiate interactions.

They're still not noise isolating or cancelling in any way, shape, or form, and that's a complete non-starter. Otherwise, I would probably buy two.

Well, I have use cases for both isolating and non-isolating headphones. I have one pair of each hanging on my desk for a reason.

For tiny wireless AirPods, I'm only going to be using them when I'm out and about, and therefore only want isolation.