How did this pair of smart glasses, which almost failed, become a hit?

Author: ifanr

From overpriced gimmick to sell-out hit, Meta's Ray-Ban glasses took two years.

In September 2021, Meta and Ray-Ban released the first generation of smart glasses, "Ray-Ban Stories", yet by February 2023 the device had only 27,000 monthly active users, less than 10% of the total units sold.

In September last year, the second-generation co-branded product, Ray-Ban Meta, debuted alongside Quest 3 at a starting price of $299 (about 2,166 yuan), the same as the first generation. In Q4 2023 alone it shipped more than 300,000 pairs, nearly matching the first generation's lifetime sales.

With two generations that look almost identical, how did the second one win the sales battle so decisively?

Changed the name, changed the world

Put the two generations side by side and it is hard to tell at a glance which is which; even the official introduction emphasizes the continuity of the design:

Ray-Ban Meta retains the fashionable form of the first generation (Ray-Ban Stories), in the timeless Wayfarer style you know and love.

Still, Ray-Ban Meta offers more freedom for personalization.

The new eyewear comes in three new transparent frame colors, with more than 150 frame-and-lens combinations to choose from on the Ray-Ban Remix platform.

Lay the spec sheets side by side, though, and it is easy to see what the iteration brings.

First of all, Ray-Ban Meta's camera and image quality have improved considerably.

Ray-Ban Stories (1st generation) had a 5 MP camera, and both photos and videos were low quality, often falling short of even the "usable" threshold.

The Ray-Ban Meta (2nd gen) is equipped with a 12 MP ultra-wide camera, producing much sharper photos and supporting up to 60 seconds of 1080p/60fps video recording.

Along with crossing the usability threshold, storage has been expanded from 4 GB to 32 GB, so in some scenarios the glasses can stand in for a phone camera and record small "slices" of daily life.

Another big upgrade is the speakers: Ray-Ban Meta has new custom-designed drivers with twice the bass and 50% higher maximum volume.

Open-ear audio has long suffered from sound leakage and distortion, keeping it well short of in-ear and over-ear headphones.

The second-generation glasses narrow that gap with the improved bass and maximum volume, and the added "directional audio" technology also reduces leakage.

In addition, Ray-Ban Meta is equipped with a "five-microphone array" that supports immersive audio recording.

So when you record video clips with the glasses, the device captures sound faithful to the surrounding environment, and the multichannel recording reproduces the real ambience on playback.

Higher performance means higher power draw, and for this Ray-Ban Meta pairs the glasses with a charging case that extends battery life accordingly.

The new Qualcomm Snapdragon AR1 Gen 1 platform offers better power management: a single charge supports up to 4 hours of continuous use, and the case provides up to 8 additional charges, for a total of 36 hours.
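The quoted battery figures are easy to sanity-check. A throwaway sketch (the constants are just the numbers above, and the names are made up for illustration, not any official spec):

```python
# Sanity-check the battery-life claim: 4 hours per charge, plus a case
# holding 8 extra full charges, should total 36 hours of use.

GLASSES_HOURS_PER_CHARGE = 4  # single continuous-use runtime
CASE_EXTRA_CHARGES = 8        # additional full charges from the case

def total_use_hours(hours_per_charge: int, extra_charges: int) -> int:
    """Initial charge plus every recharge the case can supply."""
    return hours_per_charge * (1 + extra_charges)

print(total_use_hours(GLASSES_HOURS_PER_CHARGE, CASE_EXTRA_CHARGES))  # 36
```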

Ray-Ban Meta's hardware upgrades not only take the digital experience up a level but also keep the physical weight in check.

New materials slim down the overall profile and make the glasses lighter and more comfortable to wear.

Compared with the first generation, the upgraded body is lighter, more capable, and more durable, and third-party software integration opens up many practical scenarios for Ray-Ban Meta.

After wearing them, ZDNET found the second-generation glasses "surprisingly light; they don't cause fatigue like other high-tech wearables."

▲ Image from: Google

Users can connect Ray-Ban Meta to Facebook or Instagram to start a live stream, and press and hold the button on the right temple to have the device read out comments so they can interact with viewers.

Ray-Ban Meta now also supports sharing live video during WhatsApp and Messenger calls, giving the other party a first-person view from the glasses.

None of this requires taking out a phone. The features are simple, even rudimentary, but every small function and hardware upgrade earns its keep in daily use, which is why Ray-Ban Meta has received far more praise than the first generation.

Currently, Ray-Ban Meta holds a 4.3/5 rating on Amazon, and that score may well keep climbing from next month, because the "complete form" of Ray-Ban Meta is finally here:

On the 23rd of this month, Meta rolled out the feature that completes the second-generation glasses: Meta AI.

The glasses are decent, and the AI has no surprises

Meta glasses don't guarantee everything.

That's the conclusion that The Verge came to after testing Meta AI.

Meta AI's main features are the same as most AI hardware's: voice interaction, large-model support, a "GPT" on the bridge of your nose.

In short, it's an intelligent assistant that lives in the glasses: say "Hey Meta" followed by a question or command, and Ray-Ban Meta will handle it.

First, Meta AI can respond to what the user sees (as captured by the camera) and analyze it with a large model.

The fun here is actually twofold: at first use, it really can answer your questions fairly accurately; then, once you get used to "talking to yourself", you keep wanting to find the limits of its Q&A, and to see when it will finally embarrass itself.

The Verge used Ray-Ban Meta to identify a cat held in hand with ease, and Meta AI went on to describe details on the cat, all of them accurate.

▲ Image from: The Verge

Funnily enough, the AI isn't always on the mark; for example, it mistook a marmot for a "fat squirrel."

Vivid as that description is, mistakes like this just make you want to keep playing, as long as they cause no real trouble.

However, in more complex scenarios, Meta AI doesn't always hold up.

The Verge reporter Victoria Song used Ray-Ban Meta at the roadside to identify the brands of cars speeding past, and she found:

Like most artificial intelligence, Meta is sometimes accurate, but often wrong.

In addition, Ray-Ban Meta can also translate the foreign language in front of you.

According to the official introduction, the second-generation glasses can translate the French menu in front of you and read it out in your native language.

Verbatim translation is fairly slow, and the process is a bit cumbersome, even awkward, but put it another way:

AI is just an add-on to the glasses, and your Ray-Bans can now serve as your simultaneous interpreter.

Seen that way, the value of AI for wearable devices rises in an instant.

The large model currently built into Ray-Ban Meta is Llama 2. Beyond the extended functions above, it can also drive the device's native functions:

  • Report the weather
  • Play a song
  • Identify and announce the song that's playing
  • Start the camera for photos or video

This is table stakes for devices running large models, so no great surprises here; in fairness, even without AI, a plain voice assistant could handle this kind of function dispatch on its own.
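The function dispatch above boils down to matching a wake word plus a command to a handler. A minimal sketch of that pattern, where all command strings, handlers, and canned responses are hypothetical, not Meta's actual API:

```python
# Minimal intent-dispatch loop of the kind a voice assistant performs:
# detect the wake word, then route the remaining command to a handler.

def report_weather() -> str:
    return "Sunny, 22°C"

def play_song() -> str:
    return "Playing music"

def start_recording() -> str:
    return "Recording video"

INTENTS = {
    "what's the weather": report_weather,
    "play a song": play_song,
    "take a video": start_recording,
}

WAKE_WORD = "hey meta"

def handle_utterance(utterance: str) -> str:
    """Route a spoken command to its handler after the wake word."""
    text = utterance.lower().strip()
    if not text.startswith(WAKE_WORD):
        return ""  # ignore speech that lacks the wake word
    command = text[len(WAKE_WORD):].strip(" ,")
    handler = INTENTS.get(command)
    return handler() if handler else "Sorry, I can't do that yet."

print(handle_utterance("Hey Meta, play a song"))  # Playing music
```

Real systems match intents fuzzily rather than by exact string, but the shape of the loop is the same.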

ZDNET commented:

Meta AI, the built-in chatbot for glasses, really makes these smart wearables "smart", but there's still room for improvement.

Making devices that claim to be smart genuinely intelligent is a challenge the whole AI industry faces; we can start by offering a few ideas for the future.

Take voice Q&A alone: compared with the past, large models do expand the range of questions a device can answer, but that alone doesn't deserve the name intelligent Q&A; real intelligence should feel "human."

Today's conversations with intelligent assistants, Meta AI included, are turn-based: you finish speaking, then I reply. That's polite, but it's not how people actually talk.

Being able to interject mid-exchange, interrupt the AI at any moment, and have it pick up where it left off, like a friend would, is how today's AI Q&A could take a step toward intelligence.

Only then could you say the glasses "run on a large model," rather than the device merely "shipping with" one.
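As a thought experiment, that kind of "barge-in" amounts to polling for user speech between chunks of the assistant's answer. A minimal sketch, with every name hypothetical:

```python
# Hypothetical sketch of "barge-in": the assistant streams its answer in
# chunks and yields the floor the moment the user starts speaking.

def stream_response(chunks, user_interrupted):
    """Speak chunks one by one; stop as soon as the user interrupts.

    `user_interrupted` is a callable polled before each chunk; in a real
    system it would come from the microphone's voice-activity detector.
    """
    spoken = []
    for chunk in chunks:
        if user_interrupted():
            break  # hand the conversation back to the user mid-answer
        spoken.append(chunk)
    return spoken

# Simulate an interruption arriving after the second chunk.
polls = iter([False, False, True, True])
print(stream_response(["The weather", "is sunny", "and 22°C", "today"],
                      lambda: next(polls)))
```

Resuming after an interruption would additionally require remembering the unspoken chunks, which is exactly the state today's turn-based assistants throw away.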

Find the right positioning, and play the supporting role well

If this were the first time I had seen AI on a wearable device, I would likely have led with a headline like "AI, Redefining Ray-Ban Meta."

However, after experiencing the so-called new era of smart products such as AI phones, the Ai Pin, and the Rabbit R1, I, like many friends, have developed antibodies to AI hype words like "disruptive," "revolutionary," and "pioneering."

The tension between lofty goals and high expectations on one side, and the real limits of hardware and software on the other, is only intensified by the "bubble confidence" new technology gives manufacturers.

By the time we as users actually get the product, the marketing rarely matches the device in hand, which is why so much AI hardware ends up controversial:

  • If a mobile app can do it, why buy a dedicated piece of hardware?
  • Why change your operating habits for AI that isn't even easy to use?

The flood of bad reviews for the Ai Pin has only deepened such doubts.

Still, it is too early to write off such devices entirely.

I agree with The Verge: even after Humane's botched demonstration, Limitless and Ray-Ban Meta still give AI hardware hope, because they are more grounded.

The biggest thing the smart pendant and the second-generation smart glasses have in common is that neither forces a new product form into existence; instead, they upgrade products that already exist.

Limitless embedded an LLM into a traditional portable recording mic, with no bells and whistles; the AI is icing on the cake for dedicated recording, and battery life is almost unaffected.

Ray-Ban Meta clearly understands this. It did not go all in on AI from day one: it started as "tech meets fashion" glasses, gradually added open-ear speakers, cameras, and a privacy light, and only after half a year of finding its footing in the market did it officially announce Meta AI.

When you stop fixating on AI, drop the sky-high expectations, and focus on the product itself, you will find that a traditional accessory with AI features can still bring plenty of surprises.

Today's AI, as a mere accessory and bonus feature, is the shampoo sample or the free bottle of detergent thrown in with a purchase.

Freebies are fun to collect, but expecting them to carry your daily life from the start is not just asking too much; it is a recipe for disappointment.

So don't invent hardware forms lightly: the point of AI is to simplify interaction between humans and devices, not to raise the cost of learning.

Li Nan, the founder of Angry Miao, also mentioned on a podcast:

A good product that offers 3 features, each scoring 95 points, will always beat a product with 10 features that each score only 60 or 70.

▲ Ray-Ban Meta's customer rating on Amazon. Image credit: Amazon

Finally, I'd like to talk about why I'm bullish on Ray-Ban Meta.

In December 2016, the original AirPods were released, and TWS earbuds went on to replace wired earphones in large numbers, making it possible to wear a device on your ears for long stretches without noticing it.

Since then, the once-strange sight of people "talking to the air" has become everyday life for many, so there is little psychological pressure to talk to Ray-Ban Meta in public; it is, in effect, a Bluetooth headset in glasses form (though the privacy risks must be solved first).

Besides, it is a pair of glasses in its own right, so wearing it costs nothing extra; it is more portable and more sensible than an Ai Pin, and putting the most advanced features on the most everyday objects has always been a sound approach to trial and error.

Finally, to quote Li Nan's original words:

The closer (AI hardware) is to the head, the better. First, from the human dimension: since human sensors are on the face, putting AI sensors close to the face is a reasonable choice. And from another dimension: aren't a robot's sensors all on its face, too?

▲ Image from: Google

These devices are not the center; rather, they can be extensions of the human senses and of the phone's functionality. So watches, necklaces, glasses, earrings, and the like are the next step for AI tools.

Ray-Ban Meta at least proves that the idea of wearable AI devices has not faded: no need to take out a phone, just look ahead, ask a question, and learn something new.

Smart glasses must be glasses first, and smart second.

Or to put it another way:

Ray-Ban sunglasses sell for 1,000-3,000 yuan in Taobao flagship stores. Spend around that average price now and you get a pair of big-name sunglasses plus a Bluetooth headset, a mini GoPro, a five-microphone array, multichannel audio recording, and the Meta AI voice assistant.

Seen this way, even using Ray-Ban Meta just to set an alarm seems like excellent value.
