UK News
I wore Meta’s smartglasses for a month – and it left me feeling like a creep
Lately, I’ve been hearing Judi Dench’s voice in my head. She tells me tomorrow’s forecast, when to turn right, that there’s been another message in my group chat. Day or night, Dame Judi is eager to assist. When I ask the eight-time Academy Award nominee what I’m looking at, she answers: a residential area, a person in a pub, daffodils. “They are a bright yellow colour and are often associated with spring.”
This isn’t a delusion. This is, apparently, progress. I am test-driving Meta’s smartglasses and Dench voices its integrated AI assistant: “Here to chat, answer questions, create images and provide advice and inspiration,” said “Judi” when I selected her over the actors John Cena and Kristen Bell. “Shall we begin?”
Over the next decade, predicts the Meta founder Mark Zuckerberg, smartglasses will gradually become “the main way we do computing”, fulfilling many of the same functions as smartphones – taking photos, playing music, making calls, giving directions. For people who wear glasses, Zuckerberg has suggested, the upgrade is a no-brainer, bundling more features into an essential accessory. And for those of us who don’t, it is only a matter of time. In 2025, Meta sold more than 7m pairs globally.
Are they really the future, bringing us the benefits of tech without the tyranny of screens, or will they trap us deeper in the digital world? To see for myself, I wore a pair for a month.
The most common response to my new frames is: “Why?” I don’t usually wear glasses, and these clear-lens Wayfarers (part of Meta’s collaboration with Ray-Ban) are on the heavy side. I look like the nerdy girl in a 90s romcom, or the old guy from Up, but the sunglasses would have made me even more conspicuous if worn indoors. The other question I get asked is: “Are you filming me?” In general, I find people do not like being around someone wearing Meta glasses, not least because sometimes the answer is: “Yes.”
When I arrive home, my boyfriend, Marco, clocks me wearing them, then instantly freezes, as though he has unexpectedly come face to face with a predator. (Marco is not his real name; I have violated his privacy enough.) Although they are still fringe tech, concerns about privacy are mounting, amid reports of users covertly recording in public. Last month, journalists in Sweden found that moderators employed by Meta review intimate footage from the glasses, including of people using the toilet and having sex. (Meta responded that it took the protection of people’s data very seriously and was constantly refining its efforts and tools in that area. It said that unless users choose to share media they’ve captured with Meta or others, that media stays on the user’s device.) That’s not to mention the company’s reported plans to implement facial recognition.
Meta’s newer Display glasses have tiny screens embedded in the lenses for displaying text messages, maps and the like, but they cost $799 (£600) and are not yet available outside the US. My entry-level Gen 1 Wayfarers, which retail for £299, aren’t as functional as a smartphone.
It’s possible that smartglasses could bring about a paradigm shift, as an accessible first step towards wearable AI and augmented reality. Meta is certainly betting big on them, expanding its partnership with the eyewear giant EssilorLuxottica (the parent company of Ray-Ban, Oakley and more than 150 other brands) and heavily investing in its Meta AI. The pitch is that smartglasses remove friction between our physical and digital worlds, enabling the wearer to be more present “in the moment, with your head up and your hands free”, as Alex Himel, Meta’s vice-president of wearables, recently told Radio 4.
A Meta spokesperson said in an emailed statement that the company foresaw smartglasses being increasingly used alongside smartphones, not replacing them. Eventually, they will have more specific, unique functions, the spokesperson said, but for now they aim to be less distracting and more convenient than taking your phone out of your pocket. Wearing a camera does encourage me to take more photos while out with friends, but, when I later download them on to my phone, most of them are unfocused and awkwardly framed. It’s not easy, composing an image with your eyes.
I get more use out of the glasses as headphones. They pipe audio directly into your ear without blocking ambient sound; only someone standing very close to you could tell that music was playing. It’s nifty tech, but the trade-off is continual interruptions. Headphones signal to others that you are not listening; glasses don’t. I am self-conscious while taking calls in public, apparently talking to myself.
So far, so nonessential. Smartglasses’ real promise lies in the integrated AI: what Meta is hailing as “the most natural and seamless way to access an AI assistant in your daily life”. Judi can operate my phone via voice commands (“call Ian”, “take a picture”), answer questions and engage with what I can see – for example, by reading aloud printed text or identifying objects. Blind people and those with low vision have reported this as transformative. Using the Be My Eyes feature, they can even connect with a sighted volunteer, who then “looks through” their glasses’ camera and gives live feedback. It is easy to see smartglasses’ potential as assistive technology; they are already being used to help people with dementia, dyslexia and other needs. But, for now, they are not reliable or functional enough to offer consistent support or replace other aids.
Even Zuckerberg, presenting the new Meta Display on stage last year, repeatedly tried and failed to make a simple video call. My experience is similarly haphazard. Judi frequently mishears me, cuts out midway through answering my questions or reads aloud only some new texts in a chain, meaning I don’t feel confident giving her tasks without then checking them on my phone – defeating the point of “hands-free” tech.
The AI assistance is also limited. Judi can distinguish daffodils from daisies and confirm when I am in a pub (as opposed to outdoors), but she struggles with even slightly more involved requests. When I ask her to ID a friend’s hat, for example, Judi says she can’t yet help “with requests about product pricing or availability”. At Tate Modern, when I ask her to tell me about the artwork I’m looking at, Judi describes “a bed with white bedding and a blue mat underneath it [with] various items scattered”. It’s not wrong, exactly, but it’s not what I want to know about My Bed by Tracey Emin.
Of all the glasses’ selling points, I’d been most enthusiastic about the real-time translation, imagining it smoothing my interactions abroad and allowing me to eavesdrop on Marco, a native Italian speaker. If Marco was initially wary of the glasses, after two weeks he is sick of the sight of them, but indulges my request to role-play tourist and local.
I ask him for directions to “the nearest duomo”, looking directly at him, as instructed, with the Meta AI app open on my phone. Marco rattles off a response with authentic speed and dismissiveness. An English translation appears on my phone’s screen, after a slight lag, but it is incomprehensible. The interaction is significantly less fluid than typing into Google Translate and passing the phone back and forth. This is frequently the case in my month of Meta glasses: instead of a more seamless experience, they add another layer of tech, one that’s more fiendish to navigate.
There’s a reason smartglasses are not all that popular, suggests Iain Rice, a professor of industrial AI at Birmingham City University. As the technology evolves and improves, they may become standard for certain activities, but at present they conspicuously lack clear, valuable “use cases”. His take is that Meta wants to be seen as innovating alongside Google and Apple, but lacks the vision and business acumen to come up with genuinely essential tech. Rice points to the metaverse, Zuckerberg’s long-trailed digital world, from which Meta began shifting resources away last month, after $80bn in investment. Meta’s glasses could prove similarly misjudged, he thinks.
Most people don’t want to wear glasses if they don’t need to, Rice points out – which is partly why Google Glass failed 10 years ago. “Meta has spent time trying to make them well designed and trendy … but they didn’t seem to have the market research at the forefront: do people actually want this? Will they buy into it? Or is the tech just not there?”
It’s no surprise that take-up seems to have been highest among content creators, perpetually in need of more footage and fresh angles on their day-to-day lives. More specifically, Meta glasses have become associated with pick-up artists, social-media pranksters and other public irritants. It has become increasingly common online to see footage shot with glasses, showing the wearer’s point of view, although the person featured may not have consented to being recorded or even been aware.
A blinking LED on the frame alerts others that recording is in progress (and was made more visible for the Gen 2 model), but workarounds are widely shared online. Even unmodified, the light is easy to miss, especially in bright sunlight. To test this, on the tube home one evening, I guiltily snap a picture of the people across from me. No one notices, absorbed in their own devices. They might have twigged had I angled my phone in that now familiar cagey way, but people don’t yet think to check glasses for a light or a camera.
The number of content creators, and worse, capitalising on this, plus the absence of other obvious uses, has earned Meta glasses the nickname “pervert glasses”. Of course, not all users are ill-intentioned, but I can’t deny: not only do I feel creepy wearing them, but they also lead me to think like a creep. When I see a stranger walking her lookalike dog, and when I run into an ex for the first time in years, both times I have the fleeting thought: I wish I’d been recording.
I’m perturbed by how quickly my initial discomfort passes and the glasses become second nature. Just having a covert camera makes me want to use it, the possibilities of the tech overriding my better judgment and even basic decency.
The Meta spokesperson told me that, as outlined in the terms of service, the user is responsible for using them lawfully and “in a safe, respectful manner”: “As with any recording device, people shouldn’t use them for engaging in harmful activities like harassment, infringing on privacy rights or capturing sensitive information.”
The trouble is, as emerging tech, smartglasses complicate what we consider harassment, privacy and sensitive information. Some users, for example, argue that being able to film makes them feel safer, while others feel threatened by their presence. With no laws against recording in public places in the UK, whose rights take precedence: those who want to wear smartglasses or those who don’t want to be perceived by them?
This comes to a head for me on a solo Sunday-morning jaunt to Ikea. Realising I don’t have to suffer alone, I use the glasses to place my first video call, to Marco. At first, it’s all fun and games, showing him the Swedish snacks and soft toys. Then he tries to order me to fling some around and have a weird interaction with an employee. “Why?” I ask, pointing out that neither of us are typically disruptive in public. Marco apologises, blaming force of habit: the first-person point of view reminds him of playing a video game.
No other shoppers seem to notice me talking to myself, but the knowledge that I am broadcasting without their knowledge is genuinely queasy-making. When a small child crosses my path, I instinctively, immediately whip my head towards the shelves with a sinking feeling. I cannot fathom what sort of person would be untroubled by doing this or, worse, feel entitled to do it. After that, I never make another video call and stop dismissing Meta glasses as a gimmick.
Even in their present shonky state, Rice agrees, they are shaping up as a flashpoint in a bigger, existential discussion about just how much integration we want with tech. “They’re pervading slowly into society … If you see a person wearing them and don’t want to be recorded, unfortunately, the only way to make sure is by moving out of the way.”
Nonconsensual and covert filming are only the tip of the iceberg, he says. For Meta, smartglasses are a way to gather masses of data on individual users and whatever – or whomever – they’re seeing. Some media and interactions captured on Meta glasses could be used to train its AI, the company admits. “If you knew what was being done with all the imagery after you’d recorded it, I don’t think you’d be doing it the way you are,” Rice says.
Meta could make changes to protect bystanders’ privacy, he suggests, such as blurring and removing unapproved faces at the preprocessing stage. His concern is that, as the tech gets cheaper and better, all glasses will be made smartglasses, just as Bluetooth became the norm. “I think they’ve released a technology that the world wasn’t ready for, and definitely hasn’t regulated for, so there has to be some ownership.”
In the meantime, he says, we should feel confident pushing back: by asking friends wearing Meta glasses to take them off around us; and strangers we suspect of filming us to stop. What would Rice say to someone who was thinking about getting a pair? “I’d say: I don’t think you should.”
After a month, I’m glad to return the glasses to the office. I’m sick of seeing friends’ faces fall at the sight of me, of feeling trapped inside the computer, of hearing Judi say I’ve got a new text. Before I wipe the device’s memory, I alert Judi to my intentions and ask if she has any last words. “Thanks for letting me know,” she says.
NHS Wales major repairs backlog nears £1bn
The cost of fixing the most serious repairs at major hospitals alone is more than £600m, figures show.
Artemis II astronauts rocket towards the moon after breaking free of Earth’s orbit
The four Artemis astronauts have fired up their spacecraft’s engine to break away from Earth’s orbit and zoomed towards the moon, a milestone that commits Nasa to the first crewed lunar flyby in more than half a century.
With enough thrust to accelerate a stationary car to highway-driving speed in less than three seconds, the Orion capsule’s engine fired on Thursday, blasting the astronauts on to their trajectory towards the moon, which they will now loop as part of the 10-day Artemis 2 mission.
The burn, lasting just under six minutes, propelled them on a three-day voyage towards Earth’s natural satellite – the first crewed trip there since 1972.
“Looks like a good burn, we’re confirming,” mission control in Houston said.
“The crew is feeling pretty good up here on our way to the moon,” said astronaut Jeremy Hansen. “Humanity has once again shown what we are capable of.”
The burn came one day after the enormous orange and white Space Launch System (SLS) rocket carrying the Orion capsule blasted off flawlessly from Kennedy Space Center in Florida for the long-anticipated journey around the moon.
Now the astronauts are moonbound, there’s no turning back: they are on a “free return” trajectory, which uses the moon’s gravity to slingshot around it before heading back towards Earth without propulsion.
In the event something goes wrong, the astronauts wear suits that also serve as “survival systems” – in the unlikely case of a cabin depressurisation or leak, they’ll maintain oxygen, temperature controls and the correct pressure for up to six days.
The astronauts – Americans Reid Wiseman, Victor Glover and Christina Koch, and Hansen, a Canadian – spent their first hours in space performing checks and troubleshooting minor problems on the spacecraft that has never carried humans before, including a communications issue and a malfunctioning toilet.
They began the second day of their mission by playing Green Light by John Legend and Andre 3000, Nasa said – a reference to the go signal they would soon get to fire up the engine and move towards the moon.
They also had their first workouts on the spacecraft’s “flywheel exercise device” – each astronaut will carve out 30 minutes a day for fitness to minimise the muscle and bone loss that happens without gravity.
The 10-day Artemis 2 mission is aimed at paving the way for a moon landing in 2028.
The mission marks a series of historic accomplishments: sending the first person of colour, the first woman and the first non-American on a lunar mission. If all proceeds smoothly, the astronauts will set a record by venturing further from Earth than any human before – more than 250,000 miles (402,336km).
It is also the inaugural crewed flight of SLS, Nasa’s new lunar rocket.
SLS is designed to allow the US to repeatedly return to the moon with the goal of establishing a permanent base that will offer a platform for further exploration. It was meant to take off in February after years of delays and massive cost overruns. But repeated setbacks stalled it and even necessitated rolling the rocket back to its hangar for repairs.
The current era of US lunar investment has frequently been portrayed as an effort to compete with China, which aims to land humans on the moon by 2030.
During a post-launch briefing, Jared Isaacman, the Nasa administrator, said competition was “a great way to mobilise the resources of a nation”.
“Competition can be a good thing,” he said. “And we certainly have competition now.”
The Artemis program has come under pressure from Trump, who has pushed its pace with the hope that boots will hit the lunar surface before his second term ends in early 2029. But the projected date of 2028 for a landing has raised eyebrows among some experts, in part because Washington is relying heavily on the private sector’s technological headway.
The crew also held a press conference, in which they spoke about the significance of the mission, adapting to life in space and the “spectacular” view of Earth.
The Artemis II commander, Reid Wiseman, describing the moment the crew saw Earth as a whole, said: “You can see the entire globe from pole to pole, you can see Africa, Europe, and if you look closely, the northern lights. It was the most spectacular moment and it paused all four of us in our tracks.”
Crisis staff found 'unforgivable scene' at convicted undertaker's funeral home
An international crisis team was drafted in after police raided Hull’s Legacy funeral home.