Apple has reportedly scaled back its automotive aspirations, at least for now. Bloomberg’s Mark Gurman says the company’s decade-old vehicle project has pivoted from planning a fully self-driving car to an EV more like Tesla’s. The so-called “Apple Car” is now projected to launch no earlier than 2028 — two years after the company’s last reported target date.
The car’s autonomous features have reportedly been downgraded from a Level 5 system (full automation) to a Level 4 system (full automation in some circumstances) — and now to a Level 2+ one (partial automation). That would mean it offers limited self-driving features like lane centering and braking / accelerating support — while still requiring the driver’s full attention.
Tesla’s Autopilot is categorized as Level 2. Level 2+ isn’t an official designation, but it’s sometimes used informally to describe a more advanced version of Level 2.
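For readers keeping the jargon straight, here's a minimal, purely illustrative sketch (in Python, not anything from SAE or Apple) of how those automation levels map to driver involvement. The descriptions for Levels 2, 4 and 5 follow the article's framing, the others are standard SAE J3016 shorthand, and "2+" is the informal label noted above.

```python
# Illustrative only: a rough map of the SAE driving-automation levels discussed above.
# "2+" is an informal industry label, not an official SAE designation.
SAE_LEVELS = {
    0: "No automation: the driver does everything",
    1: "Driver assistance: steering OR speed support, one at a time",
    2: "Partial automation: lane centering plus braking/accelerating support; driver must pay full attention",
    "2+": "Informal label for a more advanced Level 2 system (still driver-supervised)",
    3: "Conditional automation: hands-off in limited conditions; driver must take over on request",
    4: "Full automation in some circumstances (e.g., within a geofenced area)",
    5: "Full automation everywhere, no driver needed",
}

def describe(level):
    """Return a short description for an SAE automation level."""
    return SAE_LEVELS.get(level, "Unknown level")

if __name__ == "__main__":
    # The reported Apple Car trajectory: Level 5 -> Level 4 -> Level 2+.
    for step in (5, 4, "2+"):
        print(f"Level {step}: {describe(step)}")
```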
What Apple once envisioned as a car without a steering wheel or pedals — and perhaps with a remote command center ready to take over for a driver — now looks more like a Tesla-style market entry.
Tesla’s Model 3
Photo by Roberto Baldwin / Engadget
Bloomberg says Apple views the project’s downscaling internally as “a pivotal moment.” People familiar with Apple’s plans allegedly believe delivering the pared-down Apple Car with reduced expectations could make or break the entire project. “Either the company is finally able to deliver this product with reduced expectations or top executives may seriously reconsider the project’s existence,” Gurman wrote.
Apple has reportedly talked with potential manufacturing partners in Europe about the updated strategy. Bloomberg says the company still wants to offer a Level 4 autonomous system at some point, even as its debut is on track for something more grounded.
Bloomberg describes the meetings leading up to Apple’s decision as “frenzied,” involving CEO Tim Cook, the Apple board and project head Kevin Lynch. The latter took over after former leader Doug Field left in 2021. (Field was a former Tesla engineering head who now leads Ford’s EV wing.) The board reportedly pushed leadership about the car plan throughout 2023.
Apple’s Project Titan has been the subject of rumors since at least the mid-2010s. The company has spent hundreds of millions of dollars on the initiative. It’s worked on “powertrains, self-driving hardware and software, car interiors and exteriors, and other key components,” according to Gurman. Given how many times the expensive project’s details have changed, don’t be surprised if they do so again.
This article originally appeared on Engadget at https://www.engadget.com/the-apple-car-apparently-still-exists-could-debut-in-2028-with-reduced-autonomy-203458008.html?src=rss
No, NVIDIA's mid-range RTX 40-series GPUs aren't getting any cheaper, but at least the new RTX 4070 Super packs in a lot more performance for $599. We called the original RTX 4070 the "1,440p gaming leader," and that still holds for the Super. It's so much faster, especially when it comes to ray tracing, that it edges close to the $799 RTX 4070 Ti (due to be replaced by its own Super variant, as well). And together with the power of DLSS3 upscaling, the 4070 Super is a far more capable 4K gaming card.
So what makes the RTX 4070 Super so special? Raw power, basically. It features 7,168 CUDA cores, compared to 5,888 on the 4070 and 7,680 on the 4070 Ti. Its base clock speed is a bit higher than before (1.98GHz compared to the 4070's 1.92GHz), but it has the same 2.48GHz boost clock and 12GB of GDDR6X VRAM as the original.
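To put those spec bumps in perspective, here's a quick back-of-the-envelope comparison using only the core counts quoted above (a sketch, not NVIDIA's numbers):

```python
# Quick spec comparison using the CUDA core counts quoted above.
cores = {"RTX 4070": 5888, "RTX 4070 Super": 7168, "RTX 4070 Ti": 7680}

uplift_vs_4070 = (cores["RTX 4070 Super"] / cores["RTX 4070"] - 1) * 100
gap_closed = (cores["RTX 4070 Super"] - cores["RTX 4070"]) / (cores["RTX 4070 Ti"] - cores["RTX 4070"]) * 100

print(f"CUDA cores vs. the 4070: +{uplift_vs_4070:.0f}%")                    # ~ +22%
print(f"Share of the 4070 -> 4070 Ti core gap closed: {gap_closed:.0f}%")     # ~ 71%
```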
The difference between the RTX 4070 Super and the plain model was immediately obvious. On my desktop, powered by a Ryzen 9 7900X with 32GB of RAM, I was able to run Cyberpunk 2077 in 4K with Ultra graphics and DLSS at an average of 78fps. The RTX 4070 sometimes struggled to stay above 60fps at those settings. NVIDIA’s new GPU showed its limits in Cyberpunk's RT Overdrive mode (which enables intensive real-time path tracing), where I only saw 51fps on average while using DLSS and frame generation. (CD Projekt says that mode is meant for the RTX 4070 Ti and up, or on the 3090 at 1080p/30fps).
While the original RTX 4070 was a card that could occasionally let you game in 4K, the 4070 Super makes that a possibility far more often (so long as you can use DLSS). Of course, you'll need to have reasonable expectations (you’re not getting 4K/120fps) and ideally a G-Sync monitor to smooth out performance.
| | 3DMark TimeSpy Extreme | Port Royal (Ray Tracing) | Cyberpunk 2077 | Blender (GPU) |
| --- | --- | --- | --- | --- |
| NVIDIA RTX 4070 Super | 9,830 | 12,938 / 60 fps | 1440p RT Overdrive DLSS: 157 fps | 6,177 |
| NVIDIA RTX 4070 | 8,610 | 11,195 / 52 fps | 1440p RT DLSS: 120 fps | 6,020 |
| NVIDIA RTX 4070 Ti | 10,624 | 14,163 / 66 fps | 1440p RT DLSS: 135 fps | 7,247 |
| AMD Radeon RX 7900 XT | 11,688 | 13,247 / 61 fps | 1440p FSR RT: 114 fps | 3,516 |
When it comes to 1,440p gaming, the RTX 4070 Super is truly a superstar. In Cyberpunk's Overdrive ray tracing mode with Ultra graphics settings, I saw an average of 157fps — almost enough to satisfy the demands of a 165Hz 1,440p monitor. To my eye, the whole experience looked far smoother than the 4K Overdrive results and, as usual, I found it hard to tell the difference between 4K and 1,440p textures during actual gameplay.
Similarly, I'd rather keep the 160fps/1,440p average I saw in Halo Infinite with maxed-out graphics than the 83fps I reached in 4K. That game doesn't get an assist from DLSS, either, so there's no upscaling magic going on in those numbers.
Across most of our benchmarks, the RTX 4070 Super landed smack dab between the 4070 and 4070 Ti. In 3DMark TimeSpy Extreme, for example, the new GPU scored 9,830 points, compared to 8,610 on the 4070 and 10,624 on the 4070 Ti. In some cases, like the Port Royal ray tracing benchmark, it leaned far closer to the 4070 Ti (which also bodes well for the 4070 Super's overclocking potential). NVIDIA's advanced cooling setup on its "Founders Edition" cards also continues to work wonders: The 4070 Super idled at around 40C and typically maxed out at 66C under heavy load.
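As a rough illustration of that positioning, the same sort of gap math can be applied to the TimeSpy Extreme scores above (again, just a sketch on top of our published numbers):

```python
# Where the 4070 Super's TimeSpy Extreme score falls between the 4070 and 4070 Ti.
score_4070, score_super, score_ti = 8610, 9830, 10624

position = (score_super - score_4070) / (score_ti - score_4070)
print(f"TimeSpy Extreme: the Super closes {position:.0%} of the 4070 -> 4070 Ti gap")  # ~61%
```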
Photo by Devindra Hardawar/Engadget
The RTX 4070 Super is clearly a big step forward from the original card, and a far better value for $599. It's a solid upgrade if you're running a 20-series NVIDIA GPU and even some of the lower-end 30-series options. The value should hopefully trickle downhill, as well: The original 4070 now sells for $550 on NVIDIA's website and used models are on eBay for well below that.
While we’ll continue to long for the days when “mid-range” described a $300 GPU, NVIDIA is giving gamers more of a reason to shell out for the $599 RTX 4070 Super. It’ll satisfy all of your 1,440p gaming needs — and it’s ready to deliver decent 4K performance, as well.
This article originally appeared on Engadget at https://www.engadget.com/nvidia-rtx-4070-super-review-a-1440p-powerhouse-for-599-160025855.html?src=rss
Rode, the Australian audio company that enjoyed breakthrough success with the Wireless GO and GO II, has unveiled a dual-transmitter version of the more affordable Wireless ME mic. If you can do without onboard recording, the dual-transmitter version could save you from buying extra gear for a multi-mic setup.
As wireless clip-on digital mics have exploded in popularity with creators, the (single transmitter) Rode Wireless ME has been a popular budget ($149) alternative to the $299 GO II. This dual-transmitter model is otherwise the same as the single-mic version. So, you’ll get the same Series IV 2.4GHz digital transmission, Rode’s GainAssist tech and “universal compatibility” with cameras, phones and computers.
Rode
Also, like the single-transmitter version of the Wireless ME, the new model’s receiver includes an extra “behind-camera” mic for a bonus audio source. In this case, that theoretically gives you a third mic — as long as your setup allows plugging it directly into your recording device. It works with the Rode Capture app (available for iOS and Android), which is aimed at creators.
Given that the Wireless ME is on the budget end of Rode’s lineup, the same compromises from the single-transmitter version apply. That includes the lack of a receiver display, onboard recording / storage or an option to record a safety track at a lower gain level. In return for those tradeoffs, you’ll likely save a few bucks vs. the higher-end GO II.
We say “likely” because Rode hasn’t yet said how much the dual-transmitter version will cost. (The single-mic variant costs $149, so you can probably assume it will be more.) The dual Wireless ME arrives this spring, so expect to hear about pricing as its release date approaches. It will be available in black and (for the first time in the ME series) white.
Rode has growing competition in this space. JBL launched a similar budget product — the $100 Quantum Stream — at CES 2024, and DJI just revealed the Mic 2, including a $349 dual-transmitter variant.
This article originally appeared on Engadget at https://www.engadget.com/rode-reveals-a-dual-transmitter-version-of-the-wireless-me-lapel-mic-181534298.html?src=rss
At CES 2024, ASUS seems to have taken people by surprise with the announcement of its AirVision M1 glasses, with some viewing it as an alternative to Apple’s Vision Pro headset. But I discovered that ASUS’ glasses are much more of a novel alternative to portable monitors than something meant for spatial computing.
The big difference between the AirVision M1 glasses and something like the Vision Pro or even Xreal’s Air 2 Ultras is that they don’t really support anything in the way of interactive AR. Sure, the glasses can project your desktop or multiple windows into space, but they need to be tethered to a nearby device and don’t recognize hand gestures or other virtual objects.
Photo by Sam Rutherford/Engadget
Instead, I found that its primary purpose is to give you extra screen space, but without the need to carry around big and bulky portable monitors. Featuring built-in microLED displays with a full HD resolution, the AirVisions can display up to six or seven virtual windows or desktops. You can also choose between a handful of aspect ratios (16:9, 21:9, 32:9 and more), with the glasses’ three degrees of freedom allowing you to either pin those screens in virtual space or have them track your head as you move around.
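If "three degrees of freedom" sounds abstract, here's a tiny conceptual sketch of the difference between pinning a screen in space and letting it follow your head. This is hypothetical pseudo-logic for illustration only, not ASUS' actual software.

```python
# Conceptual sketch of 3DoF virtual-screen placement (rotation only, no positional tracking).
# Illustrative pseudo-logic, not ASUS's implementation.

def screen_yaw(head_yaw_deg: float, pinned: bool, pinned_yaw_deg: float = 0.0) -> float:
    """Return the yaw (relative to your gaze) at which a virtual screen should be drawn.

    pinned=True  -> the screen stays put in the room and drifts out of view as you turn.
    pinned=False -> the screen is head-locked and follows wherever you look.
    """
    if pinned:
        # Draw the screen at its fixed world yaw, offset by how far the head has turned.
        return pinned_yaw_deg - head_yaw_deg
    # Head-locked: always dead ahead.
    return 0.0

# Turn your head 30 degrees to the right:
print(screen_yaw(30, pinned=True))   # -30.0 -> the pinned screen now sits 30 degrees to your left
print(screen_yaw(30, pinned=False))  # 0.0   -> the head-locked screen stays centered
```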
During my first demo, I used the AirVision M1s while tethered to a laptop, where they behaved almost exactly like a big floating desktop hovering six feet in front of me. At first, the virtual displays were a little blurry, but after a short adjustment period and some time dialing in my IPD (interpupillary distance), I was pleasantly surprised by how sharp everything looked. When compared to something like the Sightful Spacetop, which is billed as the world’s first AR laptop, not only did the M1s have a much larger vertical field of view (up to 57 degrees), they also didn’t require any additional special equipment, as the glasses are essentially plug and play. While I didn’t need them, it’s important to note that the glasses come with a pair of nose pads to help ensure a good fit, plus a prescription insert for people who need corrective lenses.
Once set up, it was pretty easy to create additional virtual workspaces. All I had to do was pull up a small command menu, press a plus sign where I wanted a new window to appear and that’s it. You can also freely adjust the overall size of the virtual display by zooming in or out. And one of the best things about the AirVisions is that using the laptop’s touchpad or typing wasn’t difficult at all. Because you can see through the virtual displays, I simply looked down and focused my eyes where they needed to go. That said, if you become distracted by something in the background, ASUS’ glasses also come with magnetic blinders that clip onto the front and provide a clean black backdrop.
However, my favorite use case was when I tried a different pair of the AirVisions connected to an ROG Ally, where the glasses provided me with a massive virtual screen for gaming. In this way, it’s a lot like wearing a headset such as the Meta Quest 3, but for non-VR games. This is the kind of device I would love to have on a plane, where space is at a premium, especially for something like a portable monitor. That said, I’m not sure I could handle the embarrassment of being a modern-day glasshole, at least not until devices like these become a bit more popular.
But perhaps the biggest difference between the AirVision M1s and Apple’s Vision Pro is price. While ASUS has yet to provide an official figure, a company spokesperson told me that ASUS is targeting around $700, versus $3,000 for Apple’s headset. And when you compare that to the price of a portable monitor, which often goes for between $250 and $400, and offers a lot less screen space, suddenly that price doesn’t seem too ridiculous.
So if you’re on the lookout for an alternative to a travel monitor, keep an eye out for ASUS’ AirVision M1 glasses when they become available sometime in Q3 2024.
We’re reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.
This article originally appeared on Engadget at https://www.engadget.com/the-asus-airvision-m1-glasses-give-you-big-virtual-screens-in-a-travel-friendly-package-234412478.html?src=rss
Google has officially voiced support for Right to Repair (R2R) legislation. Specifically, the company supports Oregon’s SB 542, championed by State Senator Janeen Sollman (D). Although Google’s motives could be less about newfound altruism and more about shaping regulatory action that seems increasingly inevitable, “a win’s a win,” as they say in sports.
The company expressed its new R2R stance in a blog post and white paper published Thursday. “Today, we’re excited to reaffirm our support for the Right to Repair movement by releasing our first white paper on repair while endorsing proposed Oregon Right to Repair legislation that offers a compelling model for other states to follow,” the company wrote.
Google lobbied against Right to Repair legislation as recently as March 2021 when it opposed the HB21-1199 R2R bill in Colorado. It’s also on record opposing AB1163 in California. The company’s stance had already shifted before today, in line with the direction of regulatory winds. (It partnered with iFixit for self-repairs starting in 2022.) But Google suggesting its announcement today is merely “reaffirming” a value it’s always stood behind (while ignoring documented evidence to the contrary) appears disingenuous.
Google’s suggestions for regulators
Google’s language in the white paper reveals a legislation-shaping tactic. An entire section titled “Policy Perspective” breaks down the language and boundaries the company believes R2R regulations should contain.
Within this policy section of the paper is a passage about “design flexibility,” urging lawmakers not to hamstring device makers by implementing strict design codes. “Well-intentioned regulations that set specific design requirements and standards in an effort to improve repairability may have unintended consequences that inhibit innovation and inadvertently lead to bad outcomes, such as more e-waste,” Google wrote in its white paper. “Design-related policies for repair should focus on defining repairability outcomes rather than setting strict design standards.”
Another item in the policy section, “reasonable implementation period,” calls for regulations that won’t disrupt existing manufacturing schedules. “Consumer electronics operate with lengthy product development timelines, often spanning years,” Google wrote. “New regulatory measures should phase in on a sensible timeline that ensures manufacturers can meet new requirements without undue burden. Regulations should not apply to products that are already designed and launched as such measures are problematic and may have negative unintended consequences, such as creating more e-waste.”
Neither of those requests seems egregiously unreasonable — and the points about e-waste could be taken at face value — but, coincidentally or not, they do also align with Google’s business interests.
An Apple dig and… Project Ara?
Google squeezed in a dig at Apple, too. “Policies should constrain OEMs from imposing unfair anti-repair practices,” the paper reads. “For example, parts-pairing, the practice of using software barriers to obstruct consumers and independent repair shops from replacing components, or other restrictive impediments to repair should be discouraged.”
Of course, Apple is notorious for parts-pairing, the practice of digitally linking part serial numbers to the device serial, locking out third-party repair services (and leaving the people who paid them with obnoxious incompatibility warnings).
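To make the mechanism concrete, here's a heavily simplified, hypothetical sketch of what a parts-pairing check boils down to. No vendor's firmware works exactly like this; it's purely illustrative.

```python
# Hypothetical illustration of parts-pairing: the device only fully enables a replacement
# part whose serial number was "married" to this specific unit at the factory or by an
# authorized tool. Not any vendor's real firmware.

PAIRED_PART_SERIALS = {"DISPLAY-A1B2C3"}  # serials provisioned for this particular device

def check_replacement_part(part_serial: str) -> str:
    """Simulate how a paired device might treat a newly installed part."""
    if part_serial in PAIRED_PART_SERIALS:
        return "Part accepted: full functionality enabled."
    # Genuine or not, an unpaired part triggers warnings or disabled features.
    return "Unknown part: show warning, disable some features."

print(check_replacement_part("DISPLAY-A1B2C3"))  # factory-paired screen
print(check_replacement_part("DISPLAY-ZZ9999"))  # third-party replacement, same model
```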
Project Ara, which made it to the Google graveyard before hitting store shelves, was shouted out in the white paper.
Google
Google’s paper highlights examples from its history of supporting R2R and similar initiatives, even calling out the (cancelled) Project Ara modular phone from a decade ago as an example of projects that “push the boundaries and better understand our users’ needs for repair.” (If it had only made it to consumers.)
The paper also touts Google’s buildouts of its repair capabilities, seven years of software support for Pixels and seven years of support for hardware parts. All of this can be seen as a resounding victory for the R2R movement, even if corporations’ motives continue to be less noble than they like to let on.
This article originally appeared on Engadget at https://www.engadget.com/google-claims-to-reaffirm-right-to-repair-support-three-years-after-lobbying-against-it-205828956.html?src=rss
CES is the type of show where one is likely to come across all sorts of dorky, Geordi La Forge-esque smart glasses, but some do manage to include practical features — like ViXion's auto-focus eyewear. The company, which is a spin-off of Japanese optics specialist Hoya, showed off the ViXion01 at CES 2024, and it's aimed at people who struggle to focus their eyes due to strain, old age or the time of day. After a quick demo, I could also see myself benefitting from it on a daily basis. I even dig the futuristic look of these glasses, which are the work of Japanese design firm Nendo.
At the heart of the ViXion01 is its front-facing ToF (time of flight) sensor, which lets it measure the distance of your target object and quickly adjust its lenses on the go. The basic concept is similar to a University of Utah prototype that popped up at CES 2017, but ViXion managed to miniaturize and even begin to commercialize it. According to project director Toshiharu Uchiumi, his device will do wonders to enhance fine details in applications like model kit assembly or reading small print.
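The optics behind that trick are simple enough to sketch: the lens power needed to focus at a given distance is roughly the reciprocal of that distance in meters. The snippet below is my own simplification of the idea, not ViXion's actual algorithm.

```python
# Illustrative sketch: converting a ToF distance reading into the lens power needed to
# focus there. Rough thin-lens approximation, not ViXion's firmware.

def focus_power_diopters(distance_m: float) -> float:
    """Lens power (in diopters) required to focus on an object distance_m meters away."""
    return 1.0 / distance_m

for target_m in (2.0, 0.5, 0.05):  # a far object, arm's length, roughly two inches
    print(f"{target_m:>5} m -> {focus_power_diopters(target_m):.1f} D of focusing power")

# A user's own prescription (e.g., roughly -4.5 D for myopia) is dialed in separately at
# setup; the auto-focus then handles the distance-dependent part on the fly.
```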
Photo by Richard Lai / Engadget
Initial setup was straightforward. I first had to manually slide both lenses to get a single centered image, then I moved on to the diopter adjustment dial on the right to bring each eye into focus, which worked fine despite my myopia of around -4.5 D (you can also set this up in the app via Bluetooth 5.0 connectivity). While the ViXion01 weighs just 50 grams, the main circuitry and battery housed on the right side created a noticeable imbalance, though I didn't have enough time to tweak the fit using the bendable left arm and nose pads.
Otherwise, it was a surprisingly pleasant visual experience. My eyesight was sharper than usual at both far and close distances (down to two inches), and I didn't feel as much eyestrain as I usually get when looking at things up close — seemingly ideal for when I work on my Gundam models. The automatic switching between different focal distances felt swift and seamless as well.
There are some caveats, though. First off, ViXion stresses that this is not a medical device, and you should avoid wearing it while driving or exercising — makes sense given the limited field of view due to the black rings housing the lenses. With that in mind, the 10-hour battery life should be sufficient, and then it's a three-hour charge via USB-C. It's also not waterproof, though it is rated IPX3 for water resistance, which can probably tolerate light rain or sweat.
The ViXion01 is now available for pre-order in Japan for 99,000 yen (about $690), with shipments expected to begin in February. There's no plan on an overseas launch just yet, but if that ever happens, chances are ViXion will need to come up with a wider version, anyway.
We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.
This article originally appeared on Engadget at https://www.engadget.com/vixion01-glasses-reduce-eyestrain-by-doing-the-focusing-for-you-205106281.html?src=rss
Some companies take monitors, TVs or AI-fueled laptops to CES. Others bring a toilet seat you can talk to. The 151-year-old bathroom appliance company Kohler will introduce the PureWash E930 Bidet Seat in Las Vegas next week. The accessory fits onto most elongated toilets, transforming your dumb can into an Alexa- or Google Assistant-powered smart-loo.
Kohler says the PureWash E930 Bidet Seat “brings you the freshness of personal cleansing in a slim, low-profile design.” At $2,149, it isn’t cheap, but it could save you money — and installation hassles — compared to full-on smart toilets. (Kohler has models in the $8,000 to $10,000 range.)
The PureWash E930 opens and closes its cover hands-free when it senses motion. It has a self-cleaning mode, using its built-in UV light. Using Alexa or Google Home, you can control the smart seat hands-free (no small luxury with bathroom gear). Amazon and Google’s voice assistants can turn on the bidet spray, warm air dryer and UV cleaning at your command.
Kohler
The accessory, which supports front and rear wash modes, has a heated seat and a remote control with two programmable presets. The bidet automatically mists the toilet bowl before use “for more effective rinsing while flushing.” Its water temperature and pressure are adjustable, and you can choose between oscillating or pulsating sprays.
A boost spray mode (rear only) will automatically turn up your hindquarters-hosing to the maximum pressure setting. There’s also a child mode, which provides a “soft, gentle wash” for the little ones. When it’s time to dry, it includes a warm-air system with adjustable temperature settings.
The seat has LED lighting to turn your toilet into a nightlight. It even includes Kohler’s “Quiet-Close” technology that prevents seat slamming. The seat has a quick-release function, making it easy to remove for deeper cleanings.
The PureWash E930 is available for order now (in white) from Kohler’s website. A black colorway will be available in late February.
We’re reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.
This article originally appeared on Engadget at https://www.engadget.com/kohlers-voice-controlled-bidet-seat-turns-your-dumb-toilet-into-a-luxurious-smart-throne-174934904.html?src=rss
New Year’s resolutions are usually set with the best intentions – I may have failed at one of mine already — but the right tools (and resolutions, if I’m honest) can make achieving those goals easier.
Naturally, with all the wearables and smartwatches around, there’s a fitness theme to half of our guide, but smartwatches can help nudge you into better habits and even remind you to meditate, which is something I’ve set up on my Apple Watch this week.
We’ve also got to-do list app recommendations, cable organizing advice (that is a weak-ass New Year resolution) and help on how to cultivate a new reading habit in 2024.
What are your resolutions for the coming year? And what will you do (or buy) to achieve them?
– Mat Smith
You can get these reports delivered daily direct to your inbox. Subscribe right here!
Wireless TV, plug-and-play solar and next-gen headphones.
The Engadget team is prepping itself for another tour (in military terms) of Las Vegas. CES is back again. Alongside the glory of huge TVs in every size, new autonomous car tech and weird robots that will never make it outside of tech trade shows, we’ve compiled a few predictions for all the tech companies and startups planning to attend. We’ve got next-gen headphones, new display technology and more.
The show officially runs from January 9 to 12, though we'll be on the ground well before that. The first CES-related events will kick off on January 7, so get ready!
LG’s latest 4K projector looks a little like, well, I’ll say it: an objet d’art from TJ Maxx. I say that from a place of love: I’d love a projector that looks like this. I mean, it has a handle that looks like a crank! The style belies the high-tech insides. The CineBeam Qube can blast 4K images that measure up to 120 inches, with an RGB laser light source, a 450,000:1 contrast ratio and 154 percent coverage of the DCI-P3 color gamut.
The price has dropped permanently to $250 following the launch of the Quest 3.
Meta is permanently cutting the price of its Quest 2 VR headset to $250 following the launch of the Quest 3, according to Meta’s official Quest blog. The Quest 2 has been on sale at that price since Black Friday anyway, but a new official retail price might spell even better deals for the previous-gen model in the next few months.
NASA’s robotic Mars explorers were given some time off, as a natural phenomenon would likely interfere with communications. Leading up to the pause, the Curiosity rover was put in park — but its Hazard-Avoidance Cameras (Hazcams) kept snapping away. By the end of the period, Curiosity recorded the passage of a Martian day over 12 hours from its stationary position, as the sun moved from dawn to dusk.
This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-tech-to-help-you-stick-to-your-new-years-resolutions-121518467.html?src=rss
The last few years have been, to put it mildly, rough. And 2023 continued to bring sad tidings. Amid the humanitarian crisis that is the Palestine-Israeli conflict, plus increased fears around the credibility and reliability of AI and Elon Musk’s ongoing meltdown, tech’s biggest players also suffered their fair share of losses. This year, we saw the demise of the E3 gaming convention, the deterioration of popular online forums and the decline of cryptocurrencies, Silicon Valley banks and financial institutions. Not to mention the poor neighbors of the Twitter office in San Francisco who had to endure obnoxious, potentially epilepsy-triggering lights flashing from the building. While we can happily say “good riddance” to many of these things, it is with some sadness that we bid farewell, and offer our condolences, to some of this year’s worst developments.
The X, Twitter and Elon Musk fiasco
No “Losers in 2023” list is complete without mentioning the fiasco that is Elon Musk’s Twitter (or X). Last year, shortly after Musk acquired Twitter, some of us were asked to make predictions about how Musk’s new venture would fare. I felt that it was a high-risk, high-reward move that might work due to Musk’s combination of luck and smarts, based mainly on his previous success heading up Tesla and SpaceX.
However, I also said that Twitter might devolve into the most chaotic social media platform around, which is pretty much what happened. In hindsight, what I failed to account for was that unlike Tesla and SpaceX, Musk doesn’t seem to give a crap about running X like a business and has treated the company more as an expensive toy meant to call attention to the sins (at least in his mind) of social media. And when you combine his increasingly unhinged personality with shortsighted decisions, what you get is an organization in turmoil. So while not all of these things occurred in 2023, here are just a few of the dumbest things that Musk and X have done in the last 18 months.
A little over a year ago, Musk blew up Twitter’s verification system, which promptly led to fake accounts sporting seemingly legit handles doing things like posting an image of Mario flipping the bird, the pope spreading conspiracy theories and more. Then earlier this year in June, Musk decided to block users who weren’t logged in from seeing tweets, which caused Google and others to remove Twitter content from search results. That’s not a very smart move for a company that relies heavily on traffic to generate ad revenue, so it wasn’t a big surprise when Musk backtracked a week later.
But perhaps Musk’s biggest blunder was changing Twitter’s name to X in July, a move so silly that most people continue to pretend like the rebranding never happened. Oh and let’s not forget that the name change was commemorated with a sign that was mounted on the company’s HQ in San Francisco that blinded its neighbors and didn’t have proper permits, resulting in an installation that lasted barely more than a weekend. More recently, citing a rise in hate speech, major companies including Apple and Disney decided to pull ads from X, which later prompted Musk to tell Disney CEO Bob Iger to “Go fuck yourself.” Another clearly wise business move made by a very grounded individual. (That’s sarcasm, in case it’s not clear.)
At this point, it’s hard to imagine how much worse X can get, but given everything that’s happened in 2023, it’s plain that the company formerly known as Twitter hasn’t even hit rock bottom yet. — Sam Rutherford, Senior reporter
David Imel for Engadget
Microsoft’s Surface tablet
No offense to the Surface Laptop Studio 2, which is a mighty powerful and uniquely convertible laptop, but this year felt like a low point for Microsoft’s iconic Surface tablets. The Surface Pro 9 hasn’t been upgraded at all since last year, so it’s still running an older 12th-gen Intel chip. There is a 5G-equipped model with a custom ARM-based Microsoft SQ3 chip, but we recommend staying far, far away from that thing. And beyond the Laptop Studio 2, we only got the Surface Laptop Go 3 for consumers (the tiny Surface Go 4 tablet is now firmly targeted at business users; it doesn’t even show up on the main Surface site).
It almost seems like Microsoft’s dream of creating a true tablet/laptop hybrid is dead – or at the very least, it’s on pause as the company focuses on shoving its AI Copilot into all of its products. Let’s face it: While the Surface business has earned a bit of money for Microsoft, it’s a pittance compared to what the company sees from its Azure cloud revenue. Instead, the Surface devices proved that Microsoft could produce high-end Windows hardware that occasionally pushed the PC industry forward.
It’s been 11 years since Microsoft announced its first Surface devices, but it turns out most consumers didn’t want to replace their laptops with tablets. Simpler 2-in-1 convertible devices, like HP’s Spectre x360 16, are far more common these days (and notably, they also work best in their notebook modes). And it doesn’t help that Windows 11 is still far from tablet friendly. If you really want to get work done on a slate, it simply makes more sense to get an iPad and a keyboard case instead.
With Microsoft’s Surface visionary, Panos Panay, now at Amazon, there doesn’t seem to be much hope left for the company’s tablet concept. But who knows, maybe the Surface Neo will finally make a return as a true foldable some day. (Remember the Surface Duo, another failure?) A Windows user can only dream. — Devindra Hardawar, Senior reporter
Amazon
Amazon’s Halo hardware products
Speaking of dreams, mine were dashed by Amazon in July this year when the company pulled support for its Halo line of health-related hardware products. In fact, my sleep itself might have been affected, since I had just gotten used to checking my Halo app each morning to see the amount of rest I got the night before.
Amazon’s Halo division has been plagued with controversy since it launched the screenless Halo wearable in 2020. The device was a barebones activity tracker, but stood out for an opt-in feature that used onboard mics to listen to you speaking and tell whether you sounded stressed, upbeat or emotional. This caught a lot of attention, with some saying it was akin to Amazon trying to police your way of speaking. Many other reviewers, myself included, were more critical of the fact that, though the Tone feature did flag times when wearers sounded happy or sad, it did not present enough information for that data to be useful.
The Halo app also offered a way for you to use your phone’s camera for a body composition scan. You’d have to enter your height and weight, before stripping down to your underwear and posing for four pictures, showing your front, back and sides. The app would then tell you how much of your body is fat or muscle.
If it sounds dubious, it’s probably because it is. Amazon said its “Halo body fat measurement is as accurate as methods a doctor would use—and nearly twice as accurate as leading at-home smart scales.” Spoiler: It wasn’t. I used the Body feature every few months for about two years, comparing it to the bio-electrical impedance analysis (BIA) sensor on Samsung’s Galaxy Watch when that became available. Over time, as my body composition changed, I also got BIA scans at the F45 gym I go to, which uses a more sophisticated machine. Amazon’s scans were wildly off, while the Samsung watch came closer to the data gleaned from the machine at my gym.
Photo by: Cherlynn Low / Engadget
All that is to say that Amazon’s Halo products haven’t been great. But that seemed to start to change when the company launched the Halo Rise bedside sleep tracker this year. I loved it for the way it accurately detected when I fell asleep, calculated the different stages I was in (REM, Deep, Light etc) and more importantly how it did all that without requiring me to wear something to bed or install a new mattress. I finally had a feasible way to track my sleep and use that to figure out how hard or easy I should take each day’s workout, along with other activities and stresses.
Alas, that joy was short-lived. Despite Amazon acquiring healthcare companies and clearly investing more into becoming a pharmaceutical provider, it gave up on the Halo business this year. Maybe that’s not such a bad thing, since one good product doesn’t an entire profitable endeavor make. Amazon not having access to my sleep, heart rate, steps and tone is probably for the best, as we contemplate a future where the online shopping giant is also our doctor and pharmacist. — Cherlynn Low, Deputy editor
E3
For as long as I can remember, I’ve been reading and talking about games, but the internet expanded my horizons beyond the confines of the UK magazine industry. In the late ‘90s, at age 13, I started writing (very badly) for a popular game site, covering release dates, special editions and other unimportant things.
Within a couple of years I’d lost interest in writing, but I still hung out in the same IRC channels talking about games with likeminded people. IRC started my obsession with E3 and the Tokyo Game Show; weeks where I’d talk about these huge events with a weird milieu of fans and industry professionals.
In 2000, the fever around Metal Gear Solid 2’s E3 debut was out of this world. The first-person reports from the show were unbelievably positive. When the trailer finally became available to download a few weeks later, it quickly spread across the internet. I can still remember the mix of frustration and excitement as I downloaded it from an IRC bot at 7KB a second to finally get a glimpse of “next-gen” gaming.
MGS2 was peak E3 for me, and in hindsight it was also the moment E3 began to die: Why did I need to read a 1,000-word breakdown of a trailer when I could just download and watch it myself? Why should Konami spend big money on a booth when it could just release a trailer directly to its potential customers?
Back then, I was the only person I knew IRL who was “extremely online.” Now, everyone is. By the 2010s, when I started to attend E3 myself, the role of press and the show had shifted. Nintendo E3 Directs were in full swing, and the big shows from Sony, Microsoft, Bethesda, Ubisoft and EA were all beamed live to fans. Sure, I got to play some games and interview some developers, but that’s something that happens throughout the year now.
E3 remained one of the highlights of my calendar, and there were always some memorable moments — the PS4 and Xbox One reveals were probably the highlight of my in-person years — but by 2019, my excitement was more tied to seeing far-flung colleagues and old industry friends than it was the event itself. When the pandemic canceled the 2020 event, it was obvious it would never recover. We’d written about how the industry didn’t need E3 years before.
Summer Game Fest will happen again next year. It will never hit the scale of the show it’s replacing, but I hope that it becomes a strong enough brand to keep the idea of E3 going. There’s still something exciting for fans, and journalists, about a week of gaming announcements to predict and dissect. If more companies spread their events throughout the year, that last bit of E3 magic will be gone. — Aaron Souppouris, Executive Editor
Cryptocurrencies and finance in tech
Much as we pretend mathematics represents an immutable truth, we must remember it’s not without its loopholes. Centuries from now, historians researching crypto may assume humanity forgot that as it decided to substitute math for truth in its entirety. That the prodigies of this world sought to engineer out human fallibility between League of Legends sessions. Uncertain, wooly and hard-to-quantify concepts like “truth” and “trust” would be tossed out in favor of the certainty of pure math. That’s the PR line: The Bitcoin white paper describes the virtual currency as a “system based on cryptographic proof instead of trust.” It’s ironic, then, that so many high-profile people who hitched their mast to crypto are either in prison, or are awaiting trial for fraud.
Those same historians may wonder if crypto was merely a vehicle ripe for hijack by unethical types, or if its inherent fraudiness was written into its DNA. 2023 will offer plenty of material to scrub through given the number of figures who wound up face-to-face with law enforcement. Coinbase started the year accused of leaving gaps in its systems big enough to enable fraud, money laundering and drug dealing. Former Celsius CEO Alex Mashinsky was sued and later arrested — alongside the company’s chief revenue officer, Roni Cohen-Pavon. Not long after, Terraform Labs was charged by the SEC for securities fraud after it wiped out $45 billion or so. Bear in mind, this is a year-in-review story, and I’ve only managed to make it as far as February.
Binance, the world’s largest crypto exchange by volume, dominated headlines this year much as FTX had in 2022. Regulators accused it, and its founder Changpeng “CZ” Zhao, of deliberately undermining its own controls and processes to not-so-tacitly enable users to break the law. Zhao would plead guilty, step down as CEO and pay a hefty fine, which enabled the company to keep running. Oh, and we should mention the Winklevoss twins, their exchange and its partners, who were accused of defrauding investors to the tune of $1 billion. Ironic, then, that Ferrari finally decided to try to appeal to the Lambo-and-Tendies demographic by opening up crypto purchases for its cars just as things started to get tough.
Of course, the real loser in all of this has to be Michael Lewis who, with an MA in Economics and experience as a bond trader for Salomon Brothers in one hand, and a ringside seat with Sam Bankman-Fried in the other, managed to miss what was going on at FTX. Lewis has doubled down in support of his latest muse but now that SBF has been found guilty of fraud, it looks like his reputation as the most credible financial journalist of the age is in tatters. — Daniel Cooper, Senior reporter
STRF/STAR MAX/IPx
Reddit
I’ve been a longtime Reddit lurker, occasional poster and always a first-party app user. But when the drama about the company’s decision to start charging for API access started to unfold in April, my eyes were opened to the wonderful world of third-party Reddit clients. Too bad, though, that the company proceeded to then botch it all.
Because API access was no longer free, many apps like Apollo, RIF, BaconReader and Narwhal had to reconsider their pricing or shut down altogether. Reddit’s policy change didn’t just challenge these apps, which mostly offered superior browsing experiences to the company’s own. It also created problems for clients built for accessibility, rendering them unusable unless their developers ponied up fees that could run to tens of thousands of dollars (or, in Apollo’s case, an estimated $20 million a year).
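For a sense of how the Apollo figure adds up, here's a back-of-the-envelope sketch using the widely reported numbers from the dispute (roughly $0.24 per 1,000 API calls and about 7 billion requests a month); treat both as reported estimates rather than official Reddit pricing.

```python
# Back-of-the-envelope math behind the ~$20 million/year Apollo estimate.
# Figures are the widely reported ones from the API-pricing dispute, not official numbers.
price_per_1000_calls = 0.24       # USD, reported Reddit API rate
monthly_requests = 7_000_000_000  # Apollo's reported monthly API call volume

monthly_cost = monthly_requests / 1000 * price_per_1000_calls
print(f"~${monthly_cost:,.0f} per month")      # ~$1,680,000
print(f"~${monthly_cost * 12:,.0f} per year")  # ~$20,160,000
```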
While Reddit did eventually seem to concede that the API fees would shut out some users with disabilities and ended up working with some unnamed developers to give them free access, the company dug in its heels in the wake of public outrage and subreddit blackouts. In the second half of the year, subreddits all over the platform either stopped posting, changed their settings to private or NSFW or dedicated themselves to only putting up salacious images of Last Week Tonight host John Oliver.
Reddit didn’t just ignore the protests and carry on with its planned fees. It went as far as to forcibly take over some communities that went dark, while looking for volunteers to take over certain subreddits that it deemed to have violated its Moderator Code of Conduct.
According to internet analytics company Similarweb in June, Reddit saw a 6.6 percent drop in average daily traffic. We don’t have the latest statistics on how the company is doing now, but I can tell you from personal experience that the first-party app on iOS is a complete shitshow. Like many other Redditors have pointed out before, videos will autoplay unmuted out of nowhere for no reason, while I’ve encountered numerous infuriating bugs, including one where a video on a post was repeatedly going on and off mute while I was also trying to stream Spotify to a speaker. It just sucks.
After the mass subreddit blackouts spawned a bunch of duplicate communities with different moderators, the quality of posts has noticeably fallen, as well. Not to mention the company got rid of trophies and then attempted to bring them back in a confusing format. Throw in the fact that the community now seems to be a mix of karma-farming bots and commenters who copy and paste the same jokes over and over again, and the days of enjoyable Reddit scrolling seem to have come to an end in 2023. — Cherlynn Low
This article originally appeared on Engadget at https://www.engadget.com/techs-biggest-losers-in-2023-170017317.html?src=rss
Researchers from Linköping University in Sweden developed a ‘bioelectronic soil’ that can speed up the growth of plants in hydroponic setups, or farms that grow plants without soil in environments made up of mostly water and a place for roots to attach. After integrating the engineered ‘eSoil’ into the framework where seedlings grow, researchers discovered that sending electrical signals through the soil made plants grow 50 percent more on average.
The eSoil is made up of organic substances mixed with a conductive polymer. Eleni Stavrinidou, the supervisor of the study, told Engadget that the soil’s conductivity was necessary for stimulating the plant roots. In this particular study, the researchers examined the effect of sending signals to barley seedlings over the span of 15 days before harvesting them for analysis. Applying a voltage as small as 0.5V to the eSoil electrically stimulates the roots, Stavrinidou explained. This, in turn, resulted in a measurable increase in the biomass of the electrically stimulated plants compared to the non-stimulated seedlings.
The stimulation’s effect on the barley seedlings appears to come down to nutrient processing: Stavrinidou told Engadget that nitrogen, one of the main nutrients involved in plant growth, was processed more efficiently in the stimulated plants. "We found that the stimulated plants could process the nutrients more efficiently however we don't understand how the stimulation is affecting this process,” Stavrinidou explained, adding that the reason behind the growth boost will be a focus of future studies.
PNAS
While hydroponic techniques are mainly used to grow vegetables, leafy greens and some other crops, the eSoil could offer new ways to increase crop yields in commercial settings, especially in places where environmental conditions impact plant growth. The study highlights that this technique could minimize the use of fertilizers in farming.
The opportunity for technological innovation in farming is huge, considering the number of US farms has steadily declined since 1982, according to the US Department of Agriculture. Last year, the number of US farms fell to 2 million, down from 2.2 million in 2007. Not only are farms on the decline, but the US is also losing acres of farmland for a host of reasons, including a worsening economic outlook for farmers, making farming in controlled environments an increasingly attractive option.
But beyond improving crop yields, adding eSoil to hydroponic farms could also make them more energy-conscious. While traditional hydroponic farms use up less water, they still require a fair amount of energy to run. “The eSoil consumes very little power in the microwatt range,” Stavrinidou said. Before this technology can be applied to other types of crops, more studies need to be conducted to observe how electrical stimulation can impact the whole growth cycle of a plant throughout its entire lifespan, not just in the early stages of seedling maturation. Stavrinidou also said that her team plans on studying how the technique affects the growth of other plant species.
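As a sense of scale, a quick bit of arithmetic (my own, based only on the 0.5V and "microwatt range" figures quoted above) shows just how small the currents involved are:

```python
# Rough sanity check on the quoted figures: power P = V * I, so I = P / V.
voltage_v = 0.5   # applied voltage cited in the study coverage
power_w = 1e-6    # "microwatt range" -- assume ~1 microwatt purely for illustration

current_a = power_w / voltage_v
print(f"Implied current: {current_a * 1e6:.1f} microamps")  # ~2 microamps per microwatt at 0.5 V
```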
This article originally appeared on Engadget at https://www.engadget.com/swedish-researchers-develop-electronic-soil-that-speeds-up-plant-growth-205630538.html?src=rss