This has been a commonplace feature on SoCs for a decade or two now. The comments seem to be taking this headline as out-of-the-ordinary news, phrased as if OnePlus invented it. Even cheap devices often use an eFuse for anti-rollback. We do it at my work whenever root exploits are found that let you run unsigned code. If we don't blow an eFuse, then those security updates can just be undone, since any random enemy with hardware access could plug in a USB cable, flash the older exploitable signed firmware, steal your personal data, install a trojan, etc. I get the appeal of ROMs/jailbreaking/piracy, but it relies on running obsolete exploitable firmware. It's not like they're forcing anyone to install the security patch who doesn't want it. This is normal.
It ain't normal to me. If I bought a phone, I should be able to decide that I want to run different software on it.
Let's say OP takes a very different turn with their software that I am not comfortable with - say reporting my usage data to a different country. I should be able to say "fuck that upgrade, I'm going to run the software that was on my phone when I originally bought it"
This change blocks that action, and from my understanding if I try to do it, it bricks my phone.
The whole point of this is so that when someone steals your phone, they can't install an older vulnerable version of the firmware that can be used to set it back to factory settings, which makes it far more valuable for resale.
Phone thieves aren't checking which phone brand I have before they nick my phone. Your scenario is not improved by making OnePlus phones impossible to use once they're stolen.
I find it hard to believe that OnePlus is spending engineering and business resources, upsetting a portion of their own userbase, and creating more e-waste because they want to reduce the global demand for stolen phones. They only have something like 3% of the total market; they can't realistically move that needle.
I don't understand what business incentives they would have to make "reduce global demand for stolen phones" a goal they want to invest in.
It'd be ideal if the phone manufacturer had a way to delegate trust and say "you take the risk, you deal with the consequences" - unlocking the bootloader used to be this. Now we're moving to platforms treating any unlocked device as uniformly untrusted, because of all of the security problems your untrusted device can cause if they allow it inside their trust boundary.
We can't have nice things because bad people abused it :(.
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is things that don't need security (like say, reading a website without a log in - just establishing a TLS session) might go away for untrusted devices as well.
> We can't have nice things because bad people abused it :(.
You've fallen for their propaganda. It's a bit off topic from the OnePlus headline, but as far as bootloaders go, we can't have nice things because the vendors and app developers want control over end users. The Android security model is explicit that the user, vendor, and app developer are each party to the process and can veto anything. That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
> That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because it has that advertised, they should be on the hook to uphold that promise. The nasty thing on this thread is the potential rug pull by Oneplus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I don't entirely agree, but neither would I be dead set against such an arrangement. Consider that (for example) while private banks are free not to do business with you, at least in civilized countries there is a government-associated bank that will always do business with anyone. Mobile devices occupy a similar space; there would always need to be a vendor offering user-controllable devices. And we would also need legal protections against app authors, given that (for example) banking apps are currently picking and choosing which device configurations they will run on.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
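For reference, the relocking flow looks roughly like this (commands as documented in the GrapheneOS install guide; the avb_pkmd.bin key file name is theirs, and other ROMs that support relocking differ in the details):

    fastboot erase avb_custom_key
    fastboot flash avb_custom_key avb_pkmd.bin
    fastboot flashing lock

The first two commands enroll the ROM's verified-boot public key, and the lock step means the device will from then on only boot images signed with that key. After first boot you'd then toggle off "OEM unlocking" in Developer options to close the loop.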
How is that supposed to fix anything if I don't trust the hypervisor?
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Well I don't entirely, but in that case there's even less of a choice and also (it seems to me) less risk. The OEM software stack on the phone is expected to phone home. On the other hand there is a strong expectation that a CPU or southbridge or whatever other chip will not do that on its own. Not only would it be much more technically complex to pull off, it should also be easy to confirm once suspected by going around and auditing other identical hardware.
As you progress down the stack from userspace to OS to firmware to hardware there is progressively less opportunity to interact directly with the network in a non-surreptitious manner, more expectation of isolation, and it becomes increasingly difficult to hide something after the fact. On the extreme end a hardware backdoor is permanently built into the chip as a sort of physical artifact. It's literally impossible to cover it up after the fact. That's incredibly high risk for the manufacturer.
The above is why the Intel ME and AMD PSP solutions are so nefarious. They normalize the expectation that the hardware vendor maintains unauditable, network capable, remotely patchable black box software that sits at the bottom of the stack at the root of trust. It's literally something out of a dystopian sci-fi flick.
Once they have hardware access, who cares? They either access my data or throw it in a lake. Either way the phone is gone, and I'd better have had a good data backup and a level of encryption I'm comfortable with.
This not only makes it impossible to install your own ROMs, but permanently bricks the phone if you try. That is not something my hardware provider will ever have the choice to make.
It's just another nail in the coffin of general computing, one more defeat of what phones could have been, and one more piece of personal control that consumers will be all too happy to give up because of convenience.
Sounds like that should be an option in "Developer Options" that defaults to true, and can only be disabled after re-authentication / enterprise IT authorization. I don't see anything lost for the user if it were done this way.
According to OP this does not disable bootloader unlocking in itself. It makes the up-versioned devices incompatible with all previous custom ROMs, but it should be possible to develop new ROM releases that are fully compatible with current eFuse states and don't blow the eFuse themselves.
I understand that there is a nuance somewhere, but that's about it.
Can you explain it in simpler terms such that an idiot like me can understand? Like what would an alternative OS have to do to be compatible with the "current eFuse states"?
Yes, though note that since the anti-rollback is apparently implemented by the bootloader itself on this Qualcomm SoC, this will blow the fuse on devices where the new version is installed. So the unofficial EDL-mode tools that the community seems to be most concerned about will still be unavailable, and users will still be unable to downgrade from newer to older custom ROM builds.
I wonder, is there currently unpublished 0day on the SoC and they're forcing use of the latest firmware to ensure they're not vulnerable once the details become public? That would be a reason for suddenly introducing this without explanation.
This kind of thing is generally used to disallow downgrading the bootloader once there is a bug in its chain-of-trust handling. Otherwise once broken is forever broken. It makes sense from the trusted computing perspective to have this. It's not even new; it was already there on P2K Motorolas 25 years ago.
You may not want trusted computing and root/jailbreak everything as a consumer, but building one is not inherently evil.
If the user doesn't trust an operating system, why would they use it? The operating system can steal sensitive information. Trusted computing is trusted by the user to the extent that they use the device. For example, if they don't trust it, they may avoid logging in to their bank on it.
> If the user doesn't trust an operating system, why would they use it?
Because in the case of smartphones, there is realistically no other option.
> For example if they don't trust it, they may avoid logging in to their bank on it.
Except when the bank trusts the system that I don't (smartphone with Google Services or equivalent Apple junk installed), and doesn't trust the system that I do (desktop computer or degoogled smartphone), which is a very common scenario.
To trust an Android device, I need to have ultimate authority over it. That means freedom to remove functionality I don't like and make changes apps don't like. Otherwise, there are parts of practically every Android that I don't approve of, like the carrier app installer, any tracking/telemetry, most preinstalled apps, etc.
I recently moved to Apple devices because they use trusted computing differently; namely, to protect against platform abuse, but mostly not to protect corporate interests. They also publish detailed first-party documentation on how their platforms work and how certain features are implemented.
Apple jailbreaking has historically also had a better UX than Android rooting, because Apple platforms are more trusted than Android platforms, meaning that DRM protection, banking apps and such will often still work with a jailbroken iOS device, unlike most rooted Android devices. With that said though, I don't particularly expect to ever have a jailbroken iOS device again, unfortunately.
Apple implements many more protections than Android at the OS level to prevent abuse of trusted computing by third-party apps, and give the user control. (Though some Androids like, say, GrapheneOS, implement lots that Apple does not.)
But of course all this only matters if you trust Apple. I trust them less than I did, but to me they are still the most trustworthy.
Do you actually, bottom-of-your-heart believe that ordinary consumers think like this? They use TikTok and WhatsApp and Facebook and the Wal-Mart coupon app as a product of deep consideration on the web of trust they're building?
Users don't have a choice, and they don't care. Bitlocker is cracked by the feds, iOS and Android devices can get unlocked or hacked with commercially-available grey-market exploits. Push Notifications are bugged, apparently. Your logic hinges on an idyllic philosophy that doesn't even exist in security focused communities.
Yes, I do believe from the bottom of my heart the users trust the operating systems they use. Apple and Google have done a great job at security and privacy which is why it seems like users don't care. It's like complaining why you have a system administrator if the servers are never down. When things are run well the average person seems ignorant of the problems.
Yet, in the big picture Google is doing a good enough job that those information leaks have not caused them harm. When you really zoom in you can find some issues, but the real world impact of them is not big enough to influence most consumers.
What sort of hypothetical harm are you imagining here? Suppose the information leaks were a serious issue to me - what are my options? Switch to Apple? I doubt most consumers are going to consider something like postmarketos.
The carriers in the US were caught selling e911 location data to pretty much whoever was willing to pay. Did that hurt them? Not as far as I can tell, largely because there is no alternative and (bizarrely) such behavior isn't considered by our current legislation to be a criminal act. Consumers are forced to accept that they are simply along for the ride.
Let's say that Google let anyone visit google.com/photos?u=username to see all of the images from their camera roll and left this online, not caring about the privacy implications.
People would stop taking photos with their camera that they didn't want to be public.
People would presumably switch away from gcam and the associated gallery app. Or they would simply remove their google account from the phone. They have realistic options in that case (albeit somewhat downgraded in most cases).
If Google did something egregious enough legislation might actually get passed because realistically, if public outcry doesn't convince them to change direction, what other option is available? At present it's that or switch to the only other major player in town.
They used Windows XP when it was a security nightmare, and many used it long after EOL. I just talked to someone who's had 4 bank cards compromised in as many months and is almost certainly doing something wrong.
How would we even know if people distrusted a company like Microsoft or Meta? Both companies are so deeply-entrenched that you can't avoid them no matter how you feel about their privacy stance. The same goes for Apple and Google, there is no "greener grass" alternative to protest the surveillance of Push Notifications or vulnerability to Pegasus malware.
They would stop using them, or reduce what kinds of things they do on them, if they didn't trust them. No one is forcing you to document your life on these platforms.
- Persistent bootkits trivial to install
- No verified boot chain
- Firmware implants survived OS reinstalls
- No hardware-backed key storage
- Encryption keys extractable via JTAG/flash dump
Modern Secure Boot + hardware-backed keystore + eFuse anti-rollback eliminated entire attack classes. The median user's security posture improved by orders of magnitude.
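To make the mechanism concrete: a boot stage's anti-rollback check is conceptually just one comparison layered on top of the signature check. A minimal C sketch, assuming hypothetical primitives (every SoC exposes its own equivalents) - not any vendor's actual code:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical primitives standing in for SoC-specific ones. */
    extern bool     verify_signature(const void *img, uint32_t len); /* chain of trust */
    extern uint32_t read_rollback_fuses(void); /* monotonic counter burned into OTP */

    struct image_header {
        uint32_t length;
        uint32_t security_version; /* bumped when a boot-chain bug is fixed */
    };

    bool boot_stage_allowed(const struct image_header *hdr)
    {
        if (!verify_signature(hdr, hdr->length))
            return false; /* unsigned or tampered image */
        if (hdr->security_version < read_rollback_fuses())
            return false; /* properly signed, but older than the fuses allow */
        return true;
    }

The point is the second check: the old vulnerable build passes the signature check just fine, and only the fused minimum version keeps it out.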
Arguably yes. By preventing entire classes of attack real users are never exposed to certain risks in the first place. If it were possible it would be abused at some rate (even if that rate were low).
It's not that trusted computing is inherently bad. I actually think it's a very good thing. The problem is that the manufacturer maintains control of the keys when they sell you a device.
Imagine selling someone a house that had smart locks but not turning over control of the locks to the new "owner". And every time the "owner" wants to add a new guest to the lock you insist on "reviewing" the guest before agreeing to add him. You insist that this is important for "security" because otherwise the "owner" might throw a party or invite a drug dealer over or something else you don't approve of. But don't worry, you are protecting the "owner" from malicious third parties hiding in plain sight. You run thorough background checks on all applicants after all!
A discussion you don't see nearly enough of is that there is a fundamental tradeoff with hardware security features — every feature that you can use to secure your device can also be used by an adversary to keep control once they compromise you.
In this case, the "adversary" evaluates to the manufacturer, and "once they compromise you" evaluates to "already". This is the case with most smartphones and similar devices that treat the user as a guest rather than the owner.
Not only can, but inevitably is. Security folks - especially in mobile - are commonly useful idiots for introducing measures which are practically immediately co-opted to take away users' ability to control their devices and modify them to serve them better. Every single time.
Yeah, not disagreeing with you. It's just that, every time we have this discussion, we see comments like GP's rebutted by comments like yours, and vice versa.
All I'm saying is that we have to acknowledge that both are true. And, if both are true, we need to have a serious conversation about who gets to choose the core used in our front door locks.
You can't have that with phones. You are always at the mercy of the hardware supplier and their trusted boot chain that starts with the actual phone processor (the one running the GSM stuff, not the user interface stuff). That one is always locked down, and it decides whether to boot your fancy Android stuff.
The fact that it's locked down and remotely killable is a feature that people pay for and regulators enforce from their side too.
At the very best, the supplier plays nice and allows you to run your own applications, remove whatever crap they preinstalled, and change the font face. If you are really lucky, you can choose to run a practically useless Linux distribution instead of the practically useful Linux distribution, with their blessing. Blessing is a transient thing that can be revoked any time.
Nor the Mediatek platforms, as far as I know (very familiar with the MT65xx and MT67xx series; not sure about anything newer or older, except the MT62xx, which also boots the AP first, from NOR flash).
There are some open firmware, or partially open firmware, projects, but they're more proof-of-concept than popular/widely used. The problem is that the FCC or the corresponding local organization requires cell phones to get regulatory approval, and open firmware (where just anybody could download the source and modify a couple of numbers to violate regulations) doesn't jibe with that.
The GSM processor is often a separate chip. You may have read an article about the super spooky NSA backdoor processor that really controls your phone, but it's just a GSM processor. Connecting via PCIe may allow it to compromise the application processor if compromised itself, but so can a broadcom WiFi chip.
OTP memory is a key building block of any secure system and likely on any device you already have.
Any kind of device-unique key is likely rooted in OTP (via a seed or PUF activation).
The root of all certificate chains is likely hashed in fuses to prevent swapping out cert chains with a flash programmer.
It's commonly used for anti-rollback as well - the biggest news here is that they didn't have this already.
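As an illustrative sketch of the certificate-chain point above (invented names, not a real API): the chip doesn't store the whole root certificate in fuses, just a hash of the root public key, and the boot ROM refuses any chain whose root doesn't match it:

    #include <stdbool.h>
    #include <stdint.h>
    #include <string.h>

    extern void sha256(const void *data, uint32_t len, uint8_t out[32]);
    extern void read_root_hash_fuses(uint8_t out[32]); /* burned once at the factory */

    bool root_key_trusted(const uint8_t *pubkey, uint32_t key_len)
    {
        uint8_t computed[32], fused[32];
        sha256(pubkey, key_len, computed);
        read_root_hash_fuses(fused);
        /* An attacker can rewrite external flash at will,
           but not the fuses the expected hash lives in. */
        return memcmp(computed, fused, sizeof fused) == 0;
    }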
If there's some horrible security bug found in an old version of their software, they have no way to stop an attacker from loading up the broken firmware to exploit your device? That is not aligned with modern best practices for security.
> they have no way to stop an attacker from loading up the broken firmware to exploit your device
You mean an attacker having physical access to the device and plugging in some USB or UART, or the hacker who downgraded the firmware so they can use the exploit in the older version in order to downgrade the firmware to the version with the exploit?
Sure. Or the supply chain attacker (who is perhaps a state-level actor if you want to think really spicy thoughts) selling you a device on Amazon you think is secure, that they messed with when it passed through their hands on its way to you.
The state level supply chain attacker can just replace the entire chip, or any other part of the product. No amount of technical wizardry can prevent this.
Modern devices try to prevent this by cryptographically entangling the firmware on the flash with the chip - e.g. encrypting it with a device-unique key from a PUF. So if you replace the chip, it won't be able to decrypt the firmware on flash, or boot.
The evil of the type of attack here is that the firmware with an exploit would be properly signed, so the firmware update systems on the chip would install it (and encrypt it with the PUF-based key) unless you have anti-rollback.
Of course, with a skilled enough attacker, anything is possible.
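A rough sketch of the entanglement idea from two comments up, with invented names: the storage key is re-derived from the PUF on every boot and never exists outside the chip, so a flash image transplanted from (or encrypted for) a different die simply fails to decrypt:

    #include <stdbool.h>
    #include <stdint.h>

    extern void puf_activate(uint8_t seed_out[32]); /* unique per physical die */
    extern void kdf(const uint8_t seed[32], const char *label, uint8_t key_out[32]);
    extern bool aes_gcm_decrypt(const uint8_t key[32], const void *blob,
                                uint32_t len, void *out); /* authenticated decrypt */

    bool load_firmware(const void *flash_blob, uint32_t len, void *dest)
    {
        uint8_t seed[32], key[32];
        puf_activate(seed);           /* same silicon, same seed, every boot */
        kdf(seed, "fw-at-rest", key); /* device-unique storage key */
        /* A blob produced on or for another chip fails authentication here. */
        return aes_gcm_decrypt(key, flash_blob, len, dest);
    }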
> You mean an attacker having physical access to the device and plugging in some USB or UART
... which describes US border controls or police in general. Once "law enforcement" becomes part of one's threat model, a lot of trade-offs suddenly have the entire balance changed.
eFuses have been a thing forever on almost all MCUs/processors, and aren't some inherently "evil" technology - mostly they're used in manufacturing when you might have the same microcontroller/firmware on separate types of boards. I'm working on a board right now which is either an audio input or an output (depending on which components are fitted) and one or the other eFuse is burned to set which one it is, so subsequent firmware releases won't accidentally set a GPIO as an output rather than an input and potentially damage the device.
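In that kind of design the firmware just branches on the fuse during init. A hedged sketch with made-up HAL names (the real calls are vendor-specific):

    #include <stdbool.h>

    extern bool fuse_is_blown(int fuse_id);
    extern void gpio_set_input(int pin);
    extern void gpio_set_output(int pin);

    enum { FUSE_AUDIO_IN = 7, FUSE_AUDIO_OUT = 8, AUDIO_PIN = 12 };

    void audio_io_init(void)
    {
        /* Exactly one variant fuse was burned at manufacturing time, so a
           single firmware image can safely drive both board variants. */
        if (fuse_is_blown(FUSE_AUDIO_IN))
            gpio_set_input(AUDIO_PIN);
        else if (fuse_is_blown(FUSE_AUDIO_OUT))
            gpio_set_output(AUDIO_PIN);
        /* else: unprovisioned board; leave the pin high-impedance. */
    }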
It depends. Usually there are enough "knobs" that adding that many balls to the package would be crazy expensive at volume.
Most SoCs of even moderate complexity have lots of redundancy built in for yield management (e.g. anything with RAM expects some % of the RAM cells to be dead on any given chip), and they use fuses to keep track of that. If you had to have a strap per RAM block, it would not scale.
There's so many ways to do this, but a simpler method is to hide a small logic block (somewhere in the 10 billion transistors of your CPU) that detects a specific, long sequence of bits and invokes the kill switch.
This has been going on for a long, long time. Motorola used to make Android phones that would burn an efuse in the SoC if it thought it was being rooted or jailbroken, bricking the phone.
This is absurdly paranoid, with absolutely zero evidence. For embedded and mobile threat models where physical access or bootloader unlock is possible, eFuses are effectively mandatory for robust downgrade prevention.
Agreed that robust downgrade prevention is necessary. However it's not paranoid at all and the problem isn't limited to eFuses. A network connected device that the vendor ultimately controls is a device that can be remotely disabled at the vendor's whim. It's like a hardware backdoor except it's out in the open and much more capable.
Unfortunately, similar things will be mandated by EU law through the Cyber Resilience Act (CRA) in order to ensure tamper-free boot of any kind of device sold in the EU from Dec 2027.
Basically breaking any kind of FOSS or repairability, and creating dead HW bricks if the vendor ceases to maintain or exist.
This goes beyond the 'right to repair' to simply the right of ownership. These remote updates prove again and again that even though you paid for something you don't actually own it.
It's basically the same for our automobiles; just try to disable the "phone home" parts connected to the fin on the roof. Do we really own our cars if we can't stop the manufacturer from telling us we need to change our oil through email?
Hahah, I just traded in a 2023 model (unrelated brand) for a 2012 model since it was less of a computer. Computer systems in the newer car kept having faults that caused sporadic electrical issues workshops couldn't fix. I just want my car to be a car and nothing else.
... and get a Check Engine light + fault code for the built-in emergency SOS feature, thereby making it unable to pass vehicle inspection until you fix the antenna
so either 1) disconnect it most of the time and reconnect it for inspections, or 2) buy a dummy load RF terminator matching the resistance of your antenna
OnePlus and other Chinese brands were modders-friendly until they suddenly weren't, I wouldn't rely on your car not getting more hostile at a certain point
There was a video by MKBHD where he said that every new phone manufacturer starts off being the hero and doing something different and consumer/user friendly before with growth and competition they evolve into just another mass market phone manufacturer. Realistically this is because they wouldn't be able to survive without being able to make and sell mass market phones. This has already happened to OnePlus back half a decade ago when they merged with Oppo, and it's arguably happened with ASUS as well when they cancelled the small form factor phone a couple years ago.
A phone without a SIM can still be used to call emergency services (911/999/0118999 8819991197253). The situation we're discussing, though, is an attack by an extremely advanced persistent threat. You really think not having the SIM card is going to do anything? If the cell phone hardware is powered up, it's available. All the APT has to do is have put their code into the baseband at some point, maybe at the Volvo factory when the car was programmed, and get the cooperation of a cell tower, or use a Stingray to report where the car is when in range.
My ownership is proved by my receipt from the store I bought it from.
This vandalization at scale is a CFAA violation. I'd also argue it is a fraudulent sale, since not all rights were transferred at sale, and it was misrepresented as a sale instead of an indefinite rental.
And it's likely a RICO Act violation, since the C-levels and BOD likely knew and/or ordered it.
And damn near everything's wire fraud.
But if anybody does manage to take them to court and win, what would we see? A $10 voucher for the next OnePlus phone? Like we'd buy another.
A forced update or continual loop of "yes" or "later" is not consent. The fact that there is no "No" option shows that.
Fabricated or fake consent, or worse, forced automated updates, indicates that the company is the owner and exerting ownership-level control. Thus the sale was fraudulently conducted as a sale but is really an indefinite rental.
It is not an indefinite rental. A sale can't be "misrepresented". It is a blatant CFAA violation. They are accessing your computer, modifying its configuration, and exfiltrating your private data without your authorization.
If I buy a used vehicle for example, I have exactly zero relationship with the manufacturer. I never agree to anything at all with them. I turn the car on and it goes. They do not have any authorization to touch anything.
We shouldn't confuse what's happening here. The engineers working on these systems that access people's computers without authorization should absolutely be in prison right alongside the executives that allowed or pushed for it. They know exactly what they're doing.
> If I buy a used vehicle for example, I have exactly zero relationship with the manufacturer. I never agree to anything at all with them. I turn the car on and it goes. They do not have any authorization to touch anything.
Generally speaking and most of the time, yes; however, there are a few caveats. The following uses common law – to narrow the scope of the discussion down.
As a matter of property, the second-hand purchaser owns the chattel. The manufacturer has no general residual right(s) to «touch» the car merely because it made it. Common law sets a high bar against unauthorised interference.
The manufacturer still owes duties to foreseeable users – a law-imposed duty relationship in tort (and often statute) concerning safety, defects, warnings, and misrepresentations. This is a unidirectional relationship – from the manufacturer to the car owner – and covers product safety, recalls, negligence (on the manufacturer's behalf) and the like – irrespective of whether it was a first- or second-hand purchase.
One caveat is that if the purchased second-hand car has the residual warranty period left, and the second-hand buyer desires that the warranty be transferred to them, a time-limited, owner-to-manufacturer relationship will exist. The buyer, of course, has no obligation to accept the warranty transfer, and they may choose to forgo the remaining warranty.
The second caveat is that manufacturers have tried (successfully or not – depends on the jurisdiction) to assert that the buyer (first- or second-hand) owns the hardware (the rust bucket), and users (the owners) receive a licence to use the software – and not infrequently with strings attached (conditions, restrictions, updates and account terms).
Under common law, however, even if a software licence exists, the manufacturer does not automatically get a free-standing right to remotely alter the vehicle whenever they wish. Any such right has to come from a valid contractual arrangement, a statutory power, or consent – privity still applies and requires consent – all of which weakens the manufacturer's legal standing.
Lastly, depending on the jurisdiction, the manufacturer can even be sued for installing an OTA update, on the basis of the car being a computer on wheels and the OTA update being an event of unauthorised access to the computer and its data, which is oftentimes a criminal offence. This hinges on the fact that the second-hand buyer has not entered into a consensual relationship with the manufacturer after the purchase.
A bit of a lengthy write-up, but legal stuff is always a fuster cluck and a rabbit hole of nitpicking and nuances.
This is the kind of nitpicking that I love to see on HN; it establishes the boundaries of the relationship between manufacturers and owners and tries to lay bare the need for (informed) consent and what the legal basis for that is.
Do you mean because the previous "flagship killer" company now needed a "flagship killer" sub-brand, since they could no longer be categorised as such?
Because all midrange phones are "flagship killers" on a features basis now, flagships are just about the exclusivity. The market has adapted and the term no longer makes much sense. OnePlus still leads on custom ROM support though, e.g. no special codes or waiting times needed for unlocking the bootloader, it all works out of the box with standard commands.
Yeah, I'd like to know that too. I have a OnePlus Nord running /e/OS and I am quite happy with it. In fact it's probably the best phone I've had so far performance-wise (I got it refurbished at a very good price, which may have something to do with that though).
What do OnePlus gain from this? Can someone explain to me what the advantages are for OnePlus in doing all this?
A failed update resulting in a motherboard replacement? More money, more happy shareholders?
I still sometimes ponder whether the OnePlus green-line fiasco was a failed hardware-fuse type thing that got accidentally triggered during a software update. (Insert "I can't prove it" meme here.)
My understanding is there was a bug that let you wipe and re-enable a phone that had been disabled due to theft. This prevents a downgrade attack. It's in OnePlus's interest to make their phones less appealing for theft, or, in their interest to comply with requirements to be disableable from carriers, Google, etc.
right, but the stolen phones get sold in other countries where the carriers don't care if the phone was stolen but care that someone is spending money on their service.
Visit eBay and search for "blocked IMEI" or variants. There are plenty of used phones which are IMEI locked due to either: reported lost, reported stolen, failed to make payments, etc.
I think the lines between IMEI banning or blacklisting and the modern unlocking techniques they use have been blurred a little bit, and so some carriers and manufacturers don't really want to do, or spend time doing, the IMEI stuff and would prefer to just handle it all via their own unlocking and locking mechanisms.
Makes perfect sense, thanks kind stranger. Hope that is the reason and not some corporate greed. It's on me; lately my thoughts default towards corporations sabotaging consumers. I need to work on it.
The effect on the custom OS community has me worried (I am still rocking my OnePlus 7T with crDroid, and OnePlus used to be the most geek-friendly).
Now I am wondering if there are other ways they could have achieved the same without blowing a fuse, or at least been more transparent about this.
I don't think so. Blowing a fuse is just how the "no downgrades" policy for firmware is implemented. No different for other vendors actually, though the software usually warns you prior to installing an update that can't be manually rolled back.
> It's on me; lately my thoughts default towards corporations sabotaging consumers. I need to work on it.
You absolutely do not; this is an extremely healthy starting position for evaluating a corporation's behavior. Any benefit you receive is incidental; if they made more money by worsening your experience, they would.
> It's in OnePlus's interest to make their phones less appealing for theft,
I don't believe for a second that this benefits phone owners in any way. A thief is not going to sit there and do research on your phone model before he steals it. He's going to steal whatever he can and then figure out what to do with it.
Which is why I mentioned that carriers or Google might have that as a requirement for partnering with them. iPhones are rarely stolen these days because there's no resale market for them (to the detriment of third party repairs). It behooves large market players, like Google or carriers, to create the same perception for Android phones.
Thieves don't do that research to specific models. Manufacturers don't like it if their competitors' models are easy to hawk on grey markets because that means their phones get stolen, too.
It actually seems to work pretty well for iPhones.
Thieves these days seem to really be struggling to even use them for parts, since these are also largely Apple DRMed, and are often resorting to threatening the previous owner to remove the activation lock remotely.
Of course theft often isn't preceded by a diligent cost-benefit analysis, but once there's a critical mass of unusable – even for parts – stolen phones, I believe it can make a difference.
Yes, thieves do research which phones to steal. Just not online - more in person, talking with their network of lawbreakers. In short, a thief is going to have a fence, and that person is going to know all about which phones can and cannot be resold.
Their low-level bootloader code contains a vulnerability that allows an attacker with physical access to boot an OS of their choice.
Android's normal bootloader unlock procedure allows for doing so, but ensures that the data partition (or the encryption keys for it) is wiped, so that a border guard at the airport can't just Cellebrite the phone open.
Without downgrade protection, the low-level recovery protocol built into Qualcomm chips would permit the attacker to load an old, vulnerable version of the software, which has been properly signed and everything, and still exploit it. By preventing downgrades through eFuses, this avenue of attack can be prevented.
This does not actually prevent running custom ROMs, necessarily. This does prevent older custom ROMs. Custom ROMs developed with the new bootloader/firmware/etc should still boot fine.
This is why the linked article states:
> The community recommendation is that users who have updated should not flash any custom ROM until developers explicitly announce support for fused devices with the new firmware base.
Once ROM developers update their ROMs, the custom ROM situation should be fine again.
Thank you for this. I have a follow-up question:
Now an attacker can not install an old, vulnerable version.
But couldn't they just install a new, vulnerable version?
Is there something that enforces encryption key deletion in one case and not the other?
That makes sense, but how would an attacker flash an older version of the firmware in the first place? Don't you need developer options and unlocking + debugging enabled?
Open the case and pogo pin on a flash programmer directly to the pins of the flash chip.
Sophisticated actors (think state-level actors like a border agent who insists on taking your phone to a back room for "inspection" while you wait at customs) can and will develop specialized tooling to help them do this very quickly.
> What do OnePlus gain from this? Can someone explain to me what the advantages are for OnePlus in doing all this?
They don't want the hardware to be under your control. In the mind of tech executives, selling hardware does not make enough money, the user must stay captive to the stock OS where "software as a service" can be sold, and data about the user can be extracted.
A bit overdramatic, isn't it? Custom ROMs designed for the new firmware revisions still work fine. Only older ROMs with potentially vulnerable bootloader code cause bricking risks.
Give ROM developers a few weeks and you can boot your favourite custom ROMs again.
Not really dramatic IMO. Basically mirrors everything we have seen in other industries like gaming consoles, etc. that have destroyed ownership over time in favor of "service models" instead.
Sure, if you want to compete against Google or Samsung. Maybe that is the plan that OnePlus has. My understanding was that they were going after a different market of phone users who might want a little bit more; otherwise why not just go with one of the other companies that will screw you just as hard for less?
Note that Google also forces this indirectly via their "certification" - if the device doesn't have unremovable AVB (which requires the Qualcomm secure boot fuse to be blown) then it's not even allowed to say the device runs Android. If you see "Android™" then it means secure boot is set up and you don't have the keys and can't set up your own, so you don't really own the SoC you paid for.
I was talking about different keys and different fuses. I know about "avb_custom_key" (provisioned by GrapheneOS), but all this AVB is handled by abl/trustzone and I can't modify those because those need to be signed with keys that I don't own.
I know that all these restrictions might make sense for the average user who wants a secure phone.. but I want an insecure-but-fully-hackable one.
It is the same concept on an iPhone, you have 7 days to downgrade, then it is permanently impossible. Not for technical reasons, but because of an arbitrary lock (achieved through signature).
OnePlus just chose the hardware way, versus Apple the signature way
Whether for OnePlus or Apple, there should definitely be a way to let users sign and run the operating system of their choice, like any other software.
(Still hating this iOS 26, and the fact that even after losing all my data and downgrading back to iOS 18, it refused to re-sync my Apple Watch until iOS 26 was installed again. Shitty company policy.)
I'm not sure if this is the case anymore, but many unbranded/generic Androids used to be completely unlocked by default (especially Mediatek SoCs) and nearly unbrickable, and that's what let the modding scene flourish. I believe they had efuses too, but software never used them.
> When the device powers on, the Primary Boot Loader in the processor's ROM loads and verifies the eXtensible Boot Loader (XBL). XBL reads the current anti-rollback version from the Qfprom fuses and compares it against the firmware's embedded version number. If the firmware version is lower than the fuse value, boot is rejected. When newer firmware successfully boots, the bootloader issues commands through Qualcomm's TrustZone to blow additional fuses, permanently recording the new minimum version
What exactly is it comparing? What is the “firmware embedded version number”? With an unlocked bootloader you can flash boot and super (system, vendor, etc) partitions, but I must be missing something because it seems like this would be bypassable.
It does say
> Custom ROMs package firmware components from the stock firmware they were built against. If a user's device has been updated to a fused firmware version & they flash a custom ROM built against older firmware, the anti-rollback mechanism triggers immediately.
and I know custom ROMs will often say “make sure you flash stock version x.y beforehand” to ensure you’re on the right firmware, but I’m not sure what partitions that actually refers to (and it’s not the same as vendor blobs), or how much work it is to either build a custom ROM against a newer firmware or patch the (hundreds of) vendor blobs.
Firmware (XBL and other non-OS components) is versioned with anti-rollback values. If the version is less than the version burned into the fuses, the firmware is rejected. The "boot" partition is typically the Linux kernel. Android Verified Boot loads and hashes the kernel image and compares it to the expected hash in the vbmeta partition. The signature of the hash of the entire vbmeta metadata is then compared against a public key coded into the secondary bootloader (typically abl, which handled fastboot before fastbootd moved it into user space to support super partitions).
The abl firmware itself contains an anti-rollback version that is checked against the eFuse version.
The super partition is a bunch of LVM-style logical partitions on top of a single physical partition. One of these is the main root filesystem, which is mounted read-only and protected with dm-verity device mapping. The root hash of this verity rootfs is also stored in the signed vbmeta.
Android Verified Boot also has an anti rollback feature. The vbmeta partition is versioned and the minimum version value is stored cryptographically in a special flash partition called the Replay Protected Memory Block (rpmb). This prevents rollback of boot and super as vbmeta itself cannot be rolled back.
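Put together, the AVB side is conceptually something like the following sketch (invented names; libavb's avb_slot_verify is the real implementation, and it is more careful about when the stored index gets ratcheted forward):

    #include <stdbool.h>
    #include <stdint.h>

    extern bool     vbmeta_signature_ok(const void *vbmeta); /* against the OEM key */
    extern uint64_t vbmeta_rollback_index(const void *vbmeta);
    extern uint64_t rpmb_read_min_index(void);  /* replay-protected flash block */
    extern void     rpmb_write_min_index(uint64_t v);

    bool avb_allow_boot(const void *vbmeta)
    {
        if (!vbmeta_signature_ok(vbmeta))
            return false;
        uint64_t idx = vbmeta_rollback_index(vbmeta);
        if (idx < rpmb_read_min_index())
            return false;              /* validly signed, but an older build */
        rpmb_write_min_index(idx);     /* ratchet; RPMB itself resists replay */
        return true;
    }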
>What exactly is it comparing? What is the “firmware embedded version number”? With an unlocked bootloader you can flash boot and super (system, vendor, etc) partitions, but I must be missing something because it seems like this would be bypassable.
This doesn't make sense unless the secondary boot is signed and there is a version somewhere in the signed metadata. The primary boot checks the signature, reads the version of the secondary boot, and loads it only if the version is not lower than what the write-once memory (fuse) requires.
If you can self-sign or disable signature checking, then you can run whatever boot you want, as long as its metadata satisfies the version.
Damn, I just saw that update yesterday on my phone and didn't install it, for no particular reason. I've turned off auto-update for now until I figure out what to do.
OnePlus has pretty much become irrelevant since Carl Pei left the company. It's more or less just a rebranded Oppo nowadays. I'm not an Android user anymore, but I'm rooting for his new(ish) Nothing company. Hopefully it carries the torch for the old OnePlus feel.
As an early OnePlus user (1, 3, 5, 7, 13) I find myself unimpressed with what Nothing is proposing; it feels more like a design exercise than a flagship killer.
They consistently have allowed bootloader unlocking without extra fuss and have had good LineageOS support. That is their main appeal, IMO. Nothing phones had no LineageOS support until recently (spacewar is now supported, unsure about other models), and it's not clear if there's enough of a community/following to keep putting LineageOS on them. I do not want any phone where I'm stuck with the stock ROM.
Nothing phones also allow seamless bootloader unlocking, just like OnePlus. There's been some rumors that OnePlus might be about to exit the market altogether, if so Nothing will probably expand into their niche and beyond their current approach based on "unique" design.
I've been with OnePlus since the beginning, and am not at all impressed by Nothing. The primary feature I've come to depend on, off-screen gestures, is missing. And the device just comes across as foreign in general; it makes me think of the iPhone, which is not something I want to think of.
Blind speculation: I wonder if this is in some way related to DRM getting broken at a firmware level, leading to a choice being made between "users complain that they can't watch netflix" and "users complain that they can't install custom ROMs".
From what I understand this does not prevent use of custom ROMs, it just means ROMs built before it was done will not work anymore. I assume they can re-package old versions to work with the new configuration, I am not entirely sure though. There are discussions elsewhere in this thread with more informed people.
Does anyone know if it has been confirmed that this only applies to the "ColorOS" branded firmware versions? Because I currently have an update to OxygenOS 16.0.3.501 pending on my OnePlus 15, which is presumably built from the same codebase.
I've been dismayed by how fast the "we should own our hardware" crowd has so quickly radicalized into "all security features are evil", and "no security features should exist for anyone".
Not just "there should be some phone brands that cater to me", but "all phone brands, including the most mainstream, should cater to me, because everyone on earth cares more about 'owning their hardware' than evil maid attack prevention, Cellebrite government surveillance, theft deterrence, accessing their family photos if they forget their password, revocable code-signing with malware checks so they don't get RATs spying on their webcam, etc, and if they don't care about 'owning their hardware' more than that, they are wrong".
"No security features should exist for anyone" is itself fanatically hyperbolic narrative. The primary reason this event has elicited such a reaction is because OnePlus has historically been perceived as one of the brands specifically catering to people that wanted ultimate sovereignty over their devices.
As time goes on, the options available for those that require such sovereignty seem to be thinning to such an extent that (at least absent significant disposable wealth) the remaining options will appear to necessitate adopting lifestyle changes comparable to high-cost religious practices and social withdrawal, and likely without the legal protections afforded those protected classes. Given big tech's general hostility to user agency and contempt for values that don't consent to being subservient to its influence peddling, an intense emotional reaction to the loss of already diminished traditional allies seems like something that would reasonably be viewed compassionately, rather than with hostility.
I’ve posted about this on HN before; I think that there’s a dangerous second-order enshittification going on where people are so jaded by a few bad corporate actions that they believe that everyone is out to get them and hardware is evil. The most disappointing thing to me is that this has led to a complete demolition of curiosity; rather than learning that OTP is an ancient and essential concept in hardware, the brain-enshittification has led to “I see hardware anti-*, I click It’s Evil” with absolutely no thought or research applied.
Given how the opposition has radicalized into "you should own nothing and be happy", it's not surprising.
None of the situations you mentioned are realistic or even worth thinking about for the vast majority of the population. They're just an excuse to put even more control into the manufacturer's hands.
The attack is simple: the attacker downgrades the phone to a version of firmware that has a vulnerability. The attacker then uses the vulnerability to get at your data. Your data is PIN-protected? The attacker uses the vulnerability to disable the PIN lockout and tries all of them.
There's over a 10x difference in fence price between a locked and unlocked phone. That's a significant incentive/deterrent.
That's insane. If the CPU has enough fuses (which according to the wiki it does), why the h*ck can't they just refuse to flash anything below the minimum previously installed version of the OS, preventing the downgrade? Why the hard brick?
If so, is this 'fuse' pre-planned in the hardware? My understanding is that cell phones take 12 to 24 months from design to market. So, the initial deployment of the model where this OS can trigger the 'fuse', less one year, is how far back the company decided to be ready to do this?
Lots of CPUs that have secure enclaves have a section of memory that can be written to only once. It's generally used for cryptographic keys, serials, etcetera. It's also frequently used like this.
Fuses are there on all phones since 25+ years ago, on the real phone CPU side. With trusted boot and shit. Otherwise you could change IMEI left and right and it's a big no-no. What you interact with runs on the secondary CPU -- the fancy user interface with shiny buttons, but that firmware only starts if the main one lets it.
This is industry standard. Flashing old, insecure updates to bypass security is a legitimate attack vector that needs to be defended against. Ideally it would still be possible to recover from such a scenario by flashing the latest update.
Standard?? The standard is for the downgrade to be refused, or to not boot until you flash a newer version, not to brick the phone permanently. It's not an "ideally" thing for the manufacturer not to intentionally brick the device you bought and paid for.
They make it clear that this feature is unsupported and that it's possible to mess things up. The reason why it's an ideal and not an expectation is that flashing alternate operating systems is done at one's own risk and is unsupported. They have already told users that they bear no responsibility for what may go wrong if they flash the wrong thing onto the device. Flashing operating systems to the device requires people to be careful, and proper care to ensure compatibility before going through with flashing was not done here.
The phone. It's the same attacks that secure boot tries to protect against. The issue is that these old, vulnerable versions have a valid signature allowing them to be installed.
Very hard. FIB is the only known way to do this but even then, that's the type of thing where you start with a pile of SoCs and expect to maybe get lucky with one in a hundred. A FIB machine is also millions of dollars.
It's Google's fault. I want to buy a smartphone without AVB at all. With no "secure boot" fuse blown (yes I DO know that this is not the same fuse) and ideally I'd want to provision my own keys.
But vendors wouldn't be able to say the device runs "Android" as it's trademarked. AVB is therefore mandatory and in order for AVB to be enforced, you can't really control the device - unlocking the bootloader gives you only partial control, you can't flash your own "abl" to remove AVB entirely.
But I don't want AVB, and I can't buy such a device for money. This isn't a free market; this is a Google monopoly.
The closest thing you can get is probably the Pixel, ironically. You can provision your own keys, enroll it into AVB, and re-lock the bootloader. From the phone hardware's perspective there is no difference between your key and Google's. No fuse is ever blown.
That's not really true, there will be a warning shown that "the phone is loading a different operating system" - I've seen that when installing GrapheneOS on my pixel.
But it's not just about that; it's about the fact that I can't flash my own "abl" or the software running in the TrustZone there at all, as I don't control the actual signing keys (not avb_custom_key) and I'm not "trusted" by my own device. There were fuses blown, as evidenced by examining abl with its fastboot commands - many refuse to work, saying I can't use them on a "production device". Plus many of those low-level partitions are closed-source proprietary blobs.
Yes yes - I DO understand that for most people this warning is something positive, otherwise you could buy a phone with modified software without realizing it and these modifications could make it impossible to restore the original firmware.
Ah, I forgot about the warning. Are the blown fuses you're talking about related to your unlocking, though? Or did they just remove the debug functions? I guess it reduces the attack surface somewhat.
I do agree it's far from ideal though. But there are so many much worse offenders that use these fuses to actually remove features, and others that do not allow installing a different OS at all. The limited effort should probably be spent on getting rid of those first.
I'm not sure I'd agree with your last conclusion. We as consumers can choose what to buy, so for me a situation where there's one brand that produces open devices (with competing specs, not like the PinePhone) where I could install postmarketOS/Ubuntu Touch without any parts of Android would be better than there being many brands producing smartphones allowing only basic unlocking, without open firmware.
Of course there are bigger problems in the ecosystem, like Play Integrity which actively attempt to punish me for buying open hardware. Unfortunately that's the consequence of putting "trusted" applications where they IMO don't belong - there are smartcards with e-ink displays and these could be used for things like banking confirmations, providing the same security but without invading my personal computing devices. But thanks to Android and iOS, banks/governments went for the anti-user option.
So this article isn't about a kill switch, just blocking downgrades and custom ROMs.
But to answer your question: we know iPhones have a foolproof kill switch, it's a feature. Just mark your device as lost in Find My and it'll be locked until someone can provide your login details. Assuming it requires logging in to your Apple account (which it does, AFAIK; I don't think logging in to a local account is enough), this is the same as a remote kill switch; Apple could simply make a device enter this locked-down state and then tweak their server systems to deny logins.
I'd say for commercial hardware it is a near certainty even if you won't ever know until it is much too late.
Realize that many of these manufacturers sell their hardware in, and employ people in, highly policed societies. Just the fact that they are allowed to continue to operate implies that they are playing ball and may well have to perform a couple of favors. And that's assuming they are fully aware of what they are shipping, which may not always be the case.
I don't think it is a bad model at all to consider any cell phone to be compromised in multiple ways even though you don't have hard proof.
It's there on all phones since forever lol. Apple can ship an update that adds "update without asking for confirmation" tomorrow and then ship another one that shows nothing but a middle finger on boot and you would not be able to do anything, including downgrading back.
The M-series CPUs found in iPads (which cannot boot custom payloads) are the same as the M-series CPUs found in Macbooks (which can boot custom payloads) - just with different fuses pre-burnt during manufacturing.
Pre-prod (etc.) devices will also have different fuses burnt.
iPhones already cannot be downgraded; they can only install OS versions signed by Apple at install time (search for SHSH blobs). They also can't run unsigned IPA files (apps). Not sure if they have a physical fuse, but it's not much different.
The significant difference is that if it were placed into DFU mode and connected to an appropriate device that had access to appropriately signed things, it could be "unbricked" without replacing the mainboard.
They patched a low-level vulnerability in their boot process. Their phones' debug features would allow attackers to load an old, unpatched version of their (signed) software and exploit it if they didn't do some kind of downgrade prevention.
Using eFuses is a popular way of implementing downgrade prevention, but also for permanently disabling debug flags/interfaces in production hardware.
Some vendors (AMD) also use eFuses to permanently bond a CPU to a specific motherboard (think EPYC chips for certain enterprise vendors).
They can kill custom ROMs and force the latest vendor firmware. If they push a shitty update that slows down the phone or something, users have no choice other than buying a new device.
Almost every modern SoC has efuse memory. For example, this is used for yield management - the SoC will have extra blocks of RAM and expect some % to be dead. At manufacturing time they will blow fuses to say which RAM cells tested bad.
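As a rough illustration of that yield-management use (the fuse layout and helper are invented for the sketch; real SoCs typically do the remapping in hardware):

```c
// Toy model of fuse-recorded RAM repair: each set bit in the fuse word
// marks a RAM block that tested bad at manufacturing time.
#include <stdbool.h>
#include <stdint.h>

#define NUM_RAM_BLOCKS 32

extern uint32_t read_fuse_word(unsigned bank);  // invented stand-in

bool ram_block_is_usable(unsigned block)
{
    uint32_t bad_block_map = read_fuse_word(0);
    return block < NUM_RAM_BLOCKS && !(bad_block_map & (1u << block));
}
```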
Samsung uses this for their Knox security feature. The fuse gets blown on the first bootloader unlock, and all features related to Knox (Samsung Pay, Secure Folder, etc.) get disabled permanently, even after reverting to stock firmware.
I use them in an ESP32 to write a random password to each of my products, so when I sell them they can each have their own secure default WiFi password while all using the same firmware.
This is the only way I could come up with that would allow an end user to do a full factory reset, and end up back in a known good secure state afterwards.
Storing it in the firmware would mean every user has the same key. Storing it in EEPROM means a factory reset will clear it. This allows me to ship hardware with the default key on a sticker on the side, and lets a non-technical user reset it back to that if they need to.
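On the ESP32 this maps onto ESP-IDF's eFuse API roughly like the sketch below; treat the choice of EFUSE_BLK3 and the provisioning flow as my assumptions, not a drop-in implementation:

```c
// Sketch: burn a per-device default key into an eFuse user block once at
// the factory, then read it back as the fallback after a factory reset.
#include <stdint.h>
#include <string.h>
#include "esp_efuse.h"

#define DEFAULT_KEY_BITS 256

// Run once during manufacturing, before printing the sticker.
esp_err_t provision_default_key(const uint8_t key[32])
{
    // eFuse writes are one-time: bits can be set but never cleared.
    return esp_efuse_write_block(EFUSE_BLK3, key, 0, DEFAULT_KEY_BITS);
}

// Run at boot when NVS/EEPROM holds no user-configured key,
// e.g. right after a factory reset.
esp_err_t load_default_key(uint8_t key_out[32])
{
    memset(key_out, 0, 32);
    return esp_efuse_read_block(EFUSE_BLK3, key_out, 0, DEFAULT_KEY_BITS);
}
```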
It's high time we start challenging these sorts of actions as the "vandalization and sabotage at scale" that these attacks really are. I don't see how these aren't a direct violation of the CFAA, across millions of units of customer-owned hardware.
They are no different than some shit ransomware, except there is no demand for money. However, there is a demonstrable proof of degradation and destruction of property in all these choices.
Frankly, criminal AND civil penalties should be levied. Criminally, the C-levels and boards of directors should all be in scope for encouraging/allowing/requiring this behavior. The RICO Act as well, since this smells like a criminal conspiracy. Let them spend time in prison for mass destruction of property.
Civilly, start dissolving assets until the people are made whole with unbroken (and un-destroyed) hardware.
The next shitty silly-con valley company that thinks about running this scam of 'customer-bought but forever company-owned' will think long and hard about the choices of their network and cloud.
This is absolutely cracked. I've been with OnePlus since the One, also getting the 2, 6, and now I have the 12. I stuck with them all these years because I really respected their - original - take on device freedom. I really should've seen the writing on the wall given how much of a pain it is to update it in the first place, as I have the NA version, which only officially allows carrier updates, and I don't live in NA (and even if I did, I'd still not be tied to a carrier).
Now I have to consider my device dead as far as updates go, because if I haven't already gotten the killing update, I'd rather avoid it. The first thing I did was unlock the bootloader, and I intend to root/flash it at some point. I'll be finding another brand whenever I'm ready to upgrade again.
Yeah I'm surprised that they announced it but not the vendor name. I'm sure Google with their infinite resources already know which vendor it is. So who are they hiding it from?
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because it has that advertised, they should be on the hook to uphold that promise. The nasty thing on this thread is the potential rug pull by Oneplus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
With virtualization this could be done with the same device. The play VM can be properly isolated from the secure one.
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Well I don't entirely, but in that case there's even less of a choice and also (it seems to me) less risk. The OEM software stack on the phone is expected to phone home. On the other hand there is a strong expectation that a CPU or southbridge or whatever other chip will not do that on its own. Not only would it be much more technically complex to pull off, it should also be easy to confirm once suspected by going around and auditing other identical hardware.
As you progress down the stack from userspace to OS to firmware to hardware there is progressively less opportunity to interact directly with the network in a non-surreptitious manner, more expectation of isolation, and it becomes increasingly difficult to hide something after the fact. On the extreme end a hardware backdoor is permanently built into the chip as a sort of physical artifact. It's literally impossible to cover it up after the fact. That's incredibly high risk for the manufacturer.
The above is why the Intel ME and AMD PSP solutions are so nefarious. They normalize the expectation that the hardware vendor maintains unauditable, network capable, remotely patchable black box software that sits at the bottom of the stack at the root of trust. It's literally something out of a dystopian sci-fi flick.
Once they have hardware access, who cares? They either access my data or throw it in a lake. Either way the phone is gone, and I'd better have had a good data backup and a level of encryption I'm comfortable with.
This not only makes it impossible to install your own ROMs, but permanently bricks the phone if you try. That is not something my hardware provider will ever have the choice to make.
It's just another nail in the coffin of general computing, one more defeat of what phones could have been, and one more piece of personal control that consumers will be all too happy to give up because of convenience.
Can you explain it in simpler terms such that an idiot like me can understand? Like what would an alternative OS have to do to be compatible with the "current eFuse states"?
> The anti-rollback mechanism uses Qfprom (Qualcomm Fuse Programmable Read-Only Memory), a region on Qualcomm processors containing one-time programmable electronic fuses.
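Schemes like this usually boil down to treating the number of blown fuse bits as a monotonically increasing version floor. A minimal sketch (names invented; this is not Qualcomm's actual Qfprom interface):

```c
// Fuse-based anti-rollback sketch. Because fuse bits can only be set,
// the "minimum allowed version" can only ever ratchet upward.
#include <stdbool.h>
#include <stdint.h>

extern uint64_t read_anti_rollback_fuses(void);   // invented stand-in
extern void blow_anti_rollback_fuse(unsigned n);  // invented, irreversible

static unsigned min_allowed_version(void)
{
    // One blown bit == one version increment.
    return (unsigned)__builtin_popcountll(read_anti_rollback_fuses());
}

// The bootloader refuses any image older than the fuses allow.
bool image_version_ok(unsigned image_version)
{
    return image_version >= min_allowed_version();
}

// After a newer image boots successfully, raise the floor to match.
void ratchet_to(unsigned image_version)
{
    for (unsigned v = min_allowed_version(); v < image_version; v++)
        blow_anti_rollback_fuse(v);
}
```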
What nice, thoughtful people, to build such a feature.
That's why you sanction the hell out of Chinese Loongson or Russian Baikal, pitiful CPUs though they are - they're harder to disable than by programmatically "blowing a fuse".
You may not want trusted computing and root/jailbreak everything as a consumer, but building one is not inherently evil.
Because in the case of smartphones, there is realistically no other option.
> For example if they don't trust it, they may avoid logging in to their bank on it.
Except when the bank trusts the system that I don't (smartphone with Google Services or equivalent Apple junk installed), and doesn't trust the system that I do (desktop computer or degoogled smartphone), which is a very common scenario.
I recently moved to Apple devices because they use trusted computing differently; namely, to protect against platform abuse, but mostly not to protect corporate interests. They also publish detailed first-party documentation on how their platforms work and how certain features are implemented.
Apple jailbreaking has historically also had a better UX than Android rooting, because Apple platforms are more trusted than Android platforms, meaning that DRM protection, banking apps and such will often still work with a jailbroken iOS device, unlike most rooted Android devices. With that said though, I don't particularly expect to ever have a jailbroken iOS device again, unfortunately.
Apple implements many more protections than Android at the OS level to prevent abuse of trusted computing by third-party apps, and give the user control. (Though some Androids like, say, GrapheneOS, implement lots that Apple does not.)
But of course all this only matters if you trust Apple. I trust them less than I did, but to me they are still the most trustworthy.
What do you mean by this? On both Android and iOS app developers can have a backend that checks the status of app attestation.
Users don't have a choice, and they don't care. BitLocker is cracked by the feds; iOS and Android devices can get unlocked or hacked with commercially available grey-market exploits. Push notifications are bugged, apparently. Your logic hinges on an idyllic philosophy that doesn't even exist in security-focused communities.
https://arstechnica.com/information-technology/2024/10/phone...
https://peabee.substack.com/p/everyone-knows-what-apps-you-u...
About Apple I just don't know enough, because I haven't seriously used their devices in years.
The carriers in the US were caught selling e911 location data to pretty much whoever was willing to pay. Did that hurt them? Not as far as I can tell, largely because there is no alternative and (bizarrely) such behavior isn't considered by our current legislation to be a criminal act. Consumers are forced to accept that they are simply along for the ride.
People would stop taking photos with their camera that they didn't want to be public.
If Google did something egregious enough legislation might actually get passed because realistically, if public outcry doesn't convince them to change direction, what other option is available? At present it's that or switch to the only other major player in town.
...and not because, in truth, they don't care?
How would we even know if people distrusted a company like Microsoft or Meta? Both companies are so deeply-entrenched that you can't avoid them no matter how you feel about their privacy stance. The same goes for Apple and Google, there is no "greener grass" alternative to protest the surveillance of Push Notifications or vulnerability to Pegasus malware.
Would they? Nobody that I know would.
It's not that trusted computing is inherently bad. I actually think it's a very good thing. The problem is that the manufacturer maintains control of the keys when they sell you a device.
Imagine selling someone a house that had smart locks but not turning over control of the locks to the new "owner". And every time the "owner" wants to add a new guest to the lock you insist on "reviewing" the guest before agreeing to add him. You insist that this is important for "security" because otherwise the "owner" might throw a party or invite a drug dealer over or something else you don't approve of. But don't worry, you are protecting the "owner" from malicious third parties hiding in plain sight. You run thorough background checks on all applicants after all!
See also:
https://github.com/zenfyrdev/bootloader-unlock-wall-of-shame
We just had the Google side loading article here.
All I'm saying is that we have to acknowledge that both are true. And, if both are true, we need to have a serious conversation about who gets to choose the core used in our front door locks.
The fact that it's locked down and remotely killable is a feature that people pay for and regulators enforce from their side too.
At the very best, the supplier plays nice and allows you to run your own applications, remove whatever crap they preinstalled, and change the font face. If you are really lucky, you can choose to run a practically useless Linux distribution instead of a practically useful Linux distribution, with their blessing. A blessing is a transient thing that can be revoked at any time.
Why not?
Obviously we don't have that. But what stops an open-firmware (or even open-hardware) GSM modem from being built?
https://hackaday.com/2022/07/12/open-firmware-for-pinephone-...
Governments could ban this feature and bar companies from selling devices that have it.
I'm sure the CIA was not founded after COVID :-)
Any kind of device-unique key is likely rooted in OTP (via a seed or PUF activation).
The root of all certificate chains is likely hashed in fuses to prevent swapping out cert chains with a flash programmer.
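The anchoring trick is that only a hash has to fit in the fuses; the full root public key can live in ordinary flash. A hedged sketch (sha256() and read_fuse_hash() are stand-ins, not a real boot-ROM API):

```c
// Sketch: validate the root-of-trust public key against a fuse-burnt hash.
// An attacker can rewrite the key in flash, but not the hash in fuses.
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

extern void sha256(const uint8_t *data, size_t len, uint8_t out[32]); // stand-in
extern void read_fuse_hash(uint8_t out[32]);                          // stand-in

bool root_key_is_trusted(const uint8_t *root_pubkey, size_t len)
{
    uint8_t computed[32], expected[32];
    sha256(root_pubkey, len, computed);
    read_fuse_hash(expected);
    return memcmp(computed, expected, sizeof expected) == 0;
}
```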
It's commonly used for anti-rollback as well - the biggest news here is that they didn't have this already.
If there's some horrible security bug found in an old version of their software, they have no way to stop an attacker from loading up the broken firmware to exploit your device? That is not aligned with modern best practices for security.
You mean the attacker with physical access to the device plugging in some USB or UART, or the hacker who downgraded the firmware so they could use the exploit in the older version... to downgrade the firmware to the version with the exploit?
The evil of this type of attack is that the firmware with the exploit would be properly signed, so the firmware update systems on the chip would install it (and encrypt it with the PUF-based key) unless you have anti-rollback.
Of course, with a skilled enough attacker, anything is possible.
... which describes US border controls or police in general. Once "law enforcement" becomes part of one's threat model, a lot of trade-offs suddenly have the entire balance changed.
Most SoCs of even moderate complexity have lots of redundancy built in for yield management (e.g. anything with RAM expects some % of the RAM cells to be dead on any given chip), and use fuses to keep track of that. If you had to have a strap per RAM block, it would not scale.
I assume that's also why China is investing so heavily in open-source RISC-V.
Basically breaking any kind of FOSS or repairability, creating dead hardware bricks if the vendor ceases to maintain the product or ceases to exist.
When you really need it, like to download maps into the satnav, you can connect it to your home WiFi, or tether via Bluetooth.
OnePlus and other Chinese brands were modder-friendly until they suddenly weren't; I wouldn't rely on your car not getting more hostile at a certain point.
Shhh. Nobody tell him where his phone, computer, and vast majority of everything else in his house was made.
My ownership is proved by my receipt from the store I bought it from.
This vandalization at scale is a CFAA violation. I'd also argue it is a fraudulent sale, since not all rights were transferred at sale, and it was misrepresented as a sale instead of an indefinite rental.
And it's likely a RICO Act violation, since the C-levels and BOD likely knew and/or ordered it.
And damn near everything's wire fraud.
But if anybody does manage to take them to court and win, what would we see? A $10 voucher for the next Oneplus phone? Like we'd buy another.
Fabricated or fake consent, or worse, forced automated updates, indicates that the company is the owner and exerting ownership-level control. Thus the sale was fraudulently conducted as a sale but is really an indefinite rental.
If I buy a used vehicle for example, I have exactly zero relationship with the manufacturer. I never agree to anything at all with them. I turn the car on and it goes. They do not have any authorization to touch anything.
We shouldn't confuse what's happening here. The engineers working on these systems that access people's computers without authorization should absolutely be in prison right alongside the executives that allowed or pushed for it. They know exactly what they're doing.
Generally speaking and most of the time, yes; however, there are a few caveats. The following assumes common law, to narrow the scope of the discussion.
As a matter of property, the second-hand purchaser owns the chattel. The manufacturer has no general residual right(s) to «touch» the car merely because it made it. Common law sets a high bar against unauthorised interference.
The manufacturer still owes duties to foreseeable users – a law-imposed duty relationship in tort (and often statute) concerning safety, defects, warnings, and misrepresentations. This is a unidirectional relationship – from the manufacturer to the car owner – and covers product safety, recalls, negligence (on the manufacturer's part) and the like, irrespective of whether it was a first- or second-hand purchase.
One caveat is that if the purchased second-hand car has the residual warranty period left, and the second-hand buyer desires that the warranty be transferred to them, a time-limited, owner-to-manufacturer relationship will exist. The buyer, of course, has no obligation to accept the warranty transfer, and they may choose to forgo the remaining warranty.
The second caveat is that manufacturers have tried (successfully or not – depends on the jurisdiction) to assert that the buyer (first- or second-hand) owns the hardware (the rust bucket), and users (the owners) receive a licence to use the software – and not infrequently with strings attached (conditions, restrictions, updates and account terms).
Under common law, however, even if a software licence exists, the manufacturer does not automatically get a free-standing right to remotely alter the vehicle whenever they wish. Any such right has to come from a valid contractual arrangement, a statutory power, or consent – privity still applies and requires consent – all of which weakens the manufacturer's legal standing.
Lastly, depending on the jurisdiction, the manufacturer can even be sued for installing an OTA update, on the basis of the car being a computer on wheels and the OTA update being an event of unauthorised access to the computer and its data, which is oftentimes a criminal offence. This hinges on the fact that the second-hand buyer has not entered into a consensual relationship with the manufacturer after the purchase.
A bit of a lengthy write-up, but legal stuff is always a fuster cluck and a rabbit hole of nitpicking and nuances.
I still sometimes ponder whether the OnePlus green-line fiasco was a failed hardware-fuse type of thing that got accidentally triggered during a software update. (Insert the "I can't prove it" meme here.)
I have, however, experienced an ISP writing to you because you have a faulty modem (some Huawei device), asking you not to use it anymore.
Not surprisingly, stolen phones tend to end up in those locations.
The effect on the custom OS community has me worried (I am still rocking my OnePlus 7T with crDroid, and OnePlus used to be the most geek-friendly brand). Now I am wondering if there are other ways they could have achieved the same thing without blowing a fuse, or at least been more transparent about this.
Google pushed a non-downgradable final update to the Pixel 6a.
I was able to install Graphene on such a device. Lineage was advertised as completely incompatible, but some hinted it would work.
You absolutely do not; this is an extremely healthy starting position for evaluating a corporation's behavior. Any benefit you receive is incidental; if they made more money by worsening your experience, they would.
https://en.wikipedia.org/wiki/Samsung_Knox
This makes sense, and is much less dystopian than some of the other commenters are suggesting.
I don't believe for a second that this benefits phone owners in any way. A thief is not going to sit there and do research on your phone model before he steals it. He's going to steal whatever he can and then figure out what to do with it.
Thieves don't do that research on specific models. Manufacturers don't like it if their competitors' models are easy to hawk on grey markets, because that means their phones get stolen too.
Thieves these days seem to really be struggling to even use them for parts, since those are also largely Apple-DRMed, and often resort to threatening the previous owner into removing the activation lock remotely.
Of course theft often isn't preceded by a diligent cost-benefit analysis, but once there's a critical mass of unusable – even for parts – stolen phones, I believe it can make a difference.
Android's normal bootloader unlock procedure allows for doing so, but ensures that the data partition (or the encryption keys for it) is wiped, so that a border guard at the airport can't just Cellebrite the phone open.
Without downgrade protection, the low-level recovery protocol built into Qualcomm chips would permit the attacker to load an old, vulnerable version of the software, which has been properly signed and everything, and still exploit it. By preventing downgrades through eFuses, this avenue of attack can be prevented.
This does not actually prevent running custom ROMs, necessarily. This does prevent older custom ROMs. Custom ROMs developed with the new bootloader/firmware/etc should still boot fine.
This is why the linked article states:
> The community recommendation is that users who have updated should not flash any custom ROM until developers explicitly announce support for fused devices with the new firmware base.
Once ROM developers update their ROMs, the custom ROM situation should be fine again.
Sophisticated actors (think state-level actors like a border agent who insists on taking your phone to a back room for "inspection" while you wait at customs) can and will develop specialized tooling to help them do this very quickly.
They don't want the hardware to be under your control. In the mind of tech executives, selling hardware does not make enough money, the user must stay captive to the stock OS where "software as a service" can be sold, and data about the user can be extracted.
Give ROM developers a few weeks and you can boot your favourite custom ROMs again.
To be fair, they are right: the vast majority of users don't give a damn. Unfortunately I do.
Specifically, GrapheneOS on Pixels signs its releases with its own keys, and it gets rollback protection without blowing any fuses.
I know that all these restrictions might make sense for the average user who wants a secure phone.. but I want an insecure-but-fully-hackable one.
OnePlus just chose the hardware way, versus Apple's signature way.
Whether for OnePlus or Apple, there should definitely be a way to let users sign and run the operating system of their choice, like any other software.
(Still hating this iOS 26, and the fact that even after losing all my data and downgrading back to iOS 18, it refused to re-sync my Apple Watch until iOS 26 was installed again. Shitty company policy.)
There is a good reason to prevent downgrades -- older versions have CVEs and some are actually exploitable.
What exactly is it comparing? What is the “firmware embedded version number”? With an unlocked bootloader you can flash boot and super (system, vendor, etc) partitions, but I must be missing something because it seems like this would be bypassable.
It does say
> Custom ROMs package firmware components from the stock firmware they were built against. If a user's device has been updated to a fused firmware version & they flash a custom ROM built against older firmware, the anti-rollback mechanism triggers immediately.
and I know custom ROMs will often say “make sure you flash stock version x.y beforehand” to ensure you’re on the right firmware, but I’m not sure what partitions that actually refers to (and it’s not the same as vendor blobs), or how much work it is to either build a custom ROM against a newer firmware or patch the (hundreds of) vendor blobs.
The abl firmware contains an anti-rollback version that is checked against the eFuse version.
The super partition is a bunch of LVM-like logical partitions on top of a single physical partition. One of these is the main root filesystem, which is mounted read-only and protected with dm-verity device mapping. The root hash of this verity rootfs is also stored in the signed vbmeta.
Android Verified Boot also has an anti-rollback feature. The vbmeta partition is versioned, and the minimum version value is stored cryptographically in a special flash partition called the Replay Protected Memory Block (RPMB). This prevents rollback of boot and super, as vbmeta itself cannot be rolled back.
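Simplified, the AVB-side check looks something like this (a sketch mirroring libavb's semantics, with the RPMB accessors invented; real AVB supports multiple rollback-index slots and more error handling):

```c
// Rollback-index check in the AVB style: vbmeta's index comes from
// already-signature-verified metadata; the stored minimum lives in
// tamper-resistant storage such as the RPMB.
#include <stdbool.h>
#include <stdint.h>

extern uint64_t rpmb_read_min_rollback_index(void);   // invented stand-in
extern void rpmb_write_min_rollback_index(uint64_t);  // invented stand-in

bool vbmeta_rollback_ok(uint64_t vbmeta_rollback_index)
{
    // The index itself is trusted because the vbmeta signature already
    // checked out; it just must never move backwards.
    return vbmeta_rollback_index >= rpmb_read_min_rollback_index();
}

void on_successful_boot(uint64_t vbmeta_rollback_index)
{
    // Ratchet the stored minimum so older vbmeta images stop booting.
    if (vbmeta_rollback_index > rpmb_read_min_rollback_index())
        rpmb_write_min_rollback_index(vbmeta_rollback_index);
}
```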
This doesn't make sense unless the secondary boot stage is signed and there is a version somewhere in the signed metadata. The primary boot stage checks the signature, reads the version of the secondary stage, and loads it only if that version is not lower than what the write-once memory (fuse) requires.
If you can self-sign or disable signature checking, then you can run whatever boot image you want, as long as its metadata satisfies the version check.
From the article:
> Any subsequent attempt to install older firmware results in a permanent "hard brick" - the device becomes unusable
This implies that not only does an older custom ROM not work, but neither does attempting to recover by installing a newer ROM.
Edit: It seems that this does apply to OxygenOS too: https://xdaforums.com/t/critical-warning-coloros-16-0-3-501-...
This is ultimately about making the device resistant to downgrade attacks. This is what discourages thieves from stealing your phone.
Not just "there should be some phone brands that cater to me", but "all phone brands, including the most mainstream, should cater to me, because everyone on earth cares more about 'owning their hardware' than evil maid attack prevention, Cellebrite government surveillance, theft deterrence, accessing their family photos if they forget their password, revocable code-signing with malware checks so they don't get RATs spying on their webcam, etc, and if they don't care about 'owning their hardware' more than that, they are wrong".
It is objectively extremist and fanatical.
As time goes on, the options available to those who require such sovereignty seem to be thinning to such an extent that (at least absent significant disposable wealth) the remaining options will appear to necessitate lifestyle changes comparable to high-cost religious practices and social withdrawal, and likely without the legal protections afforded to those protected classes. Given big tech's general hostility to user agency and its contempt for values that don't consent to being subservient to its influence peddling, an intense emotional reaction to the loss of already-diminished traditional allies seems like something that would reasonably be viewed with compassion rather than hostility.
None of the situations you mentioned are realistic or even worth thinking about for the vast majority of the population. They're just an excuse to put even more control into the manufacturer's hands.
I don't care if they can downgrade the device, just that I boot into a secure verified environment, and my data is protected.
I also think thieves will just grab your phone regardless, they can still sell the phone for parts, or just sell it anyway as a scam etc.
There's over a 10x difference in fence price between a locked and unlocked phone. That's a significant incentive/deterrent.
It has some increasing timer for auth, and if you try to factory reset it, it destroys all the data?
As I said, it's less important that the thief can boot a new OS; the security of my data is more important. How is that compromised?
It feels like a thief is just going to opportunistically grab a phone from you rather than analyse what device it is.
If so, was this 'fuse' pre-planned in the hardware? My understanding is that cell phones take 12 to 24 months from design to market. So the initial deployment of this model, where the OS can trigger the 'fuse', minus a year or two, is how far back the company decided to be ready to do this?
You can still change the IMEI on many phones if you know how to.
https://service.oneplus.com/us/search/search-detail?id=op588
They make it clear that this feature is unsupported and that it's possible to mess things up. The reason it's an ideal rather than an expectation is that flashing alternate operating systems is done at one's own risk and is unsupported. They have already told users that they bear no responsibility for what may go wrong if you flash the wrong thing onto the device. Flashing incompatible operating systems requires care, and in these cases proper compatibility checks were not done before going through with the flashing.
https://news.ycombinator.com/item?id=30773214
https://www.eag.com/services/engineering/fib-circuit-edit-de...
Costs are what you'd expect for something of this nature.
But vendors wouldn't be able to say the device runs "Android", as that's trademarked. AVB is therefore mandatory, and in order for AVB to be enforced you can't really control the device - unlocking the bootloader gives you only partial control; you can't flash your own "abl" to remove AVB entirely.
But I don't want AVB, and I can't buy such a device for money.. this isn't a free market, this is a Google monopoly..
But it's not just about that, it's about the fact that I can't flash my own "abl" or the software running in the TrustZone there at all, as I don't control the actual signing keys (not custom_avb_key) and I'm not "trusted" by my own device.. There were fuses blown, as is evident from examining abl with its fastboot commands - many refuse to work, saying I can't use them on a "production device". Plus many of those low-level partitions are closed-source proprietary blobs..
Yes, yes - I DO understand that for most people this warning is something positive; otherwise you could buy a phone with modified software without realizing it, and those modifications could make it impossible to restore the original firmware.
I do agree it's far from ideal though. But there are so many much worse offenders that use these fuses to actually remove features, and others that do not allow installing a different OS at all. The limited effort should probably be spent on getting rid of those first.
Millions of fully working Apple devices are destroyed because of that very feature - Apple won't unlock them even with proof of ownership.
At the moment they're 'older' and would class as a rollback, which this fuse prevents.
It gives you a 256-bit block to work with - https://docs.espressif.com/projects/esp-idf/en/stable/esp32/...
There is when the device becomes hard-bricked and triggers an unnecessary need for a new one.
[1] https://dontkillmyapp.com/oneplus