Ottomateeverything

joined 2 years ago
[–] Ottomateeverything 1 points 11 months ago

That's why Pixels and some others have a "smart charge" feature that will wait to charge your phone until just before your alarm time so that it will finish right before you take it off the charger.

why am I going backwards to needing to babysit my phone when it's charging, and why would anyone want to charge their phone when they want to be using it vs when they're asleep?

I honestly don't understand why people have such trouble with this. I can throw my phone on a charger when I go to shower and it's at 80 percent when I get out, and that's enough for my day. I could leave it while I get dressed and eat or something and it'd be at 100 if I needed. I don't need my phone 24 hours a day. And there are many points in my day where I'm not using my phone for an hour that I could spare to charge it. I don't need to leave it burning away permanent battery capacity for hours and hours every night.
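The "smart charge" scheduling from the quote above is simple to sketch: given an alarm time and an estimate of how long a full charge takes, delay the start so charging finishes right at the alarm. A rough Python sketch (the function name and the fixed per-percent charge rate are my own assumptions, not Google's actual algorithm, which also slows charging near full):

```python
from datetime import datetime, timedelta

def charge_start_time(alarm: datetime, battery_pct: int,
                      minutes_per_pct: float = 1.0) -> datetime:
    """Return when to begin charging so the battery hits 100%
    right at the alarm. Assumes a constant charge rate, which
    real chargers don't have (they taper near full)."""
    minutes_needed = (100 - battery_pct) * minutes_per_pct
    return alarm - timedelta(minutes=minutes_needed)

alarm = datetime(2024, 6, 1, 7, 0)
start = charge_start_time(alarm, battery_pct=40)  # 60% left to charge
print(start)  # 2024-06-01 06:00:00
```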

[–] Ottomateeverything 2 points 11 months ago

Yes, the battery doesn't charge to "dangerous - could explode" levels. But they very much do still charge to levels that are damaging to long term health/capacity of the battery.

Yes, they tune the batteries so that 100% isn't the absolute cap. But even with that accounted for, many batteries will be above values that would be considered good for the long term health of a lithium cell. 80 percent on most phones is still very much at levels that are considered damaging to lithium batteries.

To put it another way, the higher you charge a lithium battery, the more stress you put on it. The more stress you put on it, the fewer charge cycles those components will hold. It's not like there's a "magic number" at 80 percent, it's just that the higher you go the worse it is. Yes, some manufacturers have tweaked charge curves to be more reasonable. But they've also increased limits. Many batteries now charge substantially higher than most people would consider sustainable.

And after such changes, 80% lands pretty close to the general recommendations for improved battery longevity. Every percent will help, but it's not a hard and fast rule.
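To put rough numbers on "the higher you go, the worse it is": a generic Li-ion cell's resting voltage climbs roughly like the table below, and time spent at higher voltage ages the cell faster. These are ballpark illustrative figures only, not measurements from any specific phone or chemistry:

```python
# Rough open-circuit voltage per state of charge for a generic
# Li-ion cell (illustrative values only; real curves vary).
SOC_TO_VOLTS = {50: 3.85, 60: 3.90, 70: 3.95,
                80: 4.00, 90: 4.10, 100: 4.20}

def approx_voltage(soc: int) -> float:
    """Linearly interpolate a rough cell voltage for a given
    state of charge (percent)."""
    pts = sorted(SOC_TO_VOLTS.items())
    for (lo_soc, lo_v), (hi_soc, hi_v) in zip(pts, pts[1:]):
        if lo_soc <= soc <= hi_soc:
            frac = (soc - lo_soc) / (hi_soc - lo_soc)
            return round(lo_v + frac * (hi_v - lo_v), 3)
    raise ValueError("state of charge outside table range")

print(approx_voltage(85))  # 4.05 V: already nearing the stress zone
```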

Calibrations have gotten a little better in some ways, but all you have to do is compare basic recommendations from battery experts against your phone's battery voltage to see that almost every manufacturer is pushing well past the typical recommendations at 90 or even 85 percent.

[–] Ottomateeverything 1 points 11 months ago

It hasn't been in a long time. Charge controllers still charge to damaging voltages anyway. 100% isn't really 100%, but you can very easily check the voltage on phones, and many are still in damaging territory beyond 80%.

[–] Ottomateeverything 10 points 11 months ago* (last edited 11 months ago) (17 children)

Can't answer the rest of your question because I don't use a OnePlus, but:

aren't you supposed to charge the phone overnight?

No, you aren't "supposed" to charge your phone overnight. Leaving your phone on the charger at 100% is actually pretty bad for long term battery health. Hence why the notification exists in the first place. Modern phones also fully charge in like an hour, so overnight charging leaves your phone in that state for many hours.

The longer story is it's actually best to stop charging your phone at 80 percent unless you really need the extra juice, because any time your phone spends above that is potentially damaging, but that tends to be hard to deal with for most people.

Most of the phones I've seen with this feature have a "battery warning" or "charge notification" or "protect battery" type setting somewhere you can turn off. But again, I've never used a OnePlus, so idk if they do or where it is.

[–] Ottomateeverything 2 points 11 months ago (1 children)

Yes, but Chromebooks are far from "newcomers" these days. They've been out a while. Many people who grew up using them in schools are now making their own purchasing decisions, etc.

[–] Ottomateeverything 8 points 11 months ago (2 children)

Like he said as the second sentence of his comment...

You've never worked in enterprise then.

These solutions are skipping the majority of the core problems he mentioned. And even the problem you're trying to solve here isn't even fully solved by this solution. You're taking a narrow sliver of one point in his argument and arguing about that and just tossing out the rest. Even if we accepted your proposal, Linux still isn't enough of an answer here.

[–] Ottomateeverything 3 points 11 months ago (3 children)

While I don't really disagree, look at the market share of Chromebooks. If "most people" only needed internet access, "most people" would be on Chromebooks by now. It's not like they're unknown anymore.

[–] Ottomateeverything 2 points 11 months ago* (last edited 11 months ago)

Yeah, and it's likely way less costly to the company to just buy a new win 11 computer than it is to pay an employee to train on new software. Not to mention the cost of paying someone to find someone to do a Linux conversion, paying the person doing the conversion, and the loss of productivity as the person learns. Not to mention the cost of changing IT infrastructure, hiring new IT people to manage those machines, etc.

There's a reason companies don't just switch at the drop of a hat. There's too much commitment and institutional knowledge already, and moving is not a simple change.

[–] Ottomateeverything 17 points 11 months ago (6 children)

I can't agree with this more. People like to sell Linux as a magic bullet, but it does not and will not do everything everyone needs without maintenance, and people really like to hand-wave or downplay that need.

Sure, you could find a solution for what they're using now. What happens when they need something else and they're so tech illiterate that they don't even know what you did to their machine? They wouldn't even know how to install new software, and if they did, they wouldn't know they need to click the Linux version, etc. It's not always about feasibility and available options, it's often about the fact that people just won't fucking know what to do. Even if you assume there are enough options available, they won't know how to do so.

And with every step Microsoft takes to shoot themselves in the foot, and every step Linux takes to make this easier, everyone comes out screaming about how much this could change things.

But until Linux has a HUGE market share - like in the 30-70 percent range - developers are not going to take it seriously and alleviate this process. Even with how well macOS does, this is not even an entirely solved problem there - there are still hang-ups and still software that doesn't get released for Mac. Linux would have to pass where Apple is today for this to become remotely accessible to an everyday person.

And even THEN there's the question of different Linux distros.

[–] Ottomateeverything 15 points 11 months ago

I really do not understand how server anti cheat is not way easier.

In a clean slate, it is. It's also way more effective (except for things like wall hacks, aim bots, recoil suppressors, etc, but most of those things are only really important and popular in competitive FPS). It's also much simpler to understand and to leave no "holes" behind. It also lives in the developers domain so it can't be "compromised" or circumvented.

The thing is that client side "anti cheat" can be commoditized. Every game with server authority/anti cheat needs specific server software to run its game logic. Client anti cheat is basically "look at everything else running on the system and see if any of it seems suspicious". There's not really anything "game specific" to these - they're basically just a watchdog looking for bad actors - so one company can come along, make one, and sell it to other devs.

This being "off the shelf" and not something the dev team has to think about besides a price tag means that management is just going to buy a third party solution and check off the "anti cheat" box on their task list.
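That "generic watchdog" shape is easy to see in miniature: nothing below knows anything about any particular game; it just compares what's running against a vendor-maintained blocklist. A toy Python sketch (the process names and the blocklist are invented for illustration; real products hook the kernel and do far more than string matching):

```python
# Toy model of a commodity client-side anti cheat: a watchdog that
# scans running process names against a vendor blocklist. Nothing
# here is specific to any one game, which is why one vendor can
# sell the same product to every studio.
KNOWN_CHEAT_TOOLS = {"aimhelper.exe", "memedit.exe", "speedhack.exe"}

def scan(process_names: list[str]) -> list[str]:
    """Return the suspicious processes found in a snapshot."""
    return sorted(p for p in process_names
                  if p.lower() in KNOWN_CHEAT_TOOLS)

snapshot = ["explorer.exe", "game.exe", "AimHelper.exe"]
print(scan(snapshot))  # ['AimHelper.exe']
```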

I feel like devs are caught up on realtime anti cheat and not willing to do anything asynchronous.

First, this is a management problem and not the devs. Any dev worth their salt knows this isn't really a good solution.

But I'd say the more relevant and prominent thing here is that game companies just don't want to have to run servers anymore. It's a cost, requires dev time, and requires maintenance, and they don't want to do that. If these games had servers running the game world like games used to, they'd inherently have their own "anti cheat" built in for free that wouldn't necessarily catch everything but would do a better job than some of these. And it could be enhanced to cover more bases.
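The "anti cheat built in for free" part of server authority looks like this in miniature: the server owns the state and rejects any client-reported move that physics wouldn't allow, so a speed hack simply has no effect. A toy Python sketch (the class, names, and flat speed limit are my own simplifications, not any real game's netcode):

```python
import math

# Toy server-authoritative movement check: the server keeps the real
# position and rejects impossible client updates, so a client
# reporting teleport-speed movement gains nothing.
MAX_SPEED = 10.0  # world units per second (arbitrary for the example)

class Player:
    def __init__(self, x: float, y: float):
        self.x, self.y = x, y

    def apply_move(self, new_x: float, new_y: float, dt: float) -> bool:
        """Accept the move only if it's physically possible; the
        server's copy of the position stays authoritative."""
        dist = math.hypot(new_x - self.x, new_y - self.y)
        if dist > MAX_SPEED * dt:
            return False  # rejected: likely speed hack or bad packet
        self.x, self.y = new_x, new_y
        return True

p = Player(0.0, 0.0)
print(p.apply_move(0.5, 0.0, dt=0.1))   # True: within the speed limit
print(p.apply_move(50.0, 0.0, dt=0.1))  # False: 495 units/s, rejected
```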

But studios don't want to do this anymore. It's easier to make the game p2p and slap an off the shelf anti cheat and call it a day.

Some games still require matchmaking servers etc, but the overhead there is way lower.

Or they really like paying licensing fees for client-side anticheat.

Not that I agree with the decision, but it is definitely cheaper and faster than the alternative. But picking something like nProtect totally fucking baffles me. There are better options.

I just don't understand how any competent software engineer or systems admin or architect trusts the client so fervently.

In some ways, same. Every project I've been on that has gotten anywhere near client side trust I've fought adamantly about avoiding it. I've won most arguments on it, but there are some places where they just utterly refuse.

But then there are things like New World.... I don't know how the fuck that shit released like it did. The number of things trusted to the client was absolutely baffling. I expected Amazon's first foray into gaming to be a fucking joke, but I was totally appalled at how bad it turned out. They even touted hiring ex-Blizzard talent to get my hopes up first.

[–] Ottomateeverything 91 points 11 months ago (5 children)

I don't know if this makes me "a redditor" somehow or what, but....

As a dev, I am deeply troubled by the gaming industry so calmly walking into kernel anti cheats. It's insane and being tossed around like it's nothing.

Helldivers especially, since they picked one of the sketchiest ones and it's a game that entirely doesn't need it.

I have no idea if Reddit has suddenly picked up on this, but I've been pissed since at least Valorant's release, and have seen more YT videos talking about it recently.
