On the final episode of my rewatch
And if it was an issue on github:
Closed: “couldn’t reproduce” 10 seconds after that last comment.
Lol, you literally quoted me, didn’t actually read what you quoted, and then did something completely different.
Do you know that battery life ≠ battery capacity? They are not the same measurement, as I have already tried to teach you three times.
Please state the calculation that you would use to “determine how often you have to recharge” that is valid for Wh and not for Ah.
What is its idle power draw? What is its power draw under load? Playing video? Sleep mode? That source gives nothing which determines battery life. All it gives is a nearly useless capacity number, just like all other manufacturers. So not valid at all. You still have exactly 0 more information about battery life.
If I am wrong, please state your calculations of what the battery life is with that 54Wh battery.
Your entire argument was “Ah is useless and Wh gives consumers the information to determine battery life.” So go ahead, determine the battery life.
How is this any different at all if they said that it is a 5.8Ah battery? They don’t give any current or power draw.
As an exercise:
can you tell me the battery life difference between an arbitrary Laptop A with a 54Wh battery and Laptop B with a 27Wh battery?
Please explain to me what the difference is between battery life if you have a 5000mAh battery and an 18Wh battery.
Please state the calculation that you would use to “determine how often you have to recharge” that is valid for Wh and not for Ah. I am all for it. If you can cite a single source where the manufacturer gives a specification that would yield battery life in Wh and not in Ah, I will concede the entire argument and note in every comment that you were right the whole time. Please show your calculation work.
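To make the exercise concrete, here is a minimal sketch. All draw figures below are hypothetical (no spec sheet provides them); the point is that the battery-life formula has exactly the same shape in both units:

```python
# Battery life requires capacity AND an average draw figure in matching
# units. The draw numbers below are made up; manufacturers don't publish them.

def hours_from_wh(capacity_wh: float, draw_w: float) -> float:
    """Runtime in hours from energy capacity and average power draw."""
    return capacity_wh / draw_w

def hours_from_ah(capacity_ah: float, draw_a: float) -> float:
    """Runtime in hours from charge capacity and average current draw."""
    return capacity_ah / draw_a

# The 54 Wh battery at a hypothetical 10 W average draw:
print(hours_from_wh(54, 10))     # 5.4 hours
# The same pack specced as 5.8 Ah, at a hypothetical 1.07 A average draw:
print(hours_from_ah(5.8, 1.07))  # ~5.4 hours -- identical formula
```

Either way, the spec-sheet capacity number is useless without a draw figure the manufacturer never gives you.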
“The thing is, it does not matter how much charge the battery holds, it does matter how much energy it holds. Without knowing the Voltage the Ah is useless.”
This is patently, objectively false. That is a direct quote of your words, today; it was your last comment. I have already laid out multiple examples of how Ah is a useful measurement and what you can do with it. Therefore, it is misinformation. It is not disinformation, but stating untrue things as fact is misinformation, even if you have no idea you are wrong.
Sorry, but you are simply wrong. Simple math says that you are wrong.
You can buck or boost convert nearly any voltage to any other voltage.
Then measure the current output of the battery, boom you have battery life.
Also electrical charge can be used in many, many very valuable calculations without involving voltage at all.
Let’s take an arbitrary example with an arbitrary battery powered device. Let’s say the battery is somewhere between 1V and 10000000V. You can’t measure it because you might blow up your multimeter.
You know that the battery is 5000mAh. You can safely measure that all of the circuitry is draining 1000mA because sense resistors or contactless magnetic current measurements don’t have anywhere near dangerous voltages. You know that the battery will last about 5 hours. What is the voltage? Doesn’t matter.
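A quick sketch of that arithmetic, using the arbitrary figures above:

```python
# Voltage unknown (and irrelevant): runtime from charge alone.
capacity_mah = 5000   # battery charge capacity
draw_ma = 1000        # measured current draw of all the circuitry

runtime_hours = capacity_mah / draw_ma
print(runtime_hours)  # 5.0 -- about 5 hours, no voltage needed
```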
Yes, charge and the flow of charge is not the entire story, but to say it is useless or does not matter is just a straight lie. It is fine if you don’t understand electronics, but then don’t spit out misinformation.
Yes, Watt-hours would give a more complete picture to slightly tech-inclined consumers (it makes zero difference for 99% of consumers), but even then it stops mattering, because you can do the 5-second calculation yourself: single-cell lithium batteries overwhelmingly share one nominal voltage.
Literally 90% of calculations related to efficiency are JUST as valid using mA as using W.
Saying your device uses 12mA at idle on a 5000mAh battery has the same relevance as saying your 18.5Wh battery uses 45mW at idle.
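Assuming a typical single-cell Li-ion nominal voltage of 3.7 V (my assumption; the spec sheets don't state it), those two spec styles describe the same battery, and the idle-runtime estimate comes out essentially the same either way; a quick sketch:

```python
NOMINAL_V = 3.7  # typical single-cell Li-ion nominal voltage (assumption)

capacity_mah = 5000
capacity_mwh = capacity_mah * NOMINAL_V   # 18500 mWh = 18.5 Wh

print(capacity_mwh / 1000)   # 18.5 -- same battery, other unit
print(capacity_mah / 12)     # ~417 h idle life from mAh and mA
print(capacity_mwh / 45)     # ~411 h idle life from mWh and mW
```

The small gap between the two runtime figures is just rounding (45 mW at 3.7 V is about 12.2 mA, not exactly 12).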
I disagree with 5.
I am an electronics engineer, so admittedly I have only ever worked with C and Python scripting (and I am not a programmer by any means), but I literally stopped learning Rust for embedded because every single tooling setup step was wrong or failed for both chips I was testing (an nRF chip and an ESP32-C3). Maybe only embedded Rust is still a mess tooling-wise, but I have no use case for learning userspace Rust first. It would just be a waste of my limited free time 😅
I believe it actually has more to do with historical conventions in electronics or math (this is just what I remember from hearsay when I was in university as an electronics engineer), but there is also a mathematical reason.
History/hearsay theory
The easiest way to measure power draw was to measure current draw (voltage across a sense resistor), back before there were affordable, quality ICs that could measure voltage and current and essentially joule-count.
To add to this, current sensors are much easier and cheaper than test machines that do the calculations for you.
When lithium and NiCd batteries became standard, they had an extremely flat voltage curve compared to the earlier lead-acid (which are measured in Wh), so they could be treated as a constant-voltage source.
Now cheaper electronics were being made, and if a designer wanted to know how long a battery would last, they could take the nominal voltage the battery would sit at the vast majority of the time, measure the circuit’s current draw over a short period, do ten seconds of calculation, and have their approximate battery life. There is a joke that engineers approximate π to 3.
Even designing electronics today, everything is specced in current draw, not power draw. ICs take X mA during Y operations; your DC-DC converters have Z quiescent current, and from there you can calculate efficiency. It is much easier to work in current for the energy running through the circuit.
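As a sketch of what that workflow looks like (every part name and current figure below is invented for illustration): sum the datasheet current draws, divide the capacity by the total, and you have your estimate, with voltage never entering the math:

```python
# Hypothetical idle current budget for a battery-powered board.
idle_draws_ma = {
    "MCU (sleep)":      0.005,
    "DC-DC quiescent":  0.010,
    "sensor (standby)": 0.002,
    "RTC":              0.001,
}

battery_mah = 2000  # e.g. one cell; made-up capacity

total_ma = sum(idle_draws_ma.values())
print(f"total idle draw: {total_ma:.3f} mA")
print(f"estimated idle life: {battery_mah / total_ma / 24:.0f} days")
```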
Math units
Ah is a measure of electrical charge.
Wh is a measure of energy.
Batteries and capacitors hold charge, so they are measured in Ah; generators that power the grid generate energy, and use of that energy is measured in Wh (the grid also isn’t a “constant” voltage source like a battery, since it is AC).
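The relationship between the two units is one line of math: charge times voltage gives energy, so at a known, roughly constant voltage the two specs are interchangeable. The voltages below are common nominal values I’m assuming, not from any particular datasheet:

```python
def wh_from_ah(ah: float, volts: float) -> float:
    """Energy (Wh) = charge (Ah) x voltage (V)."""
    return ah * volts

def ah_from_wh(wh: float, volts: float) -> float:
    """Charge (Ah) = energy (Wh) / voltage (V)."""
    return wh / volts

print(wh_from_ah(5.0, 3.7))   # 18.5 Wh from a 5 Ah cell at 3.7 V nominal
print(ah_from_wh(54, 11.1))   # ~4.86 Ah from a 54 Wh pack at 11.1 V (3S)
```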
But piracy is a product of their free market, don’t they want their mythical free market to be a free market?
Or maybe that was always just bullshit and they rely on using their money to suppress competition while they deliver a terrible, inferior product.
Google Keep used to (I don’t use it anymore) store your notes “backed up” via email. You could view all your notes in Gmail.
Maybe it was something like that?
Depends. If someone is gaming with new hardware, don’t use a distro that doesn’t update the kernel quickly and regularly.
Almost every hardware problem on Mint is solved by going through the process of updating the kernel or switching to a distro with up-to-date libraries.
It’s fine for a lot of people, but it doesn’t “just work” outside the use case of only browsing the internet and writing Word documents.
This is coming from someone who used Mint for 4 years. There were about a dozen times when the software in the Software Center was so out of date that it simply didn’t work, and I often had to resort to random PPAs, which frequently broke other things. Definitely not user friendly.
That being said, Cinnamon is probably one of the most user-friendly DEs for people switching from Windows. It is very nice.
Title of the book? I am looking for some fantasy or sci fi to read.
Or I guess that might be doxxing yourself…
Sadly it barely works at all with the *arr suite, even with FlareSolverr.
I have had to move mainly to the small private trackers and Knaben, because TorrentGalaxy and 1337x both stopped working in Prowlarr after they had to up their bot-fighting game…
True, but the UI reflects that they still use SourceForge lol. Still the best open source camera.
Bandcamp has so much vinyl that I want from artists that I want to support, but shipping it overseas doubles or triples the cost (even if you buy 10 different LPs at a time, shipping is separate for many of them) and I can’t afford it. AFAIK, there is nothing like Bandcamp on this side of the ocean.
True, meanwhile my HP printer had a hell of a time trying to work on Windows, much less finding an actual download for the scanner tool on HP’s website for a printer over 5 years old. On Linux I typed `yay HP`, then 1, and then I was ready to print and scan.
Plus, KDE Discover is the convenience the Microsoft Store would be if it were actually good.
Settings are ACTUALLY in Settings, instead of being split between Settings, Control Panel, individual tool auto-diagnoses, PowerShell, and registry edits.
KDE Connect works seamlessly, and I can also locate my phone if I lose it in the house.
Hilti is what the pros use.
I got a 2nd hand old model from my girlfriend’s dad (still twist lock) and it is a damn beast!
The Bosch professional line of hand jackhammers (I don’t know the English word) can’t hold a candle to Hilti, to be honest. Hiltis can go all day and not overheat. They are just damn expensive.
You definitely don’t use CUDA then. That is basically hardware-accelerated machine learning.
For your use case, then, it doesn’t make much of a difference. DLSS 3.0 is indeed better than FSR, but few games use it, I guess. DLSS 2.x and FSR are about on par with each other, and FSR can be enabled in all games. Many people don’t even realize when DLSS/FSR is disabled while gaming, since the vast majority of games don’t even have it and most people don’t think about it. I have no idea if you are in the same boat, but then it makes no sense to base a decision on features you don’t use, in my opinion.
Games simply don’t benefit enough for the cost of a new processor, let alone new motherboard and ram.
A new GPU will almost always be the best bang-for-your-buck improvement in games.
Then you should definitely go AMD. There is literally no reason not to unless you are already using CUDA or doing a ton of ray tracing. AMD is the best value for the money by far, has a MUCH better software interface (never thought I would say that), has comparable or fewer driver issues than Nvidia now, and it also works flawlessly on Linux, including full undervolting support (important on any GPU, but on AMD it is much easier).
That being said, if comparable performance GPUs are the same price in your region and you use windows, nvidia is also fine to grab.
Always undervolt your GPU. My 5700 XT, which used to draw 200W, now maxes out at 150W and usually sits at 140W, with about a 1% performance difference. That is like a 9°C temp difference.
https://wiki.archlinux.org/title/GnuPG
You can create keys with a gui:
https://www.openpgp.org/software/kleopatra/