• 2 Posts
  • 21 Comments
Joined 1 year ago
Cake day: June 17th, 2024


  • Oh, don't get me wrong. As I said, I agree with most of your original post (and now your second one).

    My gripe with grain was not about AV1 per se; it was with filmmakers who add it just because they think that's how movies should look.

    This reasoning seems absurd to me: "Reasons to Keep Film Grain On: Artistic Effect: Film grain can add a nostalgic or artistic quality to video and photography, evoking a classic film look." The justification is just the director's nostalgia, meaning that if they had been born after the digital era they would (usually) take issue with grain rather than add it.

    About h264 and transparency: the issue isn't that h264 only reaches it at high bitrates; the issue is that AV1 (as I read) can't reach it at any bitrate.

    But overall I agree with you.

    I was even shocked recently to see how much faster AV1 encoding has gotten. I would have thought it was still orders of magnitude slower, but with some settings (comparable to x265's slow preset) AV1 now encodes at about the same speed. A rough way to time this yourself is sketched below.
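
    A minimal timing sketch, assuming an ffmpeg build with libx265 and libsvtav1 on PATH; the file name, CRF values, and preset pairing are illustrative placeholders, not a calibrated benchmark:

    ```python
    import subprocess
    import time

    SOURCE = "input.mkv"  # hypothetical test clip

    def time_encode(codec_args: list[str], output: str) -> float:
        """Run one ffmpeg encode and return wall-clock seconds."""
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-an", output],
            check=True,
            capture_output=True,
        )
        return time.perf_counter() - start

    # x265 "slow" vs. an SVT-AV1 preset in roughly the same speed class.
    t_x265 = time_encode(["-c:v", "libx265", "-preset", "slow", "-crf", "22"], "out_x265.mkv")
    t_av1 = time_encode(["-c:v", "libsvtav1", "-preset", "5", "-crf", "30"], "out_av1.mkv")

    print(f"x265 slow:  {t_x265:.1f}s")
    print(f"SVT-AV1 p5: {t_av1:.1f}s")
    ```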


  • I want to agree with you, and to a large extent I do. I like new codecs, and having a more open-source codec is better than using one covered by many patents; long-term patents (the current situation) slow technological progress.

    What I don't agree with you on are some details.

    First, Netflix, YouTube, and the like need low bitrates, and they (especially Google/YouTube) don't care that much about quality. YouTube videos are really bit-starved for their resolutions; Netflix is a bit better.

    Second, many people discussing codecs are referring to a different use case: archiving, i.e. the best quality at the same size. So they compare the original (raw video, no lossy codec used) with the encoded versions. Their conclusion is that AV1 is great for size reduction but can't beat h264 for fidelity at any size. I think h264 has a placebo/transparent profile and AV1 doesn't. (A comparison sketch follows at the end of this comment.)

    So when I download a fi…I mean a Linux ISO from torrents, I usually go for the newest codec. But recently I don't go for the smallest size, because it takes away detail from the picture.

    But if I want to archive a movie (one that I like a lot, which is rare), I get the bigger h264 (or, if it's a UHD Blu-ray, h265).

    Third, a lot of people's idea of codec quality is formed by downloading or streaming other people's encodes, and they never compare the quality themselves (as they don't have the time or a good raw source to compare against).

    Fourth, I have heard AV1 has issues with film grain, in that it removes it. Film grain is an artifact of physical (non-digital) film that many directors unfortunately try (or used to try) to duplicate: they grew up watching movies on film, think that's how movies should look, and add grain in post-production. It is literally a defect, and even the human eye doesn't reproduce it, so it isn't even natural. But this is still a bug in AV1 (if I read correctly), because a codec should aim for high fidelity, not high smoothness.
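
    A sketch of the archival-style check from the second point: encode the same source with h264 and AV1 at a matched bitrate, then score each against the original using ffmpeg's ssim filter. It assumes an ffmpeg build with libx264 and libsvtav1; the file names and bitrate are placeholders:

    ```python
    import subprocess

    SOURCE = "original_lossless.mkv"  # hypothetical raw/lossless master
    BITRATE = "4M"                    # matched size for a fair comparison

    def encode(codec: str, output: str) -> None:
        """Encode SOURCE with the given ffmpeg video codec at BITRATE."""
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec, "-b:v", BITRATE, "-an", output],
            check=True, capture_output=True,
        )

    def ssim_score(encoded: str) -> str:
        """Return ffmpeg's SSIM summary line for encoded vs. the original."""
        result = subprocess.run(
            ["ffmpeg", "-i", encoded, "-i", SOURCE, "-lavfi", "ssim", "-f", "null", "-"],
            capture_output=True, text=True,
        )
        # The ssim filter prints its summary on stderr.
        return [line for line in result.stderr.splitlines() if "SSIM" in line][-1]

    for codec, out in [("libx264", "test_h264.mkv"), ("libsvtav1", "test_av1.mkv")]:
        encode(codec, out)
        print(codec, "->", ssim_score(out))
    ```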


  • You didn't do the wrong thing.

    What many people don't notice is that GPU (hardware) support for a codec has two parts: decoding and encoding.

    For quality video, nobody does hardware encoding (at least not on consumer systems, like this Nvidia 3050 on Linux).

    For most users, the important thing is hardware decoding support, so they can watch their 4K movies without issues.

    So you are in the clear.

    You can watch AV1 right now, and AV2 won't become popular enough to be in use for at least another 4 years. (You can check what your own system exposes with the sketch below.)
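
    To see the decode/encode split in practice, you can grep what your ffmpeg build exposes; hardware entries such as av1_cuvid or av1_qsv only appear when the build and GPU support them. A small sketch, assuming ffmpeg is on PATH:

    ```python
    import subprocess

    def av1_entries(kind: str) -> list[str]:
        """List ffmpeg's AV1 lines; kind is '-decoders' or '-encoders'."""
        out = subprocess.run(
            ["ffmpeg", "-hide_banner", kind],
            capture_output=True, text=True,
        ).stdout
        return [line.strip() for line in out.splitlines() if "av1" in line]

    print("AV1 decoders:", *av1_entries("-decoders"), sep="\n  ")
    print("AV1 encoders:", *av1_entries("-encoders"), sep="\n  ")
    ```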



  • Maybe, maybe not.

    When h264 was introduced (August 2004), even Intel had hardware encoding for it with Sandy Bridge in 2011; Nvidia had it by 2012.

    So less than 7 years.

    AV1 was first introduced 7 years ago, and for at least two years Android TVs have been required to support hardware decoding for it.

    And AMD's RDNA2 had the same 4 years ago.

    So from introduction to hardware decoding, it took about 3 years.

    I have no idea why 10 years is thrown around.

    And AV1 had to compete with both h264 and h265 (vendors had to decide whether it was worth implementing at all).






  • I get that Intel draws less power when not doing much (I wish AMD had a high/low configuration for cores too), but what I meant was: among laptop CPUs that are not on battery (just connected to power), does AMD do more with the same power usage?

    If the comparison can't be done between same-generation CPUs from the two companies, then maybe between an AMD CPU and an Intel CPU (laptop, of course) with similar power usage: do they, for example, get similar Geekbench results? (For lack of a better tool.)

    So what I am asking is: I don't care whether it's the same generation of AMD and Intel laptop CPU, both connected to wall power, if AMD is better. I want to know whether, for the same power usage (working, not idling), AMD is better or not. (A toy calculation of what I mean is sketched below.)
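
    The question boils down to performance per watt under sustained load. A toy illustration with made-up numbers (not real measurements) of how the two plugged-in laptop chips would be compared:

    ```python
    # Hypothetical benchmark scores and measured package power under load.
    laptops = {
        "AMD laptop CPU (hypothetical)":   {"score": 12000, "watts": 35.0},
        "Intel laptop CPU (hypothetical)": {"score": 11000, "watts": 45.0},
    }

    for name, d in laptops.items():
        print(f"{name}: {d['score'] / d['watts']:.0f} points per watt")
    ```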








  • Time to fund /e/OS, GrapheneOS

    No.

    Those are just Android with some modifications. Two years from now, Google can easily disrupt them too.

    Phones need a new copyleft OS. Not just a FOSS one, an actual copyleft one, with an independent group managing it.

    An OS where a company can decide which apps I can run is just a surveillance-apparatus gadget.

    Google never wanted users to have control of their phones, even 10 years ago.

    The easiest way to check this is to see whether you can stop an installed app from ever doing anything unless you explicitly open it. There are so many "triggers" that apps can register for and run on ("wifi connected", "wifi disconnected", and so on) that the user can't do anything about them.

    If an app can "listen" to these triggers and I can't stop it from listening (even for non-system apps), then I don't really own my phone. Then Android is just an attention-stealing spam machine at best, and a spying and terror gadget for the world's supremacist regimes as well.

    I think even Apple's iOS has that option (disabling Background App Refresh per app), and in that regard it is better than Android. If I weren't against non-FOSS software and didn't live in Iran, I'd consider it: at this point Apple's iOS is not that different from Google's, and it's more polished too.





  • What do you mean?

    The new Papers app is faster, and that's from someone who usually hates GNOME apps because of their lack of settings. (Even this app has that GNOME-brained UI issue where the night-mode option sits in the left sidebar, removed from all the other options on the right side.)

    I understand the motivation of people who think Rust is a good language to write software in, since it has better memory safety, but I don't think everything needs to get rusted.

    But I understand even less the people who make fun of passionate people who like Rust and want safer software.

    At first the meme was that Rust is a cult. Now those who made fun of them are the meme.

    Example: https://techrights.org/n/2025/03/19/Sami_Tikkanen_Explains_on_Rust_Language_and_Its_Goals.shtml

    I read this article expecting a fair pro/con examination, and it had some bits that made very good points (like the Discord part), but the theme of the article was just "Rust is bad because I say Rust people are pod people."

    That is not a good technical discussion.

    And I, as a non-programmer who has used Linux for the last 20 years and likes to follow software trends, like that Rust is making a large share of apps' security problems less severe.

    I don't think that makes me a cult member.