Before you complain about 10-bit vs. 8-bit:

  • For most shows we don’t have a raw capper, so we have no control over the encoding.
  • For the rare instances we do have a raw source, not everyone can play 10-bit yet, and there are still tons of people who want 8-bit and won’t be happy if we release only in 10-bit.
  • Either way, without a .ts source, we can’t encode 10-bit even if we want to. And we’re not against doing so in later seasons, or releasing two versions. However, we simply lack the capability at the moment.
  • We work really hard on our releases, and it’s a bit of a bummer when people rag on them over a minor detail that we usually can’t control, and that the community still hasn’t reached a consensus on.
  • Did I mention we lack a raw capper, so this is mostly a moot point?
  • Oh yeah. Since some people seem to think that’s our fault, let me clarify: I’m sorry that we’re not prepared to pay for somebody to fly to Japan, live there, and record anime for us, but quite frankly our lack of a capper is completely out of our control. We’d like a .ts source as much as you do, but connections with people willing to record anime for us don’t pop up out of nowhere. We put in dozens of man-hours each week for our releases, but no amount of hard work is going to magically acquire us a .ts.

We’ve only gotten a few complaints so far, so I’m just putting this out there for now. If it becomes a huge issue, Neibs has a piece of her mind that she’d like to share.

50 thoughts on “Before you complain about 10-bit vs. 8-bit:”

  1. FredCDobbs

    If anyone dares to call you guys “8-bit faggots” in *MY* presence, I’m going to go all Heiwajima Shizuo on them. Look for their remains atop the nearest tree or high-power line…

    I’ve been converting anime for *YEARS* (with Xvid4PSP) to watch on rewritable DVDs in my standalone DiVX-certified player – in my comfortable bed, on my large TV – and I want to _THANK YOU_ for not caving in to the latest elitist t3kbh0y craze sweeping the community now.

    This here Hi10P shite is the same shite we went through with .MKV only ON STEROIDS: it is not needed, it is not wanted, and it only displays the immaturity of those who need to feel ‘k3vv1’ by being so ‘elite’ that they can run a process the rest of us don’t have any flipping interest in. Kudos to you guys for standing firm and supporting the Installed Base 8-bit viewers. You *rock*!

    1. DmonHiro

      Are you calling .mkv “not wanted”? Please tell me I understood wrong, because you can’t be that silly.

  2. Ikh

    thanks for the decision. i appreciate the 8bit release, coz my laptop spec is not enough for playing 10bit whatsoever.

    And for the 10bit supporters, please give a little understanding to our (8bit supporters’) condition: you can still play 8bit like previous releases, but we can’t play 10bit.

  3. Adelhied

    can someone really explain to me the difference between the two, 8 bit and 10 bit? I can’t comprehend it; all the fansub sites always mention it in their posts and I can’t really imagine what the heck is going on.

    sorry for the noob question I’m not an encoder

    and more power to evetaku^^

    1. Kagecode

      I’ll give you a simple pros/cons thing.

      Less “banding” and other quality improvements – This means the picture in the video is distorted less. The degree to which this is improved is modest – there *is* an improvement, but it’s not some godsend of perfect quality.

      Smaller file size with the same settings – Reduces the file size by 5-15% (give or take, depending on source material).

      Compatibility, compatibility, compatibility – 10bit is not widely supported; it is an obscure profile of the current h264 codec (usually in mkv). Where 8bit is supported by more than just computers, 10bit is not.

      People now are making it out to be like the huge jump from XviD -> h264 (avi -> mkv), however that is not the case. Avi -> mkv is akin to going from a stick to a sword. 8bit to 10bit is like going from a semi-blunt sword to a sharpened one. The real next thing will be going from a blunt sword to a gun.
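The banding point above can be made concrete with a short Python sketch (an editorial illustration, not from any commenter): quantize the same smooth 0.0-1.0 gradient at 8 and 10 bits and count how many distinct steps each depth can represent. 10-bit has four times as many levels, which is why subtle gradients show fewer visible bands.

```python
def quantize(value, bits):
    """Round a 0.0-1.0 sample to the nearest representable level."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

def distinct_steps(bits, samples=4096):
    """Count how many unique output levels a smooth gradient survives with."""
    return len({quantize(i / (samples - 1), bits) for i in range(samples)})

print(distinct_steps(8))   # 256 steps: coarse ramps can band visibly
print(distinct_steps(10))  # 1024 steps: 4x finer gradations
```

Note this only illustrates the precision difference; the actual encoding gain comes from x264 doing its internal math at the higher precision, which reduces rounding error even when the final display is 8-bit.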

  4. Mikenter

    My computer actually cannot play 10bit encoded videos. Actually, I’m surprised it can even run 720p 8bit. Can’t run anything at 1080p though… 10bit is not even close to a standard yet. I think it’s just the hipsters who want the newest thing ever ;)

  5. MahaiOne

    Hi10P has been a standard for ages, you have no idea what you’re talking about, do you?

    1. DmonHiro

      He’s actually right. From a tech point of view, Hi10P has been available for years. But we lacked a way to decode it properly until a few months back.

  6. mzry

    I think that it is important that technology evolves over time, and when these important milestones are reached (ie, when we went from avi to mkv etc) people really need to bite the bullet and upgrade their computers, or invest in quality (and somewhat inexpensive) media players for their TVs. Obviously my recommendation to people who can only play 8 bit is: it is only a matter of time; you really need to look into upgrading and/or buying different hardware. 8 bit will be around for a little while longer, but it’s inevitable that the technology will continue to improve and progress. As the famous saying goes, ‘Time waits for no man’.

  7. Badboll

    The main thing is to be able to watch it, if 10bit makes people unable to watch it, then it’s the wrong choice.

    I can barely play 10bit on my computer, and while it’s watchable, I get better quality with 8bit.

    But I’m happy either way since I get to watch it!

    That’s my 2 cents.

  8. sakito

    until hi10p is widely supported for all computers, I don’t see a reason why people should start using it. Sure 10bit will become mainstream in the future, but for the time being, 8bit can reach more people.

  9. Squiggy

    Going along with the flow of time, anyone?
    Any source I get into my hands will be encoded into 10bit, and if I ever become a capable encoder, I will only encode 10bit, because the advantages are obvious and people should really stop playing their LQ shit on HD+ televisions and whining about 8/10bit instead of fucking buying an HDMI cable.
    Just my opinion..

    1. diamondsw

      And… you’re an asshat. The quality difference is minimal, as are the space savings. In a year or so, sure, but right now unless you’re using Windows to watch it on your computer screen, you can’t play 10-bit. Linux support is a bitch to get working, Mac OS X support is non-existent, and NONE of the HTPC platforms (XBMC, Boxee, Plex, any dedicated streaming box, etc) support it.

      So, 10-bit excludes two major platforms, all remote and spouse friendly methods of getting content onto a TV, all so some encoding jerks (like yourself) can get an extra millimeter of e-peen.

      (If you haven’t gathered, let me also be one of the ones thanking EveTaku for not jumping on this asinine fad.)

  10. Bill

    I can see why those who can’t play 10-bit would complain, but I don’t get why anyone would complain when they don’t get a 10-bit release. If an extra 5-15% per ep is too much, forget HD.

    I support EveTaku! :D

  11. FredCDobbs

    Here’s the thing: all the 10-bit t3kbh0is out there denigrating an installed base of 8-bit users are IMMATURE @$$holes. They think they’re “bleeding edge,” but in reality they are merely ANNOYING. Once they grow up and the snot stops leaking from their noses, they will understand this – but that’s likely to be a decade or more from now, judging by some of the crap they’ve been spewing all over the web. ;o)

  12. dae-kun

    FredCDobbs has 10-bit envy. Don’t be so defensive, little 8-biter. 10-bit is superior, no one can deny it. For those of us that output to an HDTV capable of displaying 30-bit color and higher, it’s a beautiful sight. Now if you’ll excuse me, I’m going to go watch Ika Musume in delicious 10-bit. ;o)

      1. dae-kun

        Haha. I’m just teasing. I do understand and empathize with 8-bit users. Computers are expensive and it’s impractical to upgrade components just to comply with a color fidelity that has barely penetrated the computer monitor market so far. The only benefit to all users is the smaller file size of 10-bit encoded video.

        Then you have users like me, who use their gaming PC as an all-inclusive multimedia station hooked up to an HDTV via high speed HDMI. So when I get those wonderful 10-bit 1080p/720p videos, I can’t help but nerdgasm.

        1. random guy B

          you’re not a nerd……. just an annoying brat who obviously gets too much spending money (parents/school) on behalf of all those here that work for what they have. bite me. until integrated video/audio cards that can handle this type of encoding become consumer friendly for everyone…… not just you and your elitist snob friends…… 8-bit isn’t going anywhere. and unless you’re downloading a massive show, that 5-15% doesn’t mean anything.

          1. Imako

            You’re new here, so I’ll give you a tip. When the admins/mods say no flaming, they bloody well mean no flaming.

          2. dae-kun

            Who’s new? ( ゚∀゚)アハハ八八ノヽノヽノヽノ \ / \/ \

            I ain’t even mad.

  13. lygerzero0zero

    Since a couple people are demonstrating the amazing inability to read one of the shortest posts I’ve ever written, I’ve added two bullet points to hopefully clarify our stance a bit.

  14. A. Crush

    Are there really people complaining about things not being in 10-bit? All the commotion seems to be from those who want to stick with 8-bit but are finding their favorite groups going 10-bit only.

    I don’t even see it as an issue. 10-bit only really makes sense for BD/1080p releases at the moment.

    Maybe in a few years 10-bit will be mainstream, but for right now, really good 8-bit encodes are where it’s at.

    1. yepperoni

      Unfortunately, yes there are. They might just be trolling, in an attempt to make fun of EveTaku, but they’re still complaining.

  15. Selecao

    Simple thing to do: download/install latest CCCP version. It works for Hi10P quite nicely.

    That being said, I tried to play 10-bit videos on my old laptop, yeah… apparently 256MB video cards get raeped by 10-bit video. But now, with a new laptop w/ a 3GB video card, it plays nicely.

    I see Hi10P being useful for BD rips, and okay for everything else. All in due time until it takes over. Until then, I don’t give a fuuu (except for hard drive space for archiving 8-bit & 10-bit versions).

    1. diamondsw

      Your solution does not work, as (check all that apply):
      [ ] I use Linux, which requires custom-compiled alpha versions of playback software.
      [ ] I use Mac OS X, which has no 10-bit-capable players presently.
      [ ] I use a media-streaming box (Roku, AppleTV, Tivo, etc) whose transcoder does not support 10-bit.
      [ ] I use XBMC, Boxee, or Plex, which do not support 10-bit.
      [ ] I use an older system that cannot handle the additional load, but plays 8-bit files just fine.
      [ ] Other (please specify): ______________________________

      So… good for you that you like using Windows and have said Windows PC connected to your TV. For any other use-case, you’re boned by 10-bit.

  16. well...

    8-bit vs 10-bit doesn’t really have much to do with me, but please continue doing 480p XviD versions.
    my comp doesn’t handle 720p very well (crappy lappy that lacks a dedicated video card)

  17. Kairi

    I have a media server with over 600 titles, about 5 TB plus of programming, tucked away and accessed by various set-top boxes throughout my house. None of them can handle 10-bit. Thus far there is no hardware support for 10-bit. I should not have to spend 300 to 400 bucks on a computer for each TV, and I do not want to dig out my server and hook it up to whatever TV I am watching. Plus, having a bright white folder window open between episodes using MPC in a theater environment is hell on the eyes. So I am thankful that EveTaku is still releasing 8-bit.

  18. Anonymous

    All I have to say is: Thank god almighty! Please don’t change. This whole 10bit craze is ridiculous.

  19. w0lf

    >No capper

    The woes of non-super-large fansub groups.

    Also 10bit is bullshit anyway. so much more annoying shit to deal with and what do you get out of it? Mayyybeee 50mb less on filesize and quality changes you can’t find unless you’re LOOKING for them and not just enjoying the damn episode?

    No thanks.

  20. someone

    For me, 10bit anime episodes in widescreen take about an hour or so to transcode to 8bit. Rather time consuming, and I feel like I’m just collecting anime series on my laptop. 10bit also makes it hard to watch anything on my laptop (Compaq Presario, have had it for 3 years, and all my codecs are updated). I’ve been watching a few episodes of a certain series, then I shelve it, to watch whenever I damn well feel like it.

    In other words, 10bit is stupid, and fansubbers should stick to 8bit. Then again, what do I know.

  21. Jay

    Well, sure, someday 10bit will be mainstream. But how long are we supposed to wait to view videos posted now? Yes, we can watch 10bit using VLC player on our computer, but I’m not aware of any standalone media players with processors that can play 10bit. The manufacturers I’ve emailed said someday they will, but at the moment, most use the Realtek chips that don’t.

    We find it a bummer to be able to watch hundreds of films and a great many anime on television using our standalone ACRyan player, but have to either watch on our computer monitor or go through the hassle of converting each episode’s video to 8bit and remux so we can watch on the big screen.

    While 10bit may be the wave of the future, I wish encoders would wait until the future brings general availability of the hardware needed to play it before releasing 10bit-only versions. The rest of the mkv userbase doesn’t seem excited enough about promoting the ‘latest and greatest’ for hardware manufacturers to want to scramble onboard with new chipsets. The result is simply a nuisance for the vast majority of anime video viewers, with minimal gains.

    1. twinkle

      Just in case this wasn’t some kind of time-delayed spam post… 10 bit h.264 isn’t the future. h.265 (HEVC) is, and chipmakers are making new products with hardware support for that. Eventually, encoders will switch to that.

      Also, it’s the softsubs that break compatibility anyway. At most bitrates, even many mobile devices can handle software 10 bit h.264 decoding just fine, but complex typesets and kfx pretty much don’t work on a wide variety of platforms.

      If you want a more compatible release, get the 480p hardsub encodes (whenever Kagecode isn’t behind on uploading them) or get a 720p hardsub re-encode from a group like DeadFish. You just need to look at the release description to find whose subs they used for a particular episode of a show.

      As for us and most groups, considering the options, it’s not worth the extra workload to include 8 bit h.264 720p encodes these days, sorry.

  22. MrNobodyWTB

    This forum is pretty funny to read in 2015.

    This is seriously a chicken or the egg situation. Companies won’t start putting 10bit support in smaller devices such as standalone media players, or add native support inside TVs (a lot of smartTVs will now play MKV H264 just fine), until there is a ubiquitous demand amongst users.

    Well, it looks like, still, most of the users are unwilling to demand such a feature from general media device manufacturers, so this will never happen.

    I do agree HEVC (H265) is a much more commonly wanted tech advance, though I do hope there is such a thing as an H265 10bit standard or equivalent. I’m all for deep color, less banding, etc. Most everyone should want this; why wouldn’t you? Just because your less capable setup can’t play it well doesn’t mean better color representation isn’t something you should demand to be standardized.
