hap's Issues

What does HAP stand for?

Sorry if this is an insane issue but I just spent like 15 minutes researching and I can't find this anywhere.

I'm so curious!

HAP HDR missing docs

Hello,

With the recent addition of the Hap HDR flavour, part of the documentation is unclear about what is allowed in terms of textures.

The choice was made to use one name for two kinds of texture (unsigned and signed).
Does that mean a Hap HDR file can store unsigned or signed textures depending on the frame?
Or does a Hap HDR file need to use the same kind of texture for every frame? (In that case, checking the texture format of the first frame would be enough to know which kind of texture is used throughout the file.)
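
A minimal sketch of that per-frame check using the existing HapGetFrameTextureFormat call (frameA/frameB and their byte counts are placeholders for two demuxed Hap frames):

#include "hap.h"

/* Returns HapResult_No_Error and sets *outMatch to non-zero when both frames
   report the same texture format for their first texture */
unsigned int FramesShareTextureFormat(const void *frameA, unsigned long frameABytes,
                                      const void *frameB, unsigned long frameBBytes,
                                      int *outMatch)
{
  unsigned int formatA = 0, formatB = 0;
  unsigned int result;

  result = HapGetFrameTextureFormat(frameA, frameABytes, 0, &formatA);
  if (result != HapResult_No_Error) return result;

  result = HapGetFrameTextureFormat(frameB, frameBBytes, 0, &formatB);
  if (result != HapResult_No_Error) return result;

  *outMatch = (formatA == formatB);
  return HapResult_No_Error;
}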

Decoding not happening on Graphic Card

Hi, new to GitHub, registered for this issue, hope I am doing this correctly :)

So, I have an issue where Hap videos seem to be decoded by the integrated GPU in my CPU rather than by my AMD FirePro W9100.

I drew this conclusion because, while monitoring the GPU with GPU-Z, I noticed the amount of memory reserved on my AMD card was 304 MB, which corresponds to the 249 MB required for HD output and then some. The more outputs I used on the card, the higher the amount of reserved memory on the GPU went. However, no number of concurrent Hap files being played changed that amount. I also tested two systems, one with an 8 GB card and the other with a 16 GB card; both ran into a graphics memory error at exactly the same number of concurrent Hap files.

The host software is Watchout 6 (6.1.5) by Dataton. So I don't know whether the issue is Watchout not dispatching the decoding to the right hardware or a problem with the Hap decoder itself, hence my posting the issue here.

I use an i7 6900 on X99 with 16 GB of DDR4 2400 RAM; the video card is the AMD W9100 for the 16 GB card and a 480 Fury for the 8 GB card. The 8 GB card is in a system with an i7 6800; everything else is the same. The OS is Windows 7 Pro.

Hope I did this right :)
regards

Custom player support

I'm interested in the possibility of developing a lightweight player library based on Hap, aimed at high-performance cross-platform applications. Many of the goals are the same as those mentioned in #16, but go a bit further. Typical media libraries (QT, AVFoundation, DirectShow, libavcodec) are difficult to integrate with, have a lot of dependencies and limitations, and fall down when you try to do much more than just watch one video straight through.

I'm curious about writing an extremely stripped-down player library that only reads Hap-encoded frames from a simple, common container format like MPEG or maybe Ogg. With FFmpeg's Hap encoding support, the main requirements for a container would be that it is easy to parse and that FFmpeg can write to it. The library itself should have minimal dependencies, compile on most platforms that support C++11, and be extremely easy to integrate into C++/OpenGL projects.

Writing this from scratch, while possibly hubristic, would not only allow easier cross-platform support, but could also open the door to features that are in line with the original goals of Hap. Specifically, for my purposes, things like: reliable, accurate, performant random frame access; low memory footprint; control over caching frames in RAM; 64-bit support; multiple playheads in a single video; and possibly a metadata track synced to video frames to support, e.g., CV pre-processing.

Does anyone have any thoughts on this approach? Is there any prior work to build off of, or is this a foolish way to go about it?

[Question] Is there any way to reduce the transcoded file size?

Hello,

We are trying to make an open-source, cross-platform, high-resolution 360 VR video player with Unity, and we ended up using the Hap codec since it's the one offering the best playback performance.

However, when we transcoded our videos we got huge files, e.g. with:

ffmpeg.exe -i "Cycling [email protected]" -c:v hap "Cycling [email protected]"

Input file: H265, 7680x3840 px @ 30 fps, 2.12 GB for a 2 min video
Output file: Hap, 7680x3840 px @ 30 fps, 36.6 GB for a 2 min video

Is there a way to reduce the Hap file size?

Thanks
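
A rough back-of-the-envelope check (assuming plain Hap, i.e. DXT1 at 8 bytes per 4x4 block, which is what -c:v hap produces by default) shows why the output is this large: the per-frame size is fixed by the resolution, not by the content.

#include <stdio.h>

int main(void)
{
  /* Numbers taken from the post above: 7680x3840 @ 30 fps, 2 minutes */
  const unsigned long width = 7680, height = 3840;
  const unsigned long fps = 30, seconds = 120;

  unsigned long blocks = ((width + 3) / 4) * ((height + 3) / 4);  /* 4x4 blocks */
  unsigned long frame_bytes = blocks * 8;                         /* DXT1: 8 bytes per block */
  double total_gb = (double)frame_bytes * fps * seconds / 1e9;

  printf("%.1f MB per frame, about %.0f GB before Snappy\n",
         frame_bytes / 1e6, total_gb);                            /* ~14.7 MB, ~53 GB */
  return 0;
}

That is roughly 53 GB of raw DXT data for two minutes, which Snappy's second-stage compression brings down to something in the region of the 36.6 GB observed; short of lowering the resolution or frame rate, a fixed-rate texture codec cannot shrink the way a long-GOP codec such as H.265 can.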

Slow Rendering in AE

I'm finding renders to HAP out of AE CC are going really slow.
A 10 second test file at 1280x720 with white noise is taking 3m43s to render to HAP.
Rendering to HAP-Q, DXV, ProRes, Animation, or a TGA sequence takes between 10-15s.

Is this normal?
(Using Mac OS X, tested on two machines w/ HAP codec v3, v4, and v5)

ffmpeg Codec Update (HAPR specifically)

The recent updates to HAP encoding methods are really interesting and do seem to have an impact on image quality in the nuanced areas where you can see the difference between a master and a HAP encode (some noise/chroma aliasing).

It would be amazingly useful to have access to these in ffmpeg. Is there any chance of Tom or Vittorio kindly updating their work in ffmpeg to make this happen?

Many thanks,

James

conversion to hap with FFmpeg

Hi,

I am having problems playing back DirectShow hap files (.avi) in either Cinder or openFrameworks DirectShow blocks/addons.

I've asked the developers of openFrameworks addon and they are having the same issue.

Would you perhaps have any suggestion on how to approach this problem?

Thanks!
MP

Transfer Ownership of Repo

Not really a code issue, but as it currently stands it seems my hap-directshow repo is the top one transferred from the RenderHeads Google Code archive. I was wondering if you would be interested in taking over ownership, since I don't have time to keep up with it now that I have started my PhD program. I feel guilty about not being able to answer questions or address issues in a timely manner, so I figured it would be best to transfer it.

[Formats] BC6 integration

Hello, just another format request, but since BC6 nowadays tends to offer pretty decent quality, with reasonably fast encoders as well, it could totally be worth adding (in unsigned/signed versions).

For the unsigned version we can simply use... 6 ;)
(so 0xA6, 0xB6, 0xC6)

For the signed version, we could use 9, I suppose.
(so 0xA9, 0xB9, 0xC9)

Website link to ffmpeg on windows does not work

Hi

I am not sure that this is the correct place to post this, but here goes.

If you go to "Using HAP", the link to download ffmpeg for Windows does not work.
An updated link to a project that does work would be a good idea, along with instructions on which version to download (the shared build works for me).
As far as I can tell, there are two projects for Windows, of which https://github.com/BtbN/FFmpeg-Builds/releases looks good (it's on GitHub).

Please point me in the right direction in case I am posting this in the wrong place.

Sune

HapGetFrameChunkCount

Hi Hap team!
Thank you very much for sharing this wonderful codec library.

I think it would be nice to have a HapGetFrameChunkCount function, like HapGetFrameTextureFormat, allowing a client application to get a frame's chunk count without decoding it.

This can be useful during the workflow to check whether artists have encoded their video files correctly for quick decoding.

It can also help a client application match its number of decoding threads to the Hap files it is given.

I wrote a version of this function that seems to work very well.
It is largely copy-pasted from the hap_decode_single_texture function, so some refactoring would certainly be needed.

Here is my little contribution:

unsigned int HapGetFrameChunkCount(const void *inputBuffer, unsigned long inputBufferBytes, unsigned int index, unsigned int *chunkCount)
{
  unsigned int result = HapResult_No_Error;
  const void *texture_section;
  uint32_t texture_section_length;
  unsigned int texture_section_type;
  unsigned int compressor;

  /*
  Check arguments
  */
  if (inputBuffer == NULL
    || index > 1
    || chunkCount == NULL
    )
  {
    return HapResult_Bad_Arguments;
  }

  *chunkCount = 0;

  /*
  Locate the section at the given index, which will either be the top-level section in a single-texture image, or one of the
  sections inside a multi-image top-level section.
  */
  result = hap_get_section_at_index(inputBuffer, inputBufferBytes, index, &texture_section, &texture_section_length, &texture_section_type);

  if (result == HapResult_No_Error)
  {
    /* The following mirrors hap_decode_single_texture, but only counts chunks */
    compressor = hap_top_4_bits(texture_section_type);
    if (compressor == kHapCompressorComplex)
    {
      const void *section_start;
      uint32_t section_header_length;
      uint32_t section_length;
      unsigned int section_type;
      size_t bytes_remaining = 0;

      result = hap_read_section_header(texture_section, texture_section_length, &section_header_length, &section_length, &section_type);

      if (result == HapResult_No_Error && section_type != kHapSectionDecodeInstructionsContainer)
      {
        result = HapResult_Bad_Frame;
      }

      if (result != HapResult_No_Error)
      {
        return result;
      }

      section_start = ((uint8_t *)texture_section) + section_header_length;
      bytes_remaining = section_length;

      /* Walk the sections inside the Decode Instructions Container */
      while (bytes_remaining > 0) {
        unsigned int section_chunk_count = 0;
        result = hap_read_section_header(section_start, bytes_remaining, &section_header_length, &section_length, &section_type);
        if (result != HapResult_No_Error)
        {
          return result;
        }
        section_start = ((uint8_t *)section_start) + section_header_length;
        switch (section_type) {
        case kHapSectionChunkSecondStageCompressorTable:
          /* One byte per chunk */
          section_chunk_count = section_length;
          break;
        case kHapSectionChunkSizeTable:
        case kHapSectionChunkOffsetTable:
          /* Four bytes per chunk */
          section_chunk_count = section_length / 4;
          break;
        default:
          /* Ignore unrecognized sections */
          break;
        }

        /*
        If we found a chunk count and already have one, make sure they match
        */
        if (section_chunk_count != 0)
        {
          if (*chunkCount != 0 && section_chunk_count != *chunkCount)
          {
            return HapResult_Bad_Frame;
          }
          *chunkCount = section_chunk_count;
        }

        section_start = ((uint8_t *)section_start) + section_length;
        bytes_remaining -= section_header_length + section_length;
      }
    }
    else if (compressor == kHapCompressorSnappy || compressor == kHapCompressorNone)
    {
      /* Single-chunk frame: no chunk tables, *chunkCount stays 0 */
    }
    else
    {
      return HapResult_Bad_Frame;
    }
  }
  return result;
}

Hoping this can help
Best regards,
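
A minimal usage sketch of the proposed function (frameData and frameBytes stand in for a demuxed Hap frame):

unsigned int chunks = 0;
if (HapGetFrameChunkCount(frameData, frameBytes, 0, &chunks) == HapResult_No_Error)
{
  /* chunks stays 0 for single-chunk frames (Snappy-only or uncompressed);
     otherwise it can be used to size the decode thread pool */
}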

HAP WebCodec

It would be amazing if there were a WebCodecs package for Hap to allow playback in Chrome and Electron applications.

Name for BC7 format

Hap has supported BC7 textures for a while but never had a name for the format. We need one.

How to make sure Hap decodes on the GPU or the CPU?

Hi,
In my understanding, when a computer's graphics hardware is used for decoding, opening "Task Manager" -> "Performance" -> "GPU" should show a non-zero "Video Decode" usage rate. But when I play the media with VLC, the usage rate is always zero. The video file was downloaded from the "Test Materials" section (https://github.com/Vidvox/hap).

Test environment:
OS: Windows 10 20H2
GPU: Nvidia GTX 1650
CPU: Intel(R) Core(TM) i7-9750H
Software: VLC 3.0.14
Video file: Test Materials -> Hap_Test_FFmpeg -> Hap_ffmpeg_64.mov

New lossless formats

First of all, thanks for the awesome work developing this codec and making it open source. Second, it would be very handy to add some uncompressed texture formats for high-quality multi-threaded playback in those cases where Hap Q is not enough. I would propose HAP LS (lossless) using BGR and HAP LSA (lossless with alpha) using BGRA. A lot of folks are using image sequences to achieve this level of quality, and it would be better to use chunked HAP LS(A) instead.

It might also be very useful to add support for single-channel/monochrome texture formats for achieving 4:2:2 or 4:2:0 chromatic compression using a shader.

Best regards and keep up the good work!

Miroslav Andel
Chief Software Developer
Dataton

Hap HDR Alpha

It might be useful to have an alpha variant of Hap HDR.
There is no single-channel compressed format to match the bit-depth of BC6U, so this would require either an uncompressed plane (see #14) or something like splitting into an RG format.

Decoding of Non Multiple of 4 Resolutions

Hello

I was wondering: since DXT is based on 4x4 pixel blocks, how does decoding of dimensions that are not multiples of 4 happen? Is decoding the DXT texture (whose dimensions are multiples of 4) into a bitmap with a custom resolution an extra step that takes extra time, compared to having a multiple-of-4 resolution right from the beginning?

Thanks !
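
For reference, a sketch of the usual BCn sizing rule (my summary of how 4x4 block formats are generally handled, not a quote from the Hap specification): the compressed data always covers whole blocks, so odd dimensions are rounded up and the extra padding columns/rows are simply ignored when the image is displayed, rather than requiring a separate resampling step.

#include <stddef.h>

size_t dxt_storage_bytes(size_t width, size_t height, size_t bytes_per_block)
{
  size_t blocks_wide = (width + 3) / 4;   /* e.g. 1918 px wide -> 480 blocks (1920 px) */
  size_t blocks_high = (height + 3) / 4;  /* e.g. 1078 px high -> 270 blocks (1080 px) */
  return blocks_wide * blocks_high * bytes_per_block; /* DXT1/RGTC1: 8, DXT5/BC7: 16 */
}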

Cross platform support

Hey!

Very happy with HAP results on a recent installation.
I've been looking into Density + BC1 and BC4 for similar performance without the particular caveat of Hap...
Quicktime outside of OSX is a pile of crap :)

So HAP is nice, the technologies inside (DXT / Snappy) are cross-platform and easy to build/find tools for. But the .mov wrapper is a huge problem in the wider ecosystem. Do you have any thoughts about HAP outside of OSX?

Currently HAP doesn't support:

  • Linux
  • openFrameworks on Windows (unless you count legacy versions)
  • Windows x64 (as far as I know the DirectShow playback doesn't support the GPU decompression stage)

Ideally all we'd need is a cross-platform, free QuickTime demuxer to get to the frame data; then the rest is easy. ffmpeg/libav could probably help here. Have you tried this?

Elliot
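
A rough sketch of the ffmpeg/libav route suggested above (an assumption about how it could look, not something tested in this thread): use libavformat purely as the demuxer and hand each packet straight to the Hap library, skipping libavcodec entirely. play_hap_mov is a hypothetical helper name.

#include <libavformat/avformat.h>
#include "hap.h"

int play_hap_mov(const char *path)
{
  AVFormatContext *fmt = NULL;
  AVPacket *pkt = NULL;
  int stream, ret = -1;

  if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
    return -1;
  if (avformat_find_stream_info(fmt, NULL) < 0)
    goto done;

  stream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
  if (stream < 0)
    goto done;

  pkt = av_packet_alloc();
  if (pkt == NULL)
    goto done;

  while (av_read_frame(fmt, pkt) >= 0)
  {
    if (pkt->stream_index == stream)
    {
      unsigned int format = 0;
      /* pkt->data / pkt->size is one complete Hap frame */
      if (HapGetFrameTextureFormat(pkt->data, pkt->size, 0, &format) == HapResult_No_Error)
      {
        /* ...call HapDecode() into a DXT-sized buffer here, then upload the
           result with glCompressedTexImage2D() using the matching format... */
      }
    }
    av_packet_unref(pkt);
  }
  ret = 0;

done:
  av_packet_free(&pkt);
  avformat_close_input(&fmt);
  return ret;
}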

Determining optimal decompression buffer size

What is the best way to determine the size of the output buffer passed to HapDecode? Looking at HapDecode itself, it seems like there is no real way to know the correct size ahead of time, so one simply has to make a good guess. After a little experimentation, I settled on simply multiplying the size of the compressed buffer by a factor of 2, but that may fail with a different encoder (this source is FFMPEG), or a different Hap codec (this is Snappy + DXT5 YCoCg). Setting it too high obviously impacts performance, as the runtime needs to find too much contiguous memory.

https://github.com/heisters/libglvideo/blob/master/src/decoders/hap.cpp#L24
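
For what it's worth, a sketch of how the output size could be computed up front instead of guessed (assuming the frame dimensions are read from the container's track header, and using the texture format constants from hap.h; hap_output_buffer_bytes is a hypothetical helper): the decompressed payload is just the block-aligned DXT data, so its size follows directly from the format and the dimensions.

#include <stddef.h>
#include "hap.h"

size_t hap_output_buffer_bytes(const void *frame, unsigned long frameBytes,
                               size_t width, size_t height)
{
  unsigned int format = 0;
  size_t blocks;

  if (HapGetFrameTextureFormat(frame, frameBytes, 0, &format) != HapResult_No_Error)
    return 0;

  blocks = ((width + 3) / 4) * ((height + 3) / 4);   /* number of 4x4 blocks */

  switch (format)
  {
  case HapTextureFormat_RGB_DXT1:    /* Hap */
  case HapTextureFormat_A_RGTC1:     /* Hap Alpha-Only */
    return blocks * 8;               /* 8 bytes per block */
  case HapTextureFormat_RGBA_DXT5:   /* Hap Alpha */
  case HapTextureFormat_YCoCg_DXT5:  /* Hap Q */
  default:                           /* treat other formats as 16-byte blocks */
    return blocks * 16;
  }
}

For the Snappy + DXT5 YCoCg case mentioned above, that works out to width x height bytes (16 bytes per 4x4 block, for block-aligned dimensions), so no factor-of-2 guessing is needed.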

Hap Alpha

Hello,

While doing some test renders to see how Hap Alpha performs, I've encountered some artifacts along the edge lines.

I have attached a close-up of a graphic with an alpha channel. You can see the red artifacts both on the transparent background and with a white background added. Any workaround?

Thanks!

[Three screenshots attached, showing close-ups of the red edge artifacts]

BC7-compatible encoders

To the best of my knowledge, libsquish doesn't support BC7 (nor do any of the forks I've looked at so far; somebody please correct me if I'm mistaken). Support for the BC7 variant of Hap will require the use of another texture compression library (potentially replacing libsquish if it supports BC1/4/5, if the metrics are good, and if it would result in a simplified codebase). This issue is meant to provide a place for assembling a list of all potential encoders for consideration.

Requirements:

  • support for enc/decoding of BPTC/BC7 UNORM textures
  • bonus: support for enc/decoding of DXT1/BC1, DXT5/BC3, and Alpha RGTC1/BC4

File examples

As the title says, one reference video for every format (in uncompressed, compressed, and chunked versions) would be really helpful for testing implementations.

Thanks
