
AVAnimator's Introduction

AVAnimator

AVAnimator is an iOS library that makes it easy to implement non-trivial animated/video content in an iOS app.

See the project homepage for more information and example Xcode projects:

http://www.modejong.com/AVAnimator/

This is an Open Source project: it is free as in "liberty", but that does not mean it is free as in "free beer". The specific goal of this project is to provide a professional library that can be incorporated into real iOS apps. Please take a moment to understand the dual license and why legal use of software matters.

http://www.modejong.com/AVAnimator/license.html

Example iOS projects:

http://www.modejong.com/AVAnimator/examples.html

AVAnimator's People

Contributors

mdejong, vphamdev


AVAnimator's Issues

Add @3x checks in AVOfflineComposition.m

See the existing support for 1x and 2x and add a 3x case; a possible shape is sketched after the snippet below.

if (compScale == 0) {
  NSAssert(screenScale != 0, @"screenScale is zero");
  compScale = screenScale;
} else if (compScale == 1 || compScale == 2) {
  // Nop
}
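
One possible shape for the 3x-aware check, as a sketch only; the variable names follow the snippet above and the surrounding AVOfflineComposition logic is assumed unchanged:

if (compScale == 0) {
  NSAssert(screenScale != 0, @"screenScale is zero");
  compScale = screenScale;
} else if (compScale == 1 || compScale == 2 || compScale == 3) {
  // Nop: 1x, 2x, and 3x are all accepted screen scales
} else {
  NSAssert(FALSE, @"unsupported compScale");
}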

Reduce the size of "mvid" file

First of all, thanks for AVAnimator; it is a great framework that lets me composite beautiful video.
I use a video with an alpha channel as a filter layered over an image, and users constantly switch filters (videos) to preview the result.
But the intermediate .mvid files are too large; some reach 200 MB.
I have tried zip compression, which cuts the size to about 1/10, but decompression speed is unstable and too slow for a quick preview.
Is there a better way to reduce the size of the .mvid file?

Crash when attempting preload of AVAnimatorMedia objects

I am trying to preload AVAnimatorMedia objects for playback later.

The way I am experimenting is to decouple the resource loading and creation of the AVAnimatorMedia object from the playback step, where it is attached to the AVAnimatorLayer (roughly as sketched below).

When the two steps are coupled together it works, but when decoupled it crashes.
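
For reference, a rough sketch of the decoupled flow being described; the loader/decoder class names and constructors here follow the library's example code from memory, so treat them as assumptions rather than the exact API:

// Step 1: create and prepare the media up front, with no view/layer attached.
AVAnimatorMedia *media = [AVAnimatorMedia aVAnimatorMedia];
AVAppResourceLoader *resLoader = [AVAppResourceLoader aVAppResourceLoader];
resLoader.movieFilename = @"clip.mvid";
media.resourceLoader = resLoader;
media.frameDecoder = [AVMvidFrameDecoder aVMvidFrameDecoder];
[media prepareToAnimate];
// Keep a strong reference to media until playback time.

// Step 2: later, when the render layer exists, attach and start playback.
// animatorLayer is an existing AVAnimatorLayer from the view hierarchy (assumed).
[animatorLayer attachMedia:media];
[media startAnimator];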

Video blit operation optimizations

The following optimizations are needed for the existing 16 and 32 bit video blit ARM ASM code.

  1. The destination address must be double-word aligned when writing 64 bit values, to avoid writing a 64 bit value over the end of a cache line. Currently, the input address is aligned to an 8 word bound for a COPY, but that is not as important as the output pointer. For a DUP fill operation, 64 bit writes could be used, but the output buffer would need to be aligned first.
  2. The current impl is optimal for armv6, but a NEON impl runs more quickly on armv7 with a Cortex-A8 processor (iPhone 3GS and iPhone 4). The problem is that NEON code is slower than plain ARM ASM on a Cortex-A9 processor (iPhone 4S and iPads). The libc code deals with this by turning memcpy() into a runtime bound address that is resolved to a processor specific implementation. It is not an issue for small copies, and even copies up to 1 page in size, but very large copies are a real performance problem. One reasonable approach might be to invoke memcpy from inside the ASM code via a function call when the COPY range is over a certain larger size, like 1 page of memory (see the sketch after this list). This would avoid having to implement different large copy modules, and it would mean that newer processors with unusual quirks could be dealt with by libc.
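
A minimal sketch of the dispatch idea in item 2, written in plain C rather than ASM; the one-page threshold and the wrapper name are illustrative assumptions:

#include <string.h>
#include <stdint.h>
#include <stddef.h>

#define LARGE_COPY_BYTES 4096  /* roughly one page; tune per measurement */

/* Hypothetical wrapper around the COPY blit: small copies stay on the
   hand-tuned ARM ASM path, large copies fall back to libc memcpy(), which
   the system resolves at runtime to a processor-specific implementation. */
static void blit_copy_pixels(uint32_t *dst, const uint32_t *src, size_t numPixels) {
  size_t numBytes = numPixels * sizeof(uint32_t);
  if (numBytes >= LARGE_COPY_BYTES) {
    memcpy(dst, src, numBytes);
  } else {
    /* placeholder: the existing ASM COPY implementation would run here */
    memcpy(dst, src, numBytes);
  }
}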

Random failure in testAttachTwoDifferentMedia in AVAnimatorViewTests.m

2012-11-25 16:01:26.789 RegressionTests[43693:11f03] *** Assertion failure in +[AVAnimatorViewTests testAttachTwoDifferentMedia], /Users/mo/Development/QTAnimationiPhone/QTFileParserApp/Classes/Tests/AVAnimatorViewTests.m:1533

frame #3: 0x0002da9a RegressionTests`+[AVAnimatorViewTests testAttachTwoDifferentMedia] + 4986 at AVAnimatorViewTests.m:1533

The issue is that the two returned objects are the same UIImage object:

(lldb) po beforeImage
(UIImage *) $1 = 0x07c34470 <UIImage: 0x7c34470>

(lldb) po afterImage
(UIImage *) $2 = 0x07c34470 <UIImage: 0x7c34470>

frameInterval is deprecated in iOS 10, use preferredFramesPerSecond

I just tried out the QTFileParserApp and the StreetFighter demo on the latest Xcode, running against iOS 10.2. I got a couple of compile errors indicating that displayLink.frameInterval is deprecated and that preferredFramesPerSecond should be used instead. I made the change and the app seems to run. It works in my app as well; love the product.

Figured I'd let you know you're one change away from running on the latest thing ;)
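
For reference, the change amounts to roughly the following on the CADisplayLink; the 60 fps value is an assumption for a display running at 60 Hz (where a frameInterval of 1 means every frame), and displayLinkCallback: stands in for whatever callback the code already uses:

#import <QuartzCore/QuartzCore.h>

CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                         selector:@selector(displayLinkCallback:)];
if ([displayLink respondsToSelector:@selector(setPreferredFramesPerSecond:)]) {
  displayLink.preferredFramesPerSecond = 60;  // iOS 10 and later
} else {
  displayLink.frameInterval = 1;              // deprecated as of iOS 10
}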

mmap error when building on 64-bit device

This seems to work fine on 64-bit simulator but crashes right away on device. Note, I am using the mvid format for animations.

Assertion failure in -[SegmentedMappedData mapSegment], .../AVAnimator/SegmentedMappedData.m:330

Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'mmap result EINVAL'
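
For context, mmap returns EINVAL when the offset (or length) it is given is not compatible with the system page size, and 64-bit iOS devices use 16 KB pages while the simulator uses 4 KB, which is one common reason a mapping that works in the simulator fails only on device. A minimal sketch of page-aligning a segment offset before mapping, as an illustration only and not the library's actual fix:

#include <sys/types.h>
#include <sys/mman.h>
#include <unistd.h>
#include <stddef.h>

/* Map len bytes starting at file offset off from descriptor fd.
   mmap requires the offset to be a multiple of the page size (16 KB on
   64-bit iOS hardware), so round the offset down and widen the length. */
static void* map_segment(int fd, off_t off, size_t len) {
  size_t pageSize = (size_t) getpagesize();
  off_t alignedOff = off & ~((off_t)pageSize - 1);
  size_t delta = (size_t)(off - alignedOff);
  void *addr = mmap(NULL, len + delta, PROT_READ, MAP_PRIVATE, fd, alignedOff);
  if (addr == MAP_FAILED) {
    return NULL;
  }
  return (char*)addr + delta;  /* pointer at the originally requested offset */
}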

mvid version error

I used the provided mvidmoviemaker utility to convert a .mov to .mvid, and when trying to use the .mvid in our iOS app with AVAnimator I get this:

Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'only .mvid files version 2 or newer can be used, you must -upgrade this .mvid from version 1'

AVOfflineComposition should support single images in addition to clips

The AVOfflineComposition class supports mvid and h264 clips as input, but it should be trivial to add support for still images in PNG or JPEG format. An image would have an x,y position, a width x height, and a time bound like a clip, but loading an image would be significantly easier since only one image would need to be loaded for the entire time bound.
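
As a rough illustration of the idea, a still-image clip entry could look something like the dictionary below; note that every key name here is hypothetical and only sketches the shape, it is not the actual AVOfflineComposition plist format:

// Hypothetical composition entry for a still image (all keys illustrative)
NSDictionary *imageClip = @{
  @"ClipType"         : @"image",      // alongside the existing mvid/h264 clip types
  @"ClipSource"       : @"overlay.png",
  @"ClipX"            : @(0),
  @"ClipY"            : @(0),
  @"ClipWidth"        : @(480),
  @"ClipHeight"       : @(320),
  @"ClipStartSeconds" : @(0.0),
  @"ClipEndSeconds"   : @(5.0),
};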

mvidmoviemaker 24 vs 32 bpp detection renders 24bpp images with 0xFF as alpha value

Currently, the mvidmoviemaker command line app needs to render at 32bpp and then mark the video as 24bpp after rendering when no pixels with alpha or partial alpha values were found. The result of this logic is that black pixels are represented as 0xFF000000 instead of 0x0. This may or may not be optimal with respect to data compression. It is handy for the case where 24bpp pixels are converted to 32bpp pixels, because the values can remain the same, but that is not clearly needed at runtime.

What is more of a concern is that if black pixels are 0x0, then a large number of black pixels can be represented as a run of 0x0 bytes at binary data compression time, as opposed to explicit 0xFF000000 pixels. Previously, all black pixels were always explicitly marked as 0x0 in a 24BPP file. If this logic is changed, then new code will be needed that rewrites each pixel and recalculates the adler after all frames have been written, in order to support detection and explicit setting of the bpp. At 32BPP, the 0xFF will always be needed for a fully opaque pixel.

The complication is that once the pixels are rendered and written, it is too late to go back and rewrite them, because the delta logic is already done and delta pixels are more complex to deal with than keyframes. One approach would be to scan the input in one loop without actually writing it, then do the actual write. Another approach would be to emit keyframes into an .mvid and then, when all that was done, actually write the keyframe data as deltas with the proper modifications (0xFF alpha set to 0x0) applied before passing the data to be written. After a first attempt, it seems like a scan pass would be the simplest to implement, followed by the actual write; a sketch of the scan appears below.
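
A minimal sketch of that scan pass, assuming 32bpp BGRA pixels packed as uint32_t with the alpha value in the high byte:

#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

// Scan one frame of 32bpp pixels; returns true if any pixel has an alpha
// value other than 0xFF, meaning the movie really needs 32bpp output.
static bool frame_has_alpha(const uint32_t *pixels, size_t numPixels) {
  for (size_t i = 0; i < numPixels; i++) {
    if ((pixels[i] >> 24) != 0xFF) {
      return true;
    }
  }
  return false;
}

// If the whole movie turned out to be fully opaque (24bpp), rewrite fully
// opaque black pixels 0xFF000000 as 0x00000000 so that runs of black
// compress down to runs of zero bytes.
static void rewrite_opaque_black(uint32_t *pixels, size_t numPixels) {
  for (size_t i = 0; i < numPixels; i++) {
    if (pixels[i] == 0xFF000000) {
      pixels[i] = 0x00000000;
    }
  }
}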

mmap error when building on 64-bit device for iOS 9

Hi there,
The same issue appeared again when iOS 9 came out. Please advise!
What we did: we updated the AVAnimator files to the latest version and marked 32-bit devices as a valid architecture.
It works fine in the simulator, but whenever we try to run it on an iPhone 6 or iPad Air the crash appears.
This seems to work fine on the 64-bit simulator but crashes right away on a device. Note, I am using the mvid format for animations.

Assertion failure in -[SegmentedMappedData mapSegment], .../AVAnimator/SegmentedMappedData.m:330

Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'mmap result EINVAL'

Delay in media object causes slight glitch in looped video

This patch needs to be evaluated in terms of how it impacts looping with audio. The audio needs to stop before the next loop, otherwise audio playback fails. But the artificial delay means that there is a slight visual glitch in a looping video with no audio. This change must not reintroduce the audio playback problem.

diff --git a/Classes/AVAnimator/AVAnimatorMedia.m b/Classes/AVAnimator/AVAnimatorMedia.m
index cde252f..32a8401 100644
--- a/Classes/AVAnimator/AVAnimatorMedia.m
+++ b/Classes/AVAnimator/AVAnimatorMedia.m
@@ -631,11 +631,14 @@
   NSAssert(self.currentFrame == 0, @"currentFrame must be zero");
 
   // Schedule delayed start callback to start audio playback and kick
-  // off decode callback cycle.
+  // off decode callback cycle. This callback is scheduled so that
+  // the event loop is entered again, but note the 0.0 delay, we do
+  // not want to introduce an issue with a slight delay appearing
+  // at the end of an animation cycle.
 
   [self.animatorDecodeTimer invalidate];
-  self.animatorDecodeTimer = [NSTimer timerWithTimeInterval: 0.01
+  self.animatorDecodeTimer = [NSTimer timerWithTimeInterval: 0.0
                                                      target: self
                                                    selector: @selector(_delayedStartAnimator:)
                                                    userInfo: NULL

Thread safety

Invoking AVAnimatorMedia's prepareToAnimate from a background task doesn't seem to work at all (silently fails).

Not a problem in my case, since I have a static splash screen, but someone might want to show a UIActivityIndicatorView, for example. Besides, doing heavy I/O synchronously on the UI thread just doesn't feel right. (A workaround sketch follows.)
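
A common workaround is to hop to the main queue before kicking off preparation and let the library's own loading machinery do the background work; a sketch, assuming media is an already-configured AVAnimatorMedia:

// prepareToAnimate is not documented as thread safe, so make sure the call
// itself happens on the main thread even if the caller is on a background queue.
dispatch_async(dispatch_get_main_queue(), ^{
  [media prepareToAnimate];
});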

arm64 issue with -no-integrated-as

When I try to archive my project in Xcode for submission, I get the error below. I think this is happening because of the -no-integrated-as flag; removing the flag causes the app not to compile, as you have already explained. When I just run (no archive) a 64-bit device in the simulator, it works.

/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/as: can't specifiy -Q with -arch arm64
clang: error: assembler command failed with exit code 1 (use -v to see invocation)
Command /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang failed with exit code 1

Crash backtrace

(lldb) bt

* thread #21: tid = 0x1be8c7, 0x0000000195213bc8 libobjc.A.dylib`objc_msgSend + 8, stop reason = EXC_BAD_ACCESS (code=1, address=0xc)
    frame #0: 0x0000000195213bc8 libobjc.A.dylib`objc_msgSend + 8
    frame #1: 0x0000000195210b94 libobjc.A.dylib`objc_setProperty_nonatomic_copy + 52
    frame #2: 0x0000000100194410 TheMove`-[AVAssetFrameDecoder setAssetURL:](self=0x00000001788c97d0, _cmd=0x0000000191acc63f, assetURL=0x000000000000000c) + 52 at AVAssetFrameDecoder.m:62
    frame #3: 0x00000001001ad188 TheMove`+[AutoPropertyRelease releaseProperties:thisClass:](self=0x0000000100923568, _cmd=0x00000001007045f6, obj=0x00000001788c97d0, thisClass=0x0000000100923090) + 2592 at AutoPropertyRelease.m:197
    frame #4: 0x0000000100190bc4 TheMove`-[AVAssetFrameDecoder dealloc](self=0x00000001788c97d0, _cmd=0x0000000189b0c52d) + 128 at AVAssetFrameDecoder.m:97
    frame #5: 0x0000000195219724 libobjc.A.dylib`(anonymous namespace)::AutoreleasePoolPage::pop(void*) + 564
    frame #6: 0x0000000100196538 TheMove`+[AVAssetJoinAlphaResourceLoader decodeThreadEntryPoint:](self=0x00000001009230e0, _cmd=0x0000000100705a96, arr=0x0000000178c6b500) + 932 at AVAssetJoinAlphaResourceLoader.m:534
    frame #7: 0x0000000185a8e60c Foundation`__NSThread__main__ + 1072
    frame #8: 0x0000000195a23e80 libsystem_pthread.dylib`_pthread_body + 164
    frame #9: 0x0000000195a23ddc libsystem_pthread.dylib`_pthread_start + 160
(lldb)

"Instruction requires : Not 64-bit mode" error on iPhone 5s simulator.

In CpuArch.c:

asm volatile("pushl %%ebx \n\t"      /* save %ebx */
             "cpuid \n\t"
             "movl %%ebx, %1 \n\t"   /* save what cpuid just put in %ebx */
             "popl %%ebx \n\t"       /* restore the old %ebx */
             : "=a"(*a), "=r"(*b), "=c"(*c), "=d"(*d)
             : "a"(function)
             : "cc");

While building for the iPhone 5s simulator, "Instruction requires: Not 64-bit mode" appears on that line. So I thought the instructions should be pushq and popq in 64-bit mode,
but then it says "invalid operand for instruction".

Well, since I don't know much asm, I thought I shouldn't do anything more and should ask about it here right away.
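
For what it's worth, GCC's own cpuid.h sidesteps this on x86_64 by swapping %rbx in and out with xchgq instead of using pushl/popl; a sketch of that shape, guarded for 64-bit builds (untested against this exact file, so treat it as an assumption):

#if defined(__x86_64__)
  /* 64-bit simulator: preserve %rbx around cpuid via xchgq.
     %q1 names the full 64-bit register backing operand 1. */
  asm volatile("xchgq %%rbx, %q1 \n\t"
               "cpuid             \n\t"
               "xchgq %%rbx, %q1 \n\t"
               : "=a"(*a), "=r"(*b), "=c"(*c), "=d"(*d)
               : "a"(function)
               : "cc");
#else
  /* existing 32-bit pushl/popl version shown above */
#endif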
