
haishinkit.swift's Introduction

Hi there 👋

haishinkit.swift's People

Contributors

adamnemecek, akolov, allenlinli, anotheren, cardoso, dawidvdh, dependabot[bot], devxoul, geeee, ibenjamin, iphong-epiens, jramer-vidflex, jvlppm, leo150, levs42, mfclarke, milesegan, mn-d128, mwawrusch, netizen01, octu0, pvinis, ra1028, shogo4405, spllr, taehyeon-kim, troy-lamerton, weitieda, wolfcon, zhuker

haishinkit.swift's Issues

Audio problem with RTMP on iPhone 6s but not iPad

First, thank you for this amazing project.

I'm trying to stream RTMP to Wowza. The video streams very well, but the audio is sent too fast and does not stay in sync with the video.

This happens on an iPhone 6s; on an iPad it works fine.

iPhone debug:

2016-05-20 23:54:27.445 [Debug] [VideoIOComponent.swift:31] VideoIOComponent > cicontext use hardware renderer
2016-05-20 23:54:29.907 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:29.935 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:29.977 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:30.019 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:30.063 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:30.106 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:30.149 [Warning] [Function.swift:7] IsNoErr > -1
2016-05-20 23:54:30.191 [Warning] [Function.swift:7] IsNoErr > -1

This warning came from AACEncoder.swift > func onInputDataForAudioConverter

iPhone print(currentBufferList)
Optional(__C.AudioBufferList(mNumberBuffers: 1, mBuffers: __C.AudioBuffer(mNumberChannels: 1, mDataByteSize: 1882, mData: 0x00000001034322c0)))

iPad print(currentBufferList)
Optional(__C.AudioBufferList(mNumberBuffers: 1, mBuffers: __C.AudioBuffer(mNumberChannels: 1, mDataByteSize: 2048, mData: 0x00000001034b62c0)))

My code:

    rtmpStream = RTMPStream(rtmpConnection: rtmpConnection)

    rtmpStream.videoSettings = [
        "profileLevel": kVTProfileLevel_H264_Baseline_5_0,
        "width": 288,
        "height": 512,
        "fps": 25,
        "bitrate": 1000000,
    ]

    rtmpStream.audioSettings = [
        "bitrate": 1000000
    ]

    rtmpStream.attachCamera(AVMixer.deviceWithPosition(.Back))
    rtmpStream.attachAudio(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio))

    rtmpStream.view.frame = CGRectMake(0, 0, preview.frame.width, preview.frame.height)
    preview.addSubview(rtmpStream.view)

    rtmpConnection.connect(streamServer)
    rtmpStream.publish(streamName)
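
If the cause is that the iPhone 6s delivers capture buffers smaller than one full AAC packet (1882 bytes above versus 2048 on the iPad), one direction worth testing is to accumulate the incoming PCM into fixed-size blocks before handing it to the converter. A minimal sketch, not lf.swift code; the 2048-byte block size simply matches the iPad output above:

    final class PCMAccumulator {
        private var pending = [UInt8]()
        private let blockSize: Int

        init(blockSize: Int = 2048) {
            self.blockSize = blockSize
        }

        // Append captured bytes and return any complete fixed-size blocks that are
        // now ready to be fed to the AAC converter.
        func append(bytes: [UInt8]) -> [[UInt8]] {
            pending += bytes
            var blocks = [[UInt8]]()
            while pending.count >= blockSize {
                blocks.append(Array(pending.prefix(blockSize)))
                pending.removeFirst(blockSize)
            }
            return blocks
        }
    }

Separately, an audio bitrate of 1000000 is far above typical AAC rates (32000-128000 is more usual) and is worth double-checking.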

camera focus problem

I've used the lf.swift project to develop my own RTMP live-broadcasting program, and I've found that the camera does not focus on the target properly, so the preview image and the video received from the Wowza server are somewhat out of focus. The same issue exists in the sample application included in the lf.swift project. I've set captureSettings:
rtmpStream!.captureSettings = [
"continuousAutofocus": true,
"continuousExposure": true,
]

Update (2016-04-29):
It was my mistake. The default sessionPreset for captureSettings is AVCaptureSessionPresetMedium, which is why the video did not look very clear; setting it to AVCaptureSessionPresetHigh solves this.
Sorry about that.
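
For anyone hitting the same thing, the fix described above amounts to something like this (assuming "sessionPreset" is one of the supported captureSettings keys, as the update implies):

rtmpStream!.captureSettings = [
    "sessionPreset": AVCaptureSessionPresetHigh,  // the default, AVCaptureSessionPresetMedium, looks soft
    "continuousAutofocus": true,
    "continuousExposure": true,
]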

RTMP connection error

Hi,

I've tried to connect to rtmp://pili-live-rtmp.skyplatanus.com/skyplatanus/56daaad95e77b0351602c712, but it always fails. I've tried making some modifications to the code, but I found there is a lot of compatibility work to do to handle the various RTMP servers out there.

For example, I've run into a problem like the one in this link.

So I think it's hard to build an RTMP client without librtmp unless you test it against a lot of servers.

Some suggestions

  1. Adaptive video bitrate support.
  2. Allow processing frames before encoding.

My idea is... (find captureOutput in AVCEncoder.swift)

let image:CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
// delegate here to allow accessing the image and apply filters, etc.
encodeImageBuffer(image, presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer), duration: CMSampleBufferGetDuration(sampleBuffer))

I will try to confirm whether this is possible.
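
A hypothetical shape for suggestion 2, sketched only for discussion; the protocol and property names below are illustrative and are not part of lf.swift:

// Hypothetical hook: expose each frame before it is encoded.
protocol VideoPreprocessingDelegate: class {
    func willEncode(imageBuffer: CVImageBufferRef, presentationTimeStamp: CMTime)
}

// Inside AVCEncoder's captureOutput, just before encodeImageBuffer is called:
// preprocessingDelegate?.willEncode(image, presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))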

Read MP4 file and stream it

Although lf.swift does not implement streaming data read from an MP4 (H.264/AAC encoded) file to a streaming server (Wowza, for example), I think it would not be difficult to implement this on top of lf.swift. I'm not familiar with the structure of an MP4 file; any advice or guide? Thanks in advance!
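
As a starting point for the file-reading half, AVAssetReader can hand back the already-encoded H.264/AAC sample buffers from an MP4; the harder part is repackaging them the way lf.swift's RTMP muxer expects. A rough sketch (written against current AVFoundation names, so the syntax is newer than the rest of this thread):

import AVFoundation

// Read compressed video samples straight out of an MP4 without re-encoding.
func readVideoSamples(from url: URL) throws {
    let asset = AVURLAsset(url: url)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    // outputSettings: nil passes the compressed (H.264) samples through untouched.
    let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    while let sampleBuffer = output.copyNextSampleBuffer() {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Hand sampleBuffer and pts to whatever packages the NAL units for RTMP,
        // pacing delivery according to the timestamps.
        _ = pts
    }
}

The same approach with .audio gives the AAC track.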

iPad 2, iPad mini 1, iPhone 5C can't send audio

The following devices are unable to stream audio to a Wowza server:
iPad mini 1 (Model A1455)
iPad 2 (A1396)
iPhone 5C (A1529)

It seems like all the old iPads have this issue.

In AACEncoder.swift it fails at the following line while converting the audio; the call returns -50 (paramErr):
status = fillComplexBuffer(&outputDataPacketSize, outOutputData: &outOutputData, outPacketDescription: nil)

Has anyone faced the same issue?

mix two audio inputs when capturing

I found there is an AVMixer in the project. Can it be used to mix two audio inputs (for instance, the iPhone's mic or a line-in audio source from the Lightning port, plus a connected Bluetooth mic) when capturing?

Setting fps to 10, output fps is still 30

I am using the master-branch demo.
Then I added this to the demo:

    rtmpStream.videoSettings = [
        "width" : 120,
        "height" : 160,
        "fps" : 10,
    ]

Then I play the output at rtmp://192.168.1.101:14000/hls/movie.

Unfortunately, the output video is still 30 fps.
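
One thing worth checking: videoSettings only configures the encoder. Assuming "fps" is also a supported captureSettings key (as it is in later HaishinKit releases), the capture frame rate has to be lowered as well, e.g.:

    rtmpStream.captureSettings = [
        "fps": 10,  // capture-side frame rate; videoSettings alone may not throttle the camera
    ]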

suggestion for HTTPStream

HTTPStream should have an attachAudio method for adding audio capture; that would be perfect for an IP camera.
I have also added some useful properties to the HTTPStream class in my project:

public var audioSettings:[String: AnyObject] {
    get { return mixer.audioIO.encoder.dictionaryWithValuesForKeys(AACEncoder.supportedSettingsKeys)}
    set { mixer.audioIO.encoder.setValuesForKeysWithDictionary(newValue) }
}

public var videoSettings:[String: AnyObject] {
    get { return mixer.videoIO.encoder.dictionaryWithValuesForKeys(AVCEncoder.supportedSettingsKeys)}
    set { mixer.videoIO.encoder.setValuesForKeysWithDictionary(newValue)}
}

public var captureSettings:[String: AnyObject] {
    get { return mixer.dictionaryWithValuesForKeys(AVMixer.supportedSettingsKeys)}
    set { dispatch_async(lockQueue) { self.mixer.setValuesForKeysWithDictionary(newValue)}}
}
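
For context, the intended usage with those additions would look roughly like this; attachAudio on HTTPStream is the method being proposed, and the HTTPService calls follow the pattern used elsewhere in lf.swift (treat this as a sketch, not working code):

let httpStream = HTTPStream()
httpStream.attachCamera(AVMixer.deviceWithPosition(.Back))
httpStream.attachAudio(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio))  // proposed API

let httpService = HTTPService(domain: "", type: "_http._tcp", name: "lf", port: 8080)
httpService.addHTTPStream(httpStream)
httpService.startRunning()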

xcodebuild clean build "CompileSwift normal i386 RTMPMessage.swift Error"

-o /Users/shogo/Library/Developer/Xcode/DerivedData/App-fjijybpfmecgbvfpfwjlyiwatljp/Build/Intermediates/Pods.build/Release-iphonesimulator/lf.build/Objects-normal/i386/RTMPMessage.o
0  swift                    0x000000010b39166b llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 43
1  swift                    0x000000010b390956 llvm::sys::RunSignalHandlers() + 70
2  swift                    0x000000010b391ccf SignalHandler(int) + 287
3  libsystem_platform.dylib 0x00007fff8a78252a _sigtramp + 26
4  libsystem_malloc.dylib   0x00007fff8d1540cc malloc + 42
5  swift                    0x0000000109376994 swift::SILCombiner::doOneIteration(swift::SILFunction&, unsigned int) + 196
6  swift                    0x0000000109377149 swift::SILCombiner::runOnFunction(swift::SILFunction&) + 297
7  swift                    0x00000001093775b5 (anonymous namespace)::SILCombine::run() + 197
8  swift                    0x00000001093c8955 swift::SILPassManager::runPassesOnFunction(llvm::ArrayRef<swift::SILFunctionTransform*>, swift::SILFunction*) + 1189
9  swift                    0x00000001093c95a4 swift::SILPassManager::runFunctionPasses(llvm::ArrayRef<swift::SILFunctionTransform*>) + 804
10 swift                    0x00000001093ca4f0 swift::SILPassManager::runOneIteration() + 608
11 swift                    0x00000001093d0515 swift::runSILOptimizationPasses(swift::SILModule&) + 213
12 swift                    0x00000001090e4579 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&) + 13193
13 swift                    0x00000001090e068d frontend_main(llvm::ArrayRef<char const*>, char const*, void*) + 2781
14 swift                    0x00000001090dc0ac main + 1932
15 libdyld.dylib            0x00007fff933b45ad start + 1
16 libdyld.dylib            0x0000000000000080 start + 1824832212

Any set orientation is overridden when RTMP publishing begins

If I set any orientation before broadcasting using rtmpStream.captureSettings["orientation"], that value is overridden when stream publishing starts. I've found that this is due to lines 409-411 in AVMixer.swift:

if let orientation:AVCaptureVideoOrientation = AVMixer.getAVCaptureVideoOrientation(UIDevice.currentDevice().orientation) {
     self.orientation = orientation
}

These lines should either be removed, or something should check if an orientation was set through captureSettings. (Pull request)
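
A possible shape for that check (sketch only; orientationSetByUser is a hypothetical flag that would be set whenever "orientation" comes in through captureSettings):

if !orientationSetByUser {
    if let orientation:AVCaptureVideoOrientation = AVMixer.getAVCaptureVideoOrientation(UIDevice.currentDevice().orientation) {
        self.orientation = orientation
    }
}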

'Simple RTMP Server' not supported

Does lf.swift support the 'Simple RTMP Server' server? I have tried it but failed. Here is my server's log:

RTMP client ip=180.168.126.117
[2016-05-24 10:44:40.251][trace][28905][3629] simple handshake success.
[2016-05-24 10:44:40.323][trace][28905][3629] connect app, tcUrl=rtmp://swiftc.org/live/livestream, pageUrl=, swfUrl=, schema=rtmp, vhost=**defaultVhost**, port=1935, app=live/livestream, args=null
[2016-05-24 10:44:40.323][trace][28905][3629] out chunk size to 60000
[2016-05-24 10:44:40.503][trace][28905][3629] input chunk size to 1024
[2016-05-24 10:44:40.503][trace][28905][3629] identify ignore messages except AMF0/AMF3 command message. type=0x13
[2016-05-24 10:44:40.503][trace][28905][3629] identify ignore messages except AMF0/AMF3 command message. type=0x13
[2016-05-24 10:44:40.504][trace][28905][3629] identify ignore messages except AMF0/AMF3 command message. type=0x13
[2016-05-24 10:44:40.577][trace][28905][3629] client identified, type=flash-publish), stream_name=test, duration=-1.00
[2016-05-24 10:44:40.578][trace][28905][3629] source url=/live/livestream/test, ip=180.168.126.117, cache=1, is_edge=0, source_id=-1[-1]
[2016-05-24 10:44:40.588][trace][28905][3629] start publish mr=0/350, p1stpt=20000, pnt=20000, tcp_nodelay=0, rtcid=3630
[2016-05-24 10:44:40.663][trace][28905][3629] 4B audio sh, codec(10, profile=LC, 2channels, 0kbps, 44100HZ), flv(16bits, 2channels, 44100HZ)
[2016-05-24 10:44:41.227][trace][28905][3629] 39B video sh,  codec(7, profile=Baseline, level=3.1, 640x384, 0kbps, 0fps, 0s)
[2016-05-24 10:44:44.175][trace][28905][118] -> PLA time=3089245661, msgs=0, okbps=0,0,0, ikbps=0,0,0, mw=350
[2016-05-24 10:44:51.268][error][28905][3629][11] chunk stream is fresh, fmt must be 0, actual is 2. cid=10, ret=2001(Resource temporarily unavailable)
[2016-05-24 10:44:51.268][error][28905][3629][11] read message header failed. ret=2001(Resource temporarily unavailable)
[2016-05-24 10:44:51.268][error][28905][3629][11] recv interlaced message failed. ret=2001(Resource temporarily unavailable)
[2016-05-24 10:44:51.268][error][28905][3629][11] thread process message failed. ret=2001(Resource temporarily unavailable)
[2016-05-24 10:44:51.268][warn][28905][3629][11] thread recv cycle failed, ignored and retry, ret=2001
[2016-05-24 10:44:51.269][error][28905][3629][11] recv thread failed. ret=2001(Resource temporarily unavailable)
[2016-05-24 10:44:51.269][trace][28905][3629] cleanup when unpublish
[2016-05-24 10:44:51.269][error][28905][3629][11] stream service cycle failed. ret=2001(Resource temporarily unavailable)

Adobe authentication for RTMP not implemented

Here is my temporary solution:

  • In RTMPConnection.swift, add the following to rtmpStatusHandler(notification: NSNotification):
case "NetConnection.Connect.Rejected":
                    if self.user == "" || self.passwd == "" {
                        break
                    }
                    let cc = (data["description"] as! String).componentsSeparatedByString("?")
                    if (cc.count < 2) {
                        reconnect("/?authmod=adobe&user=\(self.user)")
                    } else if (cc.count > 1) {
                        let p = cc[1] as String
                        let params = p.componentsSeparatedByString("&")

                        var salt = "", challenge = "", opaque = ""
                        for i in 0..<params.count {
                            let pa = params[i]
                            let sIdx = pa.startIndex
                            if pa.substringToIndex(sIdx.advancedBy(5)) == "salt=" {
                                salt = pa.substringFromIndex(sIdx.advancedBy(5))
                            } else if pa.substringToIndex(sIdx.advancedBy(10)) == "challenge=" {
                                challenge = pa.substringFromIndex(sIdx.advancedBy(10))
                            } else if pa.substringToIndex(sIdx.advancedBy(7)) == "opaque=" {
                                opaque = pa.substringFromIndex(sIdx.advancedBy(7))
                            }
                        }

                        let newParams = adobe_auth(self.user, password: self.passwd, salt: salt, opaque: opaque, challenge: challenge)
                        print(newParams)
                        reconnect(newParams)
                    }

Then add user and passwd properties and an adobe_auth method as follows:

func adobe_auth(user:String, password:String, salt:String, opaque:String, challenge:String) -> String {
        var hashstr:String = ""
        let challenge2 = String(format: "%08x", random())

        hashstr = rawMD5Base64("\(user)\(salt)\(password)")
        if challenge != "" || opaque != "" {
            if opaque != "" {
                hashstr = "\(hashstr)\(opaque)"
            } else if challenge != "" {
                hashstr = "\(hashstr)\(challenge)"
            }
        }
        hashstr = rawMD5Base64("\(hashstr)\(challenge2)")

        var q = "/?authmod=adobe&user=\(user)&challenge=\(challenge2)&response=\(hashstr)"

        if opaque != "" {
            q = "\(q)&opaque=\(opaque)"
        }


        return q
    }

    private func rawMD5Base64(s:String) -> String {
        let data = s.dataUsingEncoding(NSUTF8StringEncoding)!
        var digest = [UInt8](count:Int(CC_MD5_DIGEST_LENGTH), repeatedValue: 0)
        CC_MD5(data.bytes, CC_LONG(data.length), &digest)
        let ret = NSData(bytes: digest, length: Int(CC_MD5_DIGEST_LENGTH)).base64EncodedStringWithOptions(NSDataBase64EncodingOptions(rawValue: 0))
        return ret
    }

The last method also needs this header:

#import <CommonCrypto/CommonCrypto.h>

I added these two methods, one to connect with auth and one to reconnect:

public func connectWithUserAndPasswd(command: String, user: String, passwd: String, arguments: Any?...) {
        if let url:NSURL = NSURL(string: command) {
            self.user = user
            self.passwd = passwd
            orgUri = command // orgUri is an extra member variable
            _uri = command
            self.arguments = arguments
            addEventListener(Event.RTMP_STATUS, selector: "rtmpStatusHandler:")
            socket.connect(url.host!, port: url.port == nil ? RTMPConnection.defaultPort : UInt32(url.port!.intValue))
        }
    }

    public func reconnect(query: String) {
        if let url:NSURL = NSURL(string: orgUri + query) {
            _uri = orgUri + query
            socket.connect(url.host!, port: url.port == nil ? RTMPConnection.defaultPort : UInt32(url.port!.intValue))
        }
    }
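
With the pieces above in place, the call site changes from connect to the new helper, e.g. (the URL and credentials are placeholders):

rtmpConnection.connectWithUserAndPasswd("rtmp://example.com/live", user: "publisher", passwd: "secret")
rtmpStream.publish(streamName)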

weird crash

rtmpStream.videoSettings = [
"bitrate": 500 * 1000,
"width": 540,
"height": 720,
"aspectRatio16by9": false,
]

If I remove "width": 540, there is no crash.

Video aspect ratio does not change when the rotation changes

override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()

        let ori = UIDevice.currentDevice().orientation
        if ori == UIDeviceOrientation.Portrait {
            self.rtStream?.videoSettings = [
                "width": 272,
                "height": 480
            ]
        } else {
            self.rtStream?.videoSettings = [
                "width": 480,
                "height": 272
            ]
        }
}

I'm not sure if this is the right way to do it.

ScreenCaptureSession error

I use the following code to broadcast a screen capture:
self.rtmpConnection = RTMPConnection()
self.rtmpStream = RTMPStream(rtmpConnection: rtmpConnection!)
rtmpStream!.view.videoGravity = AVLayerVideoGravityResizeAspectFill

rtmpStream!.attachAudio(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio))
self.screen = ScreenCaptureSession()
rtmpStream!.attachScreen(self.screen)
rtmpConnection!.connect("rtmp://192.168.1.1/live")
rtmpStream!.publish("demotea")

When the app runs on an iPhone 6 Plus, an error occurs:

2016-04-22 22:24:20.053 [Debug] [VideoIOComponent.swift:31] VideoIOComponent > cicontext use hardware renderer
2016-04-22 22:24:20.098 [Info] [ScreenCaptureSession.swift:32] ScreenCaptureSession > cicontext use hardware renderer
2016-04-22 22:24:21.137 [Warning] [Function.swift:7] IsNoErr > -12902
2016-04-22 22:24:21.184 [Error] [AVCEncoder.swift:103] AVCEncoder > status = -12902
2016-04-22 22:24:21.373 [Error] [AVCEncoder.swift:103] AVCEncoder > status = -12902
2016-04-22 22:24:21.590 [Error] [AVCEncoder.swift:103] AVCEncoder > status = -12902
2016-04-22 22:24:21.746 [Error] [AVCEncoder.swift:103] AVCEncoder > status = -12902
2016-04-22 22:24:21.901 [Error] [AVCEncoder.swift:103] AVCEncoder > status = -12902

Any suggestions?
The same app runs fine on my iPhone 5S.

bug in RTMPStream

An observer is added for keyPath "frame":

_view!.addObserver(self, forKeyPath: "frame", options: NSKeyValueObservingOptions.New, context: nil)

but the deinit tries to remove an observer for keyPath "bounds":

    deinit {
        _view?.removeObserver(self, forKeyPath: "bounds")
    }

This raises an error.
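
In other words, the keyPath used when adding and removing the observer needs to match; for example:

_view!.addObserver(self, forKeyPath: "frame", options: NSKeyValueObservingOptions.New, context: nil)

deinit {
    _view?.removeObserver(self, forKeyPath: "frame")
}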

Audiobus Support

Audiobus interfaces [your new] apps with one another and with system audio via iOS APIs and their very cool framework, allowing multiple routes from the same source (e.g., the same system mic input stream can be piped/redirected to any number of Audiobus effects, apps, and system audio outputs). Audiobus is akin to iOS Inter-App Audio input/output.

Noise on iPhone 6s

I'm using it with Wowza, but there is some noise; it isn't loud, but it's there.
I changed the bitrate, but it doesn't matter; there is still noise.

HTTPStream corruption

I'm using lf.framework (downloaded on 10 May) to live-broadcast an HTTPStream. At first the app runs fine and I can get the stream from http://ip.address:8080/hls/playlist.m3u8, but after broadcasting for several minutes the app crashes, stopping at the following statement within the writeSampleBuffer method (TSWriter.swift):
currentFileHandle?.writeData(NSData(bytes: packet.bytes))

console message:
*** Terminating app due to uncaught exception 'NSFileHandleOperationException', reason: '*** -[NSConcreteFileHandle writeData:]: Resource temporarily unavailable'
*** First throw call stack:
(0x1814a2e38 0x180b07f80 0x1814a2d80 0x181e290dc 0x181e299cc 0x10017f324 0x100183550 0x1001835d8 0x1001bc0dc 0x1001bc508 0x10020a830 0x10020a8d0 0x187bb7408 0x187bb72cc 0x1838604a4 0x18387c188 0x100b5da3c 0x100b75de0 0x100b5ffe0 0x100b6a770 0x100b6172c 0x100b6a770 0x100b6172c 0x100b6c66c 0x100b6c364 0x181105470 0x181105020)
libc++abi.dylib: terminating with uncaught exception of type NSException

Dynamic content VisualEffect

I want to add a VisualEffect (an image) whose content changes from time to time and can be updated from a Timer handler; for instance, I want to display an elapsed-time label (which can be converted to a CIImage) on the stream. How can I implement this? Thanks!
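
One way this could be approached (a sketch only, assuming VisualEffect exposes an execute override point the way the bundled Pronama example does; the exact signature may differ between lf.swift versions):

final class ElapsedTimeEffect: VisualEffect {
    // Replace this from a Timer handler with a freshly rendered label image.
    var label: CIImage?

    override func execute(image: CIImage) -> CIImage {
        // Composite the most recent label over the camera frame.
        guard let label = label else { return image }
        return label.imageByCompositingOverImage(image)
    }
}

Registering the effect would then follow whatever the example uses to install the Pronama effect.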

Does lf support RTMP Playback?

I saw a "RTMPStream.play" Method in the source file, but I don't know how to use it, when I call it, application always crashes.

Preview not showing

import UIKit
import lf
import AVFoundation

class ViewController: UIViewController {

    var rtmpStream: RTMPStream!
    var rtmpConnection: RTMPConnection!

    override func viewDidLoad() {
        super.viewDidLoad()

        rtmpConnection = RTMPConnection()
        rtmpStream = RTMPStream(rtmpConnection: rtmpConnection)
        rtmpStream.videoGravity = AVLayerVideoGravityResizeAspectFill
        rtmpStream.attachAudio(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio))
        rtmpStream.attachCamera(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo))
        rtmpConnection.connect(url)
        rtmpStream.publish("ohyeahtest0001")
        self.view.addSubview(rtmpStream.view)
        rtmpStream.view.pinToEdgesOfSuperview()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }


}

MonaServer (RTMP ver 3) not supported

RTMP/RTMPSession.cpp[201] RTMPSession RTMP session 1 indicates a non-existent 3 FlashStream

Server code:

    // get flash stream process engine related by signature
    if (signature.size() > 4 && signature.compare(0, 5, "\x00\x54\x43\x04\x00", 5) == 0) // NetConnection
        pFlow = new RTMFPFlow(id, signature, peer, invoker, *this, _pMainStream);
    else if (signature.size()>3 && signature.compare(0, 4, "\x00\x54\x43\x04", 4) == 0) { // NetStream
        shared_ptr<FlashStream> pStream;
        UInt32 idSession(BinaryReader((const UInt8*)signature.c_str() + 4, signature.length() - 4).read7BitValue());
        if (!_pMainStream->getStream(idSession,pStream)) {
            ERROR("RTMFPFlow ",id," indicates a non-existent ",idSession," NetStream on session ",name());
            return NULL;
        }
        pFlow = new RTMFPFlow(id, signature, pStream, peer, invoker, *this);
    } else if(signature.size()>2 && signature.compare(0,3,"\x00\x47\x43",3)==0)  // NetGroup
        pFlow = new RTMFPFlow(id, signature, peer, invoker, *this, _pMainStream);

AMF3 failing handshake

I am trying to connect to Adobe's shared ball example project: https://helpx.adobe.com/adobe-media-server/dev/sharedball-example.html

I have to change RTMPConnection's defaultObjectEncoding from 0x00 to 0x03, otherwise the server never responds.

When the handshake completes and the delegate method "Listen" is called, deserialization fails in RTMPMessage at "commandName = serializer.deserialize(newValue, position: &position)"; it hits an assertion failure because the value is not a string.

RTMPChunk{type:0,streamId:3,message:Optional(RTMPCommandMessage{type:AMF3Command,length:0,streamId:3,timestamp:0,commandName:connect,transactionId:1,commandObject:Optional(["tcUrl": Optional("rtmp://192.168.1.25/SharedBall"), "flashVer": Optional("AND 19,0,0,169"), "swfUrl": Optional("rtmp://192.168.1.25/SharedBall"), "app": Optional("SharedBall"), "fpad": Optional(false), "audioCodecs": Optional(1024), "videoCodecs": Optional(128), "videoFunction": Optional(1), "capabilities": Optional(239), "pageUrl": nil, "objectEncoding": Optional(3)]),arguments:[nil]})}

The byte array after the header has been removed is below; it starts with a 2 (AMF3 Bool false) rather than 6 (AMF3 String), which is what I think the deserializer is expecting.

  [2, 0, 6, 95, 101, 114, 114, 111, 114, 0, 65, 112, 80, 0, 0, 0, 0, 0, 5, 3,
   0, 5, 108, 101, 118, 101, 108, 2, 0, 5, 101, 114, 114, 111, 114, 0, 4, 99, 111, 100,
   101, 2, 0, 30, 78, 101, 116, 67, 111, 110, 110, 101, 99, 116, 105, 111, 110, 46, 67, 111,
   110, 110, 101, 99, 116, 46, 82, 101, 106, 101, 99, 116, 101, 100, 0, 11, 100, 101, 115, 99,
   114, 105, 112, 116, 105, 111, 110, 2, 0, 38, 91, 32, 83, 101, 114, 118, 101, 114, 46, 82,
   101, 106, 101, 99, 116, 32, 93, 32, 58, 32, 67, 111, 110, 110, 101, 99, 116, 105, 111, 110,
   32, 102, 97, 105, 108, 101, 100, 46, 195, 0, 0]

(Read as ASCII where printable, the payload contains "_error", "level", "error", "code", "NetConnection.Connect.Rejected", "description", and "[ Server.Reject ] : Connection failed.")

NAL packaging

I'm a newbie in the field of video streaming; please help me understand your code in the method sampleOutput(video sampleBuffer: CMSampleBuffer) in RTMPMuxer.swift:

  1. The data in the buffer is formed from a header plus the encoded video data taken from the CMBlockBuffer inside the CMSampleBuffer.
  2. The header is 5 bytes long; byte[0] is either 0x17 (if it's a keyframe) or 0x27 (if it's not a keyframe).
  3. byte[1] is 1 (NAL type).
  4. From the debug window I can see that byte[2]..byte[4] are always 0x000000.
  5. So every buffer's data (in hex) is:
    1701000000 followed by encoded H.264 video data
    or
    2701000000 followed by encoded H.264 video data
    I know that the encoded video buffer should be converted to NALUs before being sent to the stream, and that the NALUs should be in Annex B format (3- or 4-byte start codes) or AVCC format (4-byte length prefixes). So I don't understand why the 5-byte header is added before the encoded H.264 video data; would you please explain this to me (see the sketch after this list)? Thank you very much!
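
For what it's worth, that 5-byte prefix matches the FLV VideoData tag header that RTMP carries in front of AVCC (length-prefixed) NAL units, which is why no Annex B start codes appear. A sketch of the layout (FLV spec fields, not lf.swift code):

func flvVideoTagHeader(keyframe: Bool, compositionTime: Int32 = 0) -> [UInt8] {
    // byte 0: frame type (1 = key, 2 = inter) in the high nibble, codec id 7 (AVC) in the low nibble
    let frameTypeAndCodec: UInt8 = keyframe ? 0x17 : 0x27
    // byte 1: AVCPacketType (0 = sequence header, 1 = NALU)
    let avcPacketType: UInt8 = 1
    // bytes 2-4: composition time offset, big-endian (0x000000 when PTS == DTS)
    return [frameTypeAndCodec,
            avcPacketType,
            UInt8((compositionTime >> 16) & 0xff),
            UInt8((compositionTime >> 8) & 0xff),
            UInt8(compositionTime & 0xff)]
}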

muted streaming

Do you think I can simply not attach audio, in order to stream without sound?

Crash during YouTube Live

2016-05-24 23:58:01.261 [Verbose] [RTMPSocket.swift:38] doOutput(chunk:) > RTMPChunk{size:0,type:One,streamId:4,message:Optional(RTMPAudioMessage{type:Audio,length:0,streamId:1,timestamp:21,payload:[175, 1, 1, 24, 20, 169, 208, 118, 74, 12, 16, 64, 68, 8, 54, 67, 11, 169, 112, 93, 124, 23, 48, 123, 182, 247, 44, 29, 246, 153, 242, 65, 91, 145, 164, 86, 83, 75, 197, 76, 167, 117, 137, 175, 73, 135, 82, 29, 70, 219, 168, 90, 236, 176, 215, 93, 103, 119, 4, 141, 110, 76, 80, 138, 203, 151, 54, 206, 57, 138, 145, 188, 205, 220, 87, 103, 211, 17, 51, 11, 81, 3, 20, 151, 132, 248, 192, 1, 96, 10, 111, 30, 199, 207, 225, 182, 88, 194, 87, 108, 45, 163, 89, 228, 93, 38, 164, 13, 121, 237, 224, 130, 42, 110, 241, 68, 157, 69, 90, 82, 195, 41, 56, 149, 133, 155, 188, 48, 6, 35, 64, 212, 11, 70, 197, 179, 11, 105, 171, 59, 10, 173, 35, 22, 178, 25, 230, 88, 165, 196, 98, 163, 172, 239, 240, 245, 140, 213, 32, 164, 19, 80, 40, 95, 94, 231, 66, 84, 107, 99, 27, 93, 248],config:nil,codec:Unknown,soundR
2016-05-24 23:58:01.378 Example MacOS[5398:633050] Dropping audio sample buffer 0x102817590 for output
2016-05-24 23:58:01.322 [Error] [RTMPMessage.swift:59] create > 198
2016-05-26 06:48:21.468 [Error] [RTMPMessage.swift:59] create > 23
2016-05-26 06:48:21.562 Example MacOS[11017:1396736] Dropping audio sample buffer 0x100b9c350 for output <AVCaptureAudioDataOutput: 0x60800003edc0> connection <AVCaptureConnection: 0x608000008e90 [type:soun][enabled:1][active:1]>
2016-05-26 06:48:21.563 [Error] [RTMPChunk.swift:213] bytes > [130, 0, 0, 0, 0, 4, 2, 23]

pod trunk push --verbose error

/var/folders/x2/_jy7nvq93_q6clqh396jyjk00000gn/T/CocoaPods/Lint/build/Pods.build/Release-iphonesimulator/lf.build/Objects-normal/x86_64/RTMPMessage.swiftdeps -o /var/folders/x2/_jy7nvq93_q6clqh396jyjk00000gn/T/CocoaPods/Lint/build/Pods.build/Release-iphonesimulator/lf.build/Objects-normal/x86_64/RTMPMessage.o
0 swift 0x000000010eb004eb llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 43
1 swift 0x000000010eaff7d6 llvm::sys::RunSignalHandlers() + 70
2 swift 0x000000010eb00b4f SignalHandler(int) + 287
3 libsystem_platform.dylib 0x00007fff91e2f52a _sigtramp + 26
4 libsystem_platform.dylib 0x00007fff00010000 _sigtramp + 1847462640
5 swift 0x000000010cae7144 swift::SILCombiner::doOneIteration(swift::SILFunction&, unsigned int) + 196
6 swift 0x000000010cae78f9 swift::SILCombiner::runOnFunction(swift::SILFunction&) + 297
7 swift 0x000000010cae7d65 (anonymous namespace)::SILCombine::run() + 197
8 swift 0x000000010cb39145 swift::SILPassManager::runPassesOnFunction(llvm::ArrayRef<swift::SILFunctionTransform*>, swift::SILFunction*) + 1189
9 swift 0x000000010cb39d94 swift::SILPassManager::runFunctionPasses(llvm::ArrayRef<swift::SILFunctionTransform*>) + 804
10 swift 0x000000010cb3ace0 swift::SILPassManager::runOneIteration() + 608
11 swift 0x000000010cb40d05 swift::runSILOptimizationPasses(swift::SILModule&) + 213
12 swift 0x000000010c855309 performCompile(swift::CompilerInstance&, swift::CompilerInvocation&, llvm::ArrayRef<char const*>, int&) + 13193
13 swift 0x000000010c85141d frontend_main(llvm::ArrayRef<char const*>, char const*, void*) + 2781
14 swift 0x000000010c84ce3c main + 1932
15 libdyld.dylib 0x00007fff92bce5ad start + 1

The following build commands failed:
CompileSwift normal x86_64 /var/folders/x2/_jy7nvq93_q6clqh396jyjk00000gn/T/CocoaPods/Lint/Pods/lf/lf/RTMP/RTMPMessage.swift
CompileSwiftSources normal x86_64 com.apple.xcode.tools.swift.compiler
(2 failures)
-> lf (0.2)
- ERROR | [iOS] xcodebuild: Returned an unsuccessful exit code.

[!] The podspec does not validate.

Failing to connect to Shared Object

I have noticed the following when trying to connect to a shared object:

RTMPChunk{type:0,streamId:3,message:Optional(RTMPSharedObjectMessage{sharedObjectName:table_1,currentVersion:0,flags:[1, 0, 0, 0, 0, 0, 0, 0],events:[Event{type:1,name:nil,data:nil}]})}

The server responds with

RTMPChunk{type:0,streamId:2,message:Optional(RTMPSetChunkSizeMessage{type:1,length:4,streamId:0,timestamp:0,payload:4})}

This is then parsed as the shared object response, and an "Array index out of range" error occurs in RTMPSharedObjectMessage at the following line:
sharedObjectName = String(bytes: Array(newValue[2..<position]), encoding: NSUTF8StringEncoding)!

For a lot of the outgoing messages I have found that I have to set the chunk stream id to 2 to get the server to accept them; if I do that with this message, the server disconnects.

Mute audio

Hello @shogo4405 and thank you for an awesome framework.
Is there any way to mute audio while streaming (sending silent AAC packets)?

VisualEffect performance issue

I run the iOS example built from the lf.swift project on my iPhone 6 Plus. The example runs OK, but when I use its Pronama feature (by pressing the Pronama segment), the preview video and the streamed video are no longer smooth. Can the project be improved to solve this?

Question: Playback

I need to make a video-conference app (low latency).
In a player, can I play back the video without using HLS?

ERROR: Zero sized function

Wowza Streaming Engine
FPS=15, RTMPConnection.defaultChunkSizeS = 1024

ERROR   server  comment 2016-05-31  02:27:11    -   -   -   -   -   76361.583   -   -   -   -   -   -   -   -   Zero sized function (client:599303301:FME/3.0 (compatible; FMSc/1.0)): type:20 size:0:
ERROR   server  comment 2016-05-31  02:27:11    -   -   -   -   -   76361.584   -   -   -   -   -   -   -   -   Zero sized function (client:599303301:FME/3.0 (compatible; FMSc/1.0)): type:20 size:0:
ERROR   server  comment 2016-05-31  02:27:11    -   -   -   -   -   76361.585   -   -   -   -   -   -   -   -   Zero sized function (client:599303301:FME/3.0 (compatible; FMSc/1.0)): type:20 size:0:
ERROR   server  comment 2016-05-31  02:46:16    -   -   -   -   -   77507.262   -   -   -   -   -   -   -   -   Zero sized function (client:697395480:FME/3.0 (compatible; FMSc/1.0)): type:20 size:0:
