
nativescript-audio's Introduction

NativeScript plugin to play and record audio files for Android and iOS.



Installation

NativeScript 7+:

npm install nativescript-audio

NativeScript versions prior to 7 (pin the plugin to a release that supports your NativeScript version):

tns plugin add nativescript-audio@<version>


Android Native Classes

  • Player: android.media.MediaPlayer
  • Recorder: android.media.MediaRecorder

iOS Native Classes

  • Player: AVAudioPlayer
  • Recorder: AVAudioRecorder

Permissions

iOS

If you are using the recording function, you will need to grant permission on iOS for the device to access the microphone. If you don't, your app may crash on device and/or be rejected during Apple's review. To do this, add this key to your app/App_Resources/iOS/Info.plist file:

<key>NSMicrophoneUsageDescription</key>
<string>Recording Practice Sessions</string>

Android

If you are going to use the recorder capability for Android, you need to add the RECORD_AUDIO permission to your AndroidManifest.xml file located in App_Resources.

    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
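
On Android 6.0 and later, RECORD_AUDIO is also a runtime permission, so request it before starting a recording. Below is a minimal sketch using the recorder's own permission helpers documented in the Recorder API further down; the ensureMicPermission name and the logging are illustrative, not part of the plugin.

import { TNSRecorder } from 'nativescript-audio';

const recorder = new TNSRecorder();

// Resolves to true once the RECORD_AUDIO runtime permission is available (Android only;
// on iOS the system shows the microphone prompt when recording starts).
async function ensureMicPermission(): Promise<boolean> {
  if (recorder.hasRecordPermission()) {
    return true;
  }
  try {
    // Shows the system permission dialog; the promise rejects if the user denies it.
    await recorder.requestRecordPermission();
    return true;
  } catch (err) {
    console.log('RECORD_AUDIO permission denied', err);
    return false;
  }
}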

Usage

TypeScript Example

import { TNSPlayer } from 'nativescript-audio';
// isIOS is used by the metering example below (NativeScript 7+; on older versions import it from 'tns-core-modules/platform')
import { isIOS } from '@nativescript/core';

export class YourClass {
  private _player: TNSPlayer;

  constructor() {
    this._player = new TNSPlayer();
    // You can pass a duration hint to control the behavior of other applications that may
    // be holding audio focus.
    // For example: new TNSPlayer(AudioFocusDurationHint.AUDIOFOCUS_GAIN_TRANSIENT);
    // Then when you play a song, the previous owner of the
    // audio focus will stop. When your song stops
    // the previous holder will resume.
    this._player.debug = true; // set true to enable TNSPlayer console logs for debugging.
    this._player
      .initFromFile({
        audioFile: '~/assets/song.mp3', // ~ = app directory
        loop: false,
        completeCallback: this._trackComplete.bind(this),
        errorCallback: this._trackError.bind(this)
      })
      .then(() => {
        this._player.getAudioTrackDuration().then(duration => {
          // iOS: duration is in seconds
          // Android: duration is in milliseconds
          console.log(`song duration:`, duration);
        });
      });
  }

  public togglePlay() {
    if (this._player.isAudioPlaying()) {
      this._player.pause();
    } else {
      this._player.play();
    }
  }

  private _trackComplete(args: any) {
    console.log('reference back to player:', args.player);
    // iOS only: flag indicating if playback completed successfully
    console.log('whether song play completed successfully:', args.flag);
  }

  private _trackError(args: any) {
    console.log('reference back to player:', args.player);
    console.log('the error:', args.error);
    // Android only: extra detail on error
    console.log('extra info on the error:', args.extra);
  }

  // Example method for watching audio meters and converting Android's arbitrary amplitude values
  // to something close to dB. iOS reports values from -120 to 0; Android reports values from 0 to
  // roughly 37000. The method below converts those values to something close to a percentage;
  // tweak the 0.1 reference value to your liking. handleMeterUI then uses that value to pulse a
  // circle bigger and smaller, representing the audio level.
  private _initMeter() {
    this._resetMeter();
    // _win, _recorder, audioMeter and _meterInterval are members of the surrounding class (not shown here).
    this._meterInterval = this._win.setInterval(() => {
      this.audioMeter = this._recorder.getMeters();
      if (isIOS) {
        this.handleMeterUI(this.audioMeter + 200);
      } else {
        let db = 20 * Math.log10(parseInt(this.audioMeter, 10) / 0.1);
        let percentage = db + 85;
        this.handleMeterUI(percentage);
      }
    }, 150);
  }

  handleMeterUI(percentage) {
    let scale = percentage / 100;

    function map_range(value, in_low, in_high, out_low, out_high) {
      return out_low + ((out_high - out_low) * (value - in_low)) / (in_high - in_low);
    }
    let lerpScale = map_range(scale, 1.2, 1.9, 0.1, 2.1);
    if (scale > 0) {
      this.levelMeterCircleUI
        .animate({
          scale: { x: lerpScale, y: lerpScale },
          duration: 100
        })
        .then(() => {})
        .catch(() => {});
    }
    if (lerpScale > 2.2) {
      this.levelBgColor = 'rgba(255, 0, 0, 1)';
    } else {
      this.levelBgColor = 'rgb(0, 183, 0)';
    }
  }
}

JavaScript Example:

const audio = require('nativescript-audio');

const player = new audio.TNSPlayer();
const playerOptions = {
  audioFile: 'http://some/audio/file.mp3',
  loop: false,
  completeCallback: function () {
    console.log('finished playing');
  },
  errorCallback: function (errorObject) {
    console.log(JSON.stringify(errorObject));
  },
  infoCallback: function (args) {
    console.log(JSON.stringify(args));
  }
};

player
  .playFromUrl(playerOptions)
  .then(res => {
    console.log(res);
  })
  .catch(err => {
    console.log('something went wrong...', err);
  });
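
Recording Example (sketch)

The examples above only show playback; the sketch below records for a few seconds and then stops, based on the Recorder API documented in the next section. It is a hedged example: the file path, the five-second timeout, and the m4a extension are illustrative choices, not requirements of the plugin.

import { TNSRecorder, AudioRecorderOptions } from 'nativescript-audio';
import { knownFolders } from '@nativescript/core';

const recorder = new TNSRecorder();
recorder.debug = true; // enable TNSRecorder console logs while testing

export async function recordFiveSeconds() {
  if (!TNSRecorder.CAN_RECORD()) {
    console.log('This device cannot record audio');
    return;
  }

  const options: AudioRecorderOptions = {
    // Illustrative path; pick any writable location for your app.
    filename: `${knownFolders.documents().path}/recording.m4a`,
    metering: true,
    infoCallback: (info: any) => console.log('recorder info:', JSON.stringify(info)),
    errorCallback: (err: any) => console.log('recorder error:', JSON.stringify(err))
  };

  await recorder.start(options);
  setTimeout(async () => {
    await recorder.stop();
    await recorder.dispose(); // free native resources when done with the recorder
    console.log('saved recording to', options.filename);
  }, 5000);
}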

API

Recorder

TNSRecorder Methods

| Method | Description |
| --- | --- |
| TNSRecorder.CAN_RECORD(): boolean (static) | Determine if the device is ready to record. |
| start(options: AudioRecorderOptions): Promise<void> | Start recording to file. |
| stop(): Promise<void> | Stop recording. |
| pause(): Promise<void> | Pause recording. |
| resume(): Promise<void> | Resume recording. |
| dispose(): Promise<void> | Free up system resources when done with the recorder. |
| getMeters(channel?: number): number | Returns the amplitude of the input. |
| isRecording(): boolean (iOS only) | Returns true if the recorder is actively recording. |
| requestRecordPermission(): Promise<void> (Android only) | Resolves the promise if the user grants the permission. |
| hasRecordPermission(): boolean (Android only) | Returns true if the RECORD_AUDIO permission has been granted. |

TNSRecorder Instance Properties

| Property | Description |
| --- | --- |
| ios | Get the native AVAudioRecorder class instance. |
| android | Get the native MediaRecorder class instance. |
| debug | Set to true to enable debugging console logs (default false). |

TNSRecorder AudioRecorderOptions

| Property | Type | Description |
| --- | --- | --- |
| filename | string | Gets or sets the recorded file name. |
| source | int | Android only. Sets the source for recording. Learn more at https://developer.android.com/reference/android/media/MediaRecorder.AudioSource |
| maxDuration | int | Gets or sets the max duration of the recording session, in milliseconds (Android's format). Converted appropriately for iOS. |
| metering | boolean | Enables metering, which lets you inspect the audio level by calling the recorder instance's getMeters method. Returns dB on iOS but an arbitrary amplitude number on Android; see the metering example above for one way to convert the Android output to something resembling dB. |
| format | int or enum | The audio format to record in. On Android, use these enums: https://developer.android.com/reference/android/media/AudioFormat#ENCODING_PCM_16BIT. On iOS, use these format options: https://developer.apple.com/documentation/coreaudiotypes/1572096-audio_format_identifiers |
| channels | int | Number of channels to record (mono or stereo). |
| sampleRate | int | The sample rate to record in. Default: 44100. |
| bitRate | int | Android only. The bitrate to record in; iOS calculates this automatically based on the iosAudioQuality flag. Default: 128000. |
| encoder | int or enum | Android only. Use https://developer.android.com/reference/android/media/MediaRecorder.AudioEncoder#AAC |
| iosAudioQuality | string | iOS uses AVAudioQuality to determine the encoder and bitrate. Accepts Min, Low, Medium, High, Max. https://developer.apple.com/documentation/avfaudio/avaudioquality |
| errorCallback | function | Gets or sets the callback invoked when an error occurs with the media recorder. Receives an object containing the native values for the error callback. |
| infoCallback | function | Gets or sets the callback invoked to communicate info and/or warnings about the media or its playback. Receives an object containing the native values for the info callback. |
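
Because several of these options are platform-specific, they are often assembled per platform. A hedged sketch follows; the source/encoder constants shown are common choices (the table above links the full enum lists), and the format option is omitted here since the right value depends on the container you want.

import { AudioRecorderOptions } from 'nativescript-audio';
import { isAndroid } from '@nativescript/core';

declare const android: any; // Android native classes are available globally at runtime

export function buildRecorderOptions(path: string): AudioRecorderOptions {
  const options: AudioRecorderOptions = {
    filename: path,
    metering: true,
    sampleRate: 44100
  };

  if (isAndroid) {
    // Android-only knobs; see the MediaRecorder.AudioSource / AudioEncoder links above.
    options.source = android.media.MediaRecorder.AudioSource.MIC;
    options.encoder = android.media.MediaRecorder.AudioEncoder.AAC;
    options.bitRate = 128000;
  } else {
    // iOS derives the encoder and bitrate from the quality flag.
    options.iosAudioQuality = 'High';
  }
  return options;
}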

Player

TNSPlayer Methods

| Method | Description |
| --- | --- |
| initFromFile(options: AudioPlayerOptions): Promise | Initialize player instance with a file without auto-playing. |
| playFromFile(options: AudioPlayerOptions): Promise | Auto-play from a file. |
| initFromUrl(options: AudioPlayerOptions): Promise | Initialize player instance from a URL without auto-playing. |
| playFromUrl(options: AudioPlayerOptions): Promise | Auto-play from a URL. |
| pause(): Promise<boolean> | Pause playback. |
| resume(): void | Resume playback. |
| seekTo(time: number): Promise<boolean> | Seek to a position in the track (in seconds). |
| dispose(): Promise<boolean> | Free up resources when done playing audio. |
| isAudioPlaying(): boolean | Determine if the player is playing. |
| getAudioTrackDuration(): Promise<string> | Duration of the media file assigned to the player. |
| playAtTime(time: number): void (iOS only) | Play the audio track at a specific time within its duration. |
| changePlayerSpeed(speed: number): void (Android only, API 23+) | Change the playback speed of the media player. |
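
getAudioTrackDuration resolves in seconds on iOS and milliseconds on Android (see the note in the TypeScript example above), while seekTo takes seconds, so a platform check is needed when seeking relative to the track length. A small hedged sketch; the seekToFraction helper is illustrative:

import { TNSPlayer } from 'nativescript-audio';
import { isIOS } from '@nativescript/core';

// Example: seekToFraction(player, 0.5) jumps to the middle of the track.
export async function seekToFraction(player: TNSPlayer, fraction: number) {
  const duration = await player.getAudioTrackDuration();
  // iOS reports seconds, Android reports milliseconds; seekTo expects seconds.
  const durationSeconds = isIOS ? +duration : +duration / 1000;
  await player.seekTo(durationSeconds * fraction);
}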

TNSPlayer Instance Properties

| Property | Description |
| --- | --- |
| ios | Get the native iOS AVAudioPlayer instance. |
| android | Get the native Android MediaPlayer instance. |
| debug: boolean | Set to true to enable debugging console logs (default false). |
| currentTime: number | Get the current time within the media file's duration. |
| volume: number | Get/set the player volume. Valid range: 0 to 1. |

License

MIT

Demo App

  • fork/clone the repository
  • cd into the src directory
  • execute npm run demo.android or npm run demo.ios (the scripts are located in the scripts section of the package.json in the src directory if you are curious)

nativescript-audio's People

Contributors

agisboye, alexgritton, andreasotto, bradmartin, cmckni3, codef0rmerz, danieldspx, davecoffin, dicksmith, dylanryan, eddyverbruggen, edusperoni, entelostre, govi2010, gtnunes1956, gurvancampion, jibon57, jlooper, junying1, kabirules, markoimake, mehyaa, mfik, nathanwalker, redpandatronicsuk, shiv19, sis0k0, stevengbrown, triniwiz, vallemar


nativescript-audio's Issues

Issue with published package > 3.5.0

Note to all, until this is addressed.

The package contents in versions greater than 3.5.0 are currently missing.

Until you see this issue closed, install 3.5.0 directly:

npm install nativescript-audio@3.5.0 --save

To do list:

  • Update dependencies
  • Solve tsc fail error

How to record

Do you have a simple example of how to start and stop a recording? The docs are not clear. Your one example is for Android. What about iOS? demo/app/main-view-model.ts

Thanks

no non-static method "Landroid/media/MediaRecorder;.pause()V"

ERROR Error: Uncaught (in promise): Error: java.lang.NoSuchMethodError: no non-static method "Landroid/media/MediaRecorder;.pause()V"

Any idea what might be the problem? Is it because that method was introduced in API level 24?

When I call this.recorder.start or this.recorder.stop it works, but this.recorder.pause throws that error.

resume method after pause audio from url

Thanks for your nice plugin. I'm playing a stream from a URL and everything is fine, but when I tap pause and then play, it takes 3 or 4 seconds to resume the stream.

Is there any chance you could add a resume method for URL playback?

Thanks!

delay between loops

Hi,

I'm playing a 5-second mp3 file from a local repo on the player.
It is supposed to loop seamlessly; however, when playing it on the player, there is a gap
(a moment between loops where no sound is playing).

How can I avoid this?

Thanks a lot!

Vincent

Android playFromFile only works if the file name starts with ~/

The code to play looks like this:

        let audioPath;

        let fileName = isString(options.audioFile) ? options.audioFile.trim() : "";
        if (fileName.indexOf("~/") === 0) {
          fileName = fs.path.join(fs.knownFolders.currentApp().path, fileName.replace("~/", ""));
          console.log('fileName: ' + fileName);
          audioPath = fileName;
        }

If the file name doesn't start with ~/, audioPath is never set.

Pull request forthcoming to set audioPath to the fileName if there isn't a ~/ in it.
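
A sketch of the fallback described above (not the merged patch); it simply uses the given file name when it is not app-relative:

        let audioPath;

        let fileName = isString(options.audioFile) ? options.audioFile.trim() : "";
        if (fileName.indexOf("~/") === 0) {
          fileName = fs.path.join(fs.knownFolders.currentApp().path, fileName.replace("~/", ""));
        }
        // Fall back to the file name as given (absolute path, etc.) when it is not app-relative.
        audioPath = fileName;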

tns build android is failing - :asbg:generateBinding

When I build for android, I get the following error:

Running full build
finished with reading lines with js files
Warning: there already is an extend called com.tns.NativeScriptActivity.
Warning: there already is an extend called com.tns.FragmentClass.
:asbg:generateBindings

Exception in thread "main" java.io.IOException: File already exists. This may lead to undesired behavior.
Please change the name of one of the extended classes.
<project_folder>/platforms/android/src/main/java/com/tns/gen/android/content/BroadcastReceiver_frnal_ts_helpers_l58_c38__BroadcastReceiver.java
        at org.nativescript.staticbindinggenerator.Generator.writeBindings(Generator.java:60)
        at org.nativescript.staticbindinggenerator.Main.main(Main.java:15)
:asbg:generateBindings FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<project_folder>/platforms/android/build-tools/android-static-binding-generator/build.gradle' line: 251

* What went wrong:
Execution failed for task ':asbg:generateBindings'.
> Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1

I've done some googling but not getting anywhere.

I then grepped my project for those two classes and found them twice: once in node_modules/tns-core-modules and again in nativescript-audio's own node_modules/tns-core-modules dependency.

How is the Android compiler supposed to reconcile this? I didn't have a problem building for iOS.

Also, using v2.5.4. Any thoughts are appreciated. Thanks.

translate to Javascript

Hi,
I need this code in JavaScript:

import { TNSPlayer } from 'nativescript-audio';
 
export class YourClass {
    private _player: TNSPlayer;
    
    constructor() {
        this._player = new TNSPlayer();
        this._player.initFromFile({
            audioFile: '~/audio/song.mp3', // ~ = app directory
            loop: false,
            completeCallback: this._trackComplete.bind(this),
            errorCallback: this._trackError.bind(this)
        }).then(() => {
 
            this._player.getAudioTrackDuration().then((duration) => {
                // iOS: duration is in seconds
                // Android: duration is in milliseconds
                console.log(`song duration:`, duration);
            });
        });
    }
 
    public togglePlay() {
        if (this._player.isAudioPlaying()) {
            this._player.pause();
        } else {
            this._player.play();
        }
    }
 
    private _trackComplete(args: any) {
        console.log('reference back to player:', args.player);
 
        // iOS only: flag indicating if completed succesfully
        console.log('whether song play completed successfully:', args.flag);
    }
 
    private _trackError(args: any) {
        console.log('reference back to player:', args.player);
        console.log('the error:', args.error);
 
        // Android only: extra detail on error
        console.log('extra info on the error:', args.extra);
    }
}

Segmentation fault after DISPOSE

Hello!
I get "Service exited due to Segmentation fault: 11"
after playing audio files (mp3, m4a). Does anyone know how to resolve this or debug it?

Exception when starting app in android device

Following this tutorial, I got the exception below when I open the app on a real Android device:

An uncaught Exception occurred on "main" thread. java.lang.RuntimeException: Unable to start activity ComponentInfo{org.nativescript.AudioRecorder/com.tns.NativeScriptActivity}: com.tns.NativeScriptException: Failed to find module: "nativescript-audio", relative to: app/tns_modules/ at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2184) at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2233) at android.app.ActivityThread.access$800(ActivityThread.java:135) at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196) at android.os.Handler.dispatchMessage(Handler.java:102) at android.os.Looper.loop(Looper.java:136) at android.app.ActivityThread.main(ActivityThread.java:5001) at java.lang.reflect.Method.invokeNative(Native Method) at java.lang.reflect.Method.invoke(Method.java:515) at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601) at dalvik.system.Na

It didn't even ask for the RECORD_AUDIO permission.

Request: handle iOS permissions for recording

First of all 👍 for this awesome plugin. Thanks for creating this.

I found that on iOS, if the user denies permission to record audio with the microphone, the promise returned by the start method is never resolved or rejected.

It would be great to have some indication of whether the user has granted permission, so we can take further action.

I quickly looked into the code. I think rejecting the promise when permission is denied would satisfy this requirement, as follows:

src/ios/recorder.ts

public start(options: AudioRecorderOptions): Promise<any> {
    return new Promise((resolve, reject) => {
       this._recordingSession.requestRecordPermission((allowed: boolean) => {
          if (allowed) {
                 ...
                resolve();
                ...
          } else {
              reject('Permission Denied');
          }
    });
 });
}

If there is already some way to be notified that the user has denied permission, please let me know. It would be very helpful.

Thanks.

Problem with playing remote file

Hi,

Thanks for the plugin. I am trying to play a remote file like this:

var audioPlayer = new TNSPlayer();
    audioPlayer.playFromUrl({
      audioFile: file,
      loop: false,
      completeCallback: function() {
        console.log("Completed !!")
      },
      errorCallback: function(err) {

      }
    })

But I am getting the following error on iOS:

Jul 21 18:18:21 --- last message repeated 8 times ---
CONSOLE LOG file:///app/components/coursemodules/jibon/jibon.js:35:20: 7
CONSOLE ERROR file:///app/tns_modules/nativescript-angular/zone-js/dist/zone-nativescript.js:571:22: Error: Uncaught (in promise): undefined
1   0x10824415b NativeScript::FFICallback<NativeScript::ObjCBlockCallback>::ffiClosureCallback(ffi_cif*, void*, void**, void*)
2   0x10893211e ffi_closure_unix64_inner
3   0x108932a52 ffi_closure_unix64
4   0x10c291095 __49-[__NSCFLocalSessionTask _task_onqueue_didFinish]_block_invoke
5   0x108f96237 __NSBLOCKOPERATION_IS_CALLING_OUT_TO_A_BLOCK__
6   0x108f95f3b -[NSBlockOperation main]
7   0x108f946f7 -[__NSOperationInternal _start:]
8   0x108f9047c __NSOQSchedule_f
9   0x10efa1792 _dispatch_client_callout
10  0x10ef87237 _dispatch_queue_serial_drain
11  0x10ef8798f _dispatch_queue_invoke
12  0x10ef89899 _dispatch_root_queue_drain
13  0x10ef8950d _dispatch_worker_thread3
14  0x10f32d5a2 _pthread_wqthread
15  0x10f32d07d start_wqthread
file:///app/tns_modules/nativescript-audio/src/ios/player.js:83:34: JS ERROR TypeError: null is not an object (evaluating '_this._player.delegate = _this')
Jul 21 18:18:22 Jibons-Pro SpringBoard[5014]: [KeyboardArbiter] HW kbd: Failed to set (null) as keyboard focus
Jul 21 18:18:22 Jibons-Pro backboardd[5015]: [Common] Unable to get short BSD proc info for 9046: No such process
Jul 21 18:18:22 Jibons-Pro backboardd[5015]: [Common] Unable to get proc info for 9046: Undefined error: 0

Any idea where I am going wrong?

Thanks

Can't run demo on emulator and device

I am trying to run the demo project on an emulator and a device, and it gives me the following error:

nativescriptaudiomaster[10539]: undefined: JS ERROR Error: Could not find module './'. Computed path '/Users/username/Library/Developer/CoreSimulator/Devices/9787AD04-4D86-4193-A304-D49D5B45CC17/data/Containers/Bundle/Application/41F49A6B-B7AE-49A4-9615-04603C1E1E51/nativescriptaudiomaster.app/app'.
com.apple.CoreSimulator.SimDevice.9787AD04-4D86-4193-A304-D49D5B45CC17.launchd_sim[10279] (UIKitApplication:username[0x8487][10539]): Service exited due to Segmentation fault: 11

I haven't changed anything; the untouched project doesn't work on a device or emulator. Any suggestions? Does it work for others? I am on Xcode 8.1.

I tried this because when I use the library in my project, no sound is played and no errors are given. Has anyone else experienced this? This is the code that I use:

this.player = new TNSPlayer();
this.playAudio('~/faded.mp3', 'localFile');
/* this.playAudio('https://dl.dropboxusercontent.com/u/50681696/content/Alan%20Walker%20-%20Faded.mp3', 'remoteFile'); */

public playAudio(filepath: string, fileType: string) {
        try {
            var playerOptions: AudioPlayerOptions = {
                audioFile: filepath,
                loop: true,
                completeCallback: () => {
                    this.player.dispose().then(() => {
                        console.log('DISPOSED');
                    }, (err) => {
                        console.log('ERROR disposePlayer: ' + err);
                    });
                },

                errorCallback: (err) => {
                    console.log(err);
                },

                infoCallback: (info) => {
                    console.log("what: " + info);
                }
            };

            if (fileType === 'localFile') {
                this.player.playFromFile(playerOptions).then(() => {
                }, (err) => {
                    console.log(err);
                });
            } else if (fileType === 'remoteFile') {
                this.player.playFromUrl(playerOptions).then(() => {
                }, (err) => {
                    console.log(err);
                });
            }
        } catch (ex) {
            console.log(ex);
        }
}

Edit: I noticed nativescript-audio wasn't defined in package.json.

Build android app fail

HI,

I built an Android application with 3.3.0 and everything worked fine. When I upgraded to 3.4.2, I got the following error.

:asbg:generateBindings
Exception in thread "main" java.io.IOException: File already exists. This may lead to undesired behavior.
Please change the name of one of the extended classes.
/Users/gary/Documents/ma288/ma288-radio/platforms/android/src/main/java/com/tns/gen/android/content/BroadcastReceiver_frnal_ts_helpers_l58_c38__BroadcastReceiver.java
        at org.nativescript.staticbindinggenerator.Generator.writeBindings(Generator.java:60)
        at org.nativescript.staticbindinggenerator.Main.main(Main.java:15)
:asbg:generateBindings FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '/Users/gary/Documents/ma288/ma288-radio/platforms/android/build-tools/android-static-binding-generator/build.gradle' line: 251

* What went wrong:
Execution failed for task ':asbg:generateBindings'.
> Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 12.036 secs
Command /Users/gary/Documents/ma288/ma288-radio/platforms/android/gradlew failed with exit code 1

v3.4.2 fail on Android

Hi,

It's working on iOS, but the app crashes on Android with v3.4.2.

Error:

JS: TypeError: Cannot read property 'release' of undefined
JS:     at file:///data/data/com.ugroupmedia.pnp.client.mobile/files/app/tns_modules/nativescript-audio/src/android/player.js:203:29
JS:     at new ZoneAwarePromise (file:///data/data/com.ugroupmedia.pnp.client.mobile/files/app/tns_modules/nativescript-angular/zone-js/dist/zone-nativescript.js:776:29)
JS:     at TNSPlayer.dispose (file:///data/data/com.ugroupmedia.pnp.client.mobile/files/app/tns_modules/nativescript-audio/src/android/player.js:201:16)

It's the dispose() method that seems to fail on Android.
Here is my code:

this._player.dispose().then(() => {
               this._player.initFromUrl({
                   audioFile: callUrl, 
                   loop: false,
                   completeCallback: this._trackComplete.bind(this),
                   errorCallback: this._trackError.bind(this),
               }).then(() => {
                   this._player.getAudioTrackDuration().then((duration) => {
                       this._player.play();
                   });
               });
           });

Run demo application

I have a problem running the demo application. I cloned the whole repository, went to the demo directory, and finally got this error:

Unable to apply changes on device: DEVICE_ID. Error is: Processing node_modules failed. Error: cp: cannot create directory '/Users/ps/workspace/tutorial/nativescript/nativescript-audio/demo/platforms/ios/demo/app/tns_modules': No such file or directory.

Below is the whole console log. What am I doing wrong?

➜  demo git:(master) ✗ tns run ios
Searching for devices...
WARN registry Unexpected warning for https://registry.npmjs.org/: Miscellaneous Warning EINTEGRITY: sha1-UWbihkV/AzBgZL5Ul+jbsMPTIIM= integrity checksum failed when using sha1: wanted sha1-UWbihkV/AzBgZL5Ul+jbsMPTIIM= but got sha512-yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==. (11423 bytes)
WARN registry Using stale package data from https://registry.npmjs.org/ due to a request error during revalidation.

> [email protected] postinstall /Users/ps/workspace/tutorial/nativescript/nativescript-audio/demo/node_modules/nativescript-dev-typescript
> node postinstall.js

Adding 'es6' lib to tsconfig.json...
Adding 'dom' lib to tsconfig.json...
Adding tns-core-modules path mappings lib to tsconfig.json...
Project already targets TypeScript ^2.3.0
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN demo No description
npm WARN demo No repository field.
npm WARN demo No license field.

added 46 packages in 5.281s
Copying template files...
Installing tns-ios
+ [email protected]
added 1 package in 2.159s
Project successfully created.
Executing before-prepare hook from /Users/ps/workspace/tutorial/nativescript/nativescript-audio/demo/hooks/before-prepare/nativescript-dev-typescript.js
Found peer TypeScript 2.5.2
Preparing project...
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Cycle link found.
Unable to apply changes on device: DEVICE_ID. Error is: Processing node_modules failed. Error: cp: cannot create directory '/Users/ps/workspace/tutorial/nativescript/nativescript-audio/demo/platforms/ios/demo/app/tns_modules': No such file or directory.
Executing after-watch hook from /Users/ps/workspace/tutorial/nativescript/nativescript-audio/demo/hooks/after-watch/nativescript-dev-typescript.js

Nativescript-audio and -videoplayer can't be controlled alongside each other

Hi

First of all, thanks for some great plugins, guys! They have made progress on my latest project much faster and easier.
But I'm now in a situation where I need to run separate audio tracks alongside a video being played.
Assuming I have set it up correctly (more on that further down the post), this is how I try to handle pause() and play() for both audio and video.

cycleVideoPlayer.on(gestures.GestureTypes.swipe, function(eventdata) {
    console.log("Swipe detected, with direction: " + eventdata.direction);
    if (!paused) {

        vm.audioPlayer.pause()
        .then((result) => {

          paused = true;
          const id = timer.setTimer(() => {
            vm.cycleVideoPlayerViewModel.pause(); //This doesnt work alongside audiopause.. why?
            console.log("cycleVideo should be paused");
          },500);

        }, (err) => {

          console.log("pause sound and video failed with error:\
                      " + err);
        });
        console.log("Paused");
      
    } else {
      
        vm.audioPlayer.resume()
        .then((result) => {

          paused = false;
          const id = timer.setTimer(() => {
            vm.cycleVideoPlayerViewModel.play();
            console.log("cycleVideo should be playing");
          }, 500);

        }, (err) => {

          console.log("resume sound and video failed with error:\
          " + err);
        });
        console.log("Resumed");          
    }
  });

I've made a post on stackoverflow which describes the situation in more detail. I still haven't solved the issue and hope you guys can help another dev out :)
Link to stackoverflow: https://stackoverflow.com/questions/44927740/cant-make-nativescript-videoplayer-and-nativescript-audioplayer-work-alongside

Hope to hear from you

// Mikkel

How can I create a countdown/progress bar?

I want to display a countdown indicator, so I need to know the current position, such as the number of seconds already played. I checked the docs and there doesn't appear to be any progress callback.
So how can I get the elapsed playback time?

Thanks
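
For reference, the player exposes the elapsed position through the currentTime instance property and the total length through getAudioTrackDuration (both listed in the Player API above), so a countdown can be polled. A hedged sketch; the one-second interval is arbitrary, and the assumption that currentTime uses the same platform units as the duration (seconds on iOS, milliseconds on Android) should be verified on your targets:

import { TNSPlayer } from 'nativescript-audio';
import { isIOS } from '@nativescript/core';

export function startCountdown(player: TNSPlayer, onTick: (secondsLeft: number) => void) {
  player.getAudioTrackDuration().then(duration => {
    // Duration is reported in seconds on iOS and milliseconds on Android.
    const totalSeconds = isIOS ? +duration : +duration / 1000;
    const timer = setInterval(() => {
      // Assumes currentTime follows the same platform units as the duration.
      const elapsed = isIOS ? player.currentTime : player.currentTime / 1000;
      onTick(Math.max(0, Math.round(totalSeconds - elapsed)));
      if (!player.isAudioPlaying()) {
        clearInterval(timer);
      }
    }, 1000);
  });
}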

How to get a recorded audio file as Base64

Hi,

Great plugin! I need to get the recorded audio file and encode it as Base64 for posting to the web (or to a chat); how can I accomplish this?

Thanks very much in advance

URL with Referer

Hi,

The URL that I'm calling needs to be requested with a Referer header.
Any idea how I can do this with this plugin?

Thanks !

Set Volume on a player

Hi guys,

How could I set the volume on a player?

I am using 6 players in my app, which should be able to play at different volumes.
I couldn't find a method that sets the volume (like on EZPlayer).

How could I add this to my app?

Thanks!
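
For reference, each TNSPlayer instance has its own volume property (0 to 1, see the TNSPlayer instance properties above), so multiple players can be set independently. A minimal sketch:

import { TNSPlayer } from 'nativescript-audio';

const backgroundMusic = new TNSPlayer();
const effects = new TNSPlayer();

// Each instance keeps its own volume in the 0..1 range.
backgroundMusic.volume = 0.3;
effects.volume = 1.0;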

Audio randomly stops playing on android device

I made an app that makes a sound whenever a task is completed. Sometimes the audio doesn't play, but most of the time it works just fine.

Here's the error status: {"mp":{},"what":100,"extra":0}

Does anyone know what is happening?

OSStatus error 2003334207

Hey,

I had an issue playing sound on iOS, so I tried running your demo and found the same errors:

CONSOLE LOG file:///app/main-view-model.js:132:24: NSErrorWrapper: The operation couldn’t be completed. (OSStatus error 2003334207.).

Btw, this bit of the demo needs to be updated: from fs.knownFolders.currentApp().getFolder("audio") to fs.knownFolders.documents().getFolder("audio").

I'm unable to play sounds, either remote or local...

How to play audio in Background Mode?

What is the correct way of playing audible content in the background?
I'm playing an mp3 in a loop, but as soon as the phone's screen locks, the sound stops.

On Ionic I used the "BackgroundMode" plugin, which works perfectly: the sound keeps playing.

How can that be achieved with NativeScript audio?

Android Crashes

2 semi-related issues.

First, on Android, playFromFile requires all the callbacks to be present because it unconditionally sets them (unlike playFromUrl, which only sets them if they are present). These should be optional (since the docs imply they are).

Second, the onErrorListener and onInfoListener are supposed to take functions that return boolean (see http://developer.android.com/reference/android/media/MediaPlayer.OnErrorListener.html and http://developer.android.com/reference/android/media/MediaPlayer.OnInfoListener.html ), and it crashes trying to get the boolean value of the return value. This applies to both playFromFile and playFromUrl.

iOS is not working

I have this very simple implementation. In Android it works fine. In iOS you will hear no audio, but the track will "play" for the correct amount of time. It seems like the track is playing, but no audio is being played.

I am also having the exact same issue with the DEMO application. Android works fine and iOS does not.

Here is my simple implementation.

Component

import {Component} from "@angular/core";
import {TNSPlayer} from "nativescript-audio";

@Component({
    selector: "ns-items",
    moduleId: module.id,
    templateUrl: "./items.component.html",
})
export class ItemsComponent {

    private _player: TNSPlayer;

    constructor() {
        this._player = new TNSPlayer();
        this._player.initFromUrl({
            audioFile: "http://www.noiseaddicts.com/samples_1w72b820/2514.mp3",
            loop: false,
            completeCallback: this.audioComplete,
            errorCallback: this.audioError
        });
    }

    public togglePlay(): void {
        (this._player.isAudioPlaying()) ? this._player.pause() : this._player.play();
    }

    private audioComplete(): void {
        console.log("audio complete success");
    }

    private audioError(): void {
        console.log("audio error");
    }
}

Template

<GridLayout class="p-30">
    <Button class="btn btn-primary btn-active" id="button" text="Play" (tap)="togglePlay()"></Button>
</GridLayout>

I've used this plugin to record and play audio. I built the Android version successfully, but it does not work on iOS. I attach my code.

import { Component } from "@angular/core";
import { Observable } from 'data/observable';
import { Router } from "@angular/router";
import { RouterExtensions, PageRoute } from "nativescript-angular/router";
import { TNSFontIconService } from 'nativescript-ng2-fonticon';
import {Page} from "ui/page";

import * as platform from 'platform';
import { knownFolders, File } from "file-system";
import { TNSRecorder, TNSPlayer, AudioPlayerOptions, AudioRecorderOptions } from 'nativescript-audio';
import { isAndroid, isIOS, device, screen } from "platform";

import { GlobalSettings } from '../../globals/globals';
import * as Toast from 'nativescript-toast';

declare var android;

@Component({
selector: "audiorecord",
templateUrl: "template/record_audio/audiorecord.html",
styleUrls: ["template/record_audio/audiorecord.css"]
})
export class AudioRecordComponent extends Observable {
public recordState: number = 0;
// 0 - waiting
// 1 - recording
// 2 - record-stop
// 3 - playing
// 4 - play-stop
public recordedAudioFile: string;
private recorder;
private player;
private audioSessionId;
private page;
private meterInterval: any;
private recordedName: string;

private timerId: number;
private timeVal: number = 0;
timerActivated = false;
private durationTime: string = "00:00"
// Your TypeScript logic goes here
constructor(private router: Router, page: Page, private routerExtensions: RouterExtensions) {
super();
page.actionBarHidden = true;

this.player = new TNSPlayer();
this.recorder = new TNSRecorder();
this.player.volume = 1.0;

this.recordedName = GlobalSettings.question_id + this.platformExtension();

}

platformExtension() {
var ex = "";

if (isAndroid) {
  ex = ".mp3";
} else if (isIOS) {
  ex = ".caf";
}

return ex;

}

message (msg: string) {
Toast.makeText(msg).show();
}

public backToItem () {
this.routerExtensions.back();
}

private initMeter() {
this.resetMeter();
this.meterInterval = setInterval(() => {
console.log(this.recorder.getMeters());
}, 500);
}

private resetMeter() {
if (this.meterInterval) {
clearInterval(this.meterInterval);
this.meterInterval = undefined;
}
}

public setTime () {
var minutes = Math.floor(this.timeVal / 60);
var seconds = this.timeVal % 60;

function str_pad_left(string,pad,length) {
  return (new Array(length+1).join(pad)+string).slice(-length);
}
var finalTime = str_pad_left(minutes,'0',2)+':'+str_pad_left(seconds,'0',2);
this.durationTime = finalTime;
console.log(this.durationTime);

}

private startTimer () {
this.timeVal = 0;
this.timerActivated = true;
countTime();

var that = this;

function countTime () {
  setTimeout (() => {
    if (that.timerActivated) {
      that.timeVal++;
      that.setTime();
      countTime();
    }
  }, 1000);
}

}

private stopTimer () {
this.timerActivated = false;
}

public recordStart (args: any) {
if (TNSRecorder.CAN_RECORD()) {
console.log("This device can record audio");

  var audioFolder = knownFolders.currentApp().getFolder("audio");
  console.log(JSON.stringify(audioFolder));


  let recordingPath = audioFolder.path + "/" + this.recordedName;
  let recorderOptions: AudioRecorderOptions = {

    filename: recordingPath,
    metering: true,

    infoCallback: (infoObject) => {
      console.log("File information: " + JSON.stringify(infoObject));
    },

    errorCallback: (errorObject) => {
      console.log("Record Error: " + JSON.stringify(errorObject));
    }
  };

  this.recorder.start(recorderOptions).then((result) => {
    this.startTimer();
    this.recordState = 1;
    if (recorderOptions.metering) {
      this.initMeter();
    }
  }, (err) => {
    this.recordState = 0;
    this.resetMeter();
    console.log(err);
  });

} else {
  alert("This device cannot record audio");
}

}

public recordStop(args: any) {
this.resetMeter();
this.recorder.stop().then(() => {
this.stopTimer();
this.resetMeter();
var audioFolder = knownFolders.currentApp().getFolder("audio");
GlobalSettings.audio_file = audioFolder.path + "/" + this.recordedName;
this.recordState = 0;
}, (ex) => {
console.log("**************");
console.log(ex);
this.recordState = 0;
this.stopTimer();
this.resetMeter();
});
}

public isRecorded () {
var retVal = false;
if (GlobalSettings.audio_file != "") {
retVal = true;
}
return retVal;
}

public playRecordedFile(args) {
if (this.isRecorded()) {
var audioFolder = knownFolders.currentApp().getFolder("audio");
var recordedFile = audioFolder.getFile(this.recordedName);

  try {
    console.log('recording exists: ' + File.exists(recordedFile.path));
    console.log(File.fromPath(recordedFile.path));
    this.recordedAudioFile = recordedFile.path;
  } catch (ex) {
    console.log(ex);
  }

  var playerOptions: AudioPlayerOptions = {
    audioFile: recordedFile.path,
    loop: false,
    completeCallback: () => {
      this.player.dispose().then(() => {
        console.log('DISPOSED');
        this.stopTimer();
        this.recordState = 0;
      }, (err) => {
        console.log(err);
      });
    },

    errorCallback: (errorObject) => {
      console.log("Play Error: " + JSON.stringify(errorObject));
    },

    infoCallback: (infoObject) => {
      console.log("File information: " + JSON.stringify(infoObject));
    }
  };

  this.player.playFromFile(playerOptions).then(() => {
    this.startTimer();
    console.log("playing");
  }, (err) => {
    console.log("playing error");
  });
} else {
  this.message("No file has been recorded for this task. Please record your reply first.");
}

}
}

Getting this error on android

Error: java.lang.RuntimeException: setAudioSource failed.
JS: android.media.MediaRecorder.setAudioSource(Native Method)
JS: com.tns.Runtime.callJSMethodNative(Native Method)
JS: com.tns.Runtime.dispatchCallJSMethodNative(Runtime.java:1197)
JS: com.tns.Runtime.callJSMethodImpl(Runtime.java:1061)
JS: com.tns.Runtime.callJSMethod(Runtime.java:1047)
JS: com.tns.Runtime.callJSMethod(Runtime.java:1028)
JS: com.tns.Runtime.callJSMethod(Runtime.java:1018)
JS: com.tns.gen.android.view.View_OnClickListener.onClick(android.view.View$OnClickListener.java)
JS: android.view.View.performClick(View.java:5207)
JS: android.view.View$PerformClick.run(View.java:21177)
JS: android.os.Handler.handleCallback(Handler.java:739)
JS: android.os.Handler.dispatchMessage(Handler.java:95)
JS: android.os.Looper.loop(Looper.java:148)
JS: android.app.ActivityThread.main(ActivityThread.java:5441)
JS: java.lang.reflect.Method.invoke(Native Method)
JS: com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:738)
JS: com.android.internal.os.ZygoteInit.main(ZygoteInit.java:628)

Can't play my recorded file in a web browser

Hi, I finally succeeded in recording an mp3 file with the plugin on Android.

I used output format 2 (which I think should be MPEG_4) and audio encoder 1.
I need to post this recorded file to the web, so I encode it as Base64 and transfer it, prepending data:audio/mp3;base64, to the encoded data.

But that file, which plays correctly on the phone that recorded it, does not play on the web (only in Safari, not in Chrome, Mozilla, etc.).

I previously did the same thing in another Cordova app of mine, and using the media-capture plugin I could correctly listen to the phone-recorded files on the web (where I put them inside an HTML5 audio tag).

Can someone please help me ?

Thanks in advance

Processing node_modules failed. Error: cp: cannot create directory 'D:/xgm/demos/nativescript/nativescript-audio-master/demo/platforms/android/src/main/assets/app/tns_modules': No such file or directory

D:\xgm\demos\nativescript\nativescript-audio-master\demo>tns build android --compileSdk 25
Executing before-prepare hook from D:\xgm\demos\nativescript\nativescript-audio-master\demo\hooks\before-prepare\nativescript-dev-typescript.js
Found peer TypeScript 2.5.3
Preparing project...
Processing node_modules failed. Error: cp: cannot create directory 'D:/xgm/demos/nativescript/nativescript-audio-master/demo/platforms/android/src/main/assets/app/tns_modules': No such file or directory
Sending exception report (press Ctrl+C to stop)....

In D:\xgm\demos\nativescript\nativescript-audio-master\demo:
1. tns install
2. tns build android --compileSdk 25
The error happens.

I use the demo, unchanged.

tns info:
All NativeScript components versions information
┌──────────────────┬─────────────────┬────────────────┬──────────────────┐
│ Component        │ Current version │ Latest version │ Information      │
│ nativescript     │ 3.2.0           │ 3.2.1          │ Update available │
│ tns-core-modules │ 3.2.0           │ 3.2.0          │ Up to date       │
│ tns-android      │ 3.1.1           │ 3.2.0          │ Update available │
│ tns-ios          │                 │ 3.2.0          │ Not installed    │
└──────────────────┴─────────────────┴────────────────┴──────────────────┘

Eliminate the "dependency" on tns-core-modules

Some versions of NPM will install dependencies in the plugin's own node_modules. When you end up with TWO copies of tns-core-modules in your project, the project can crash on Android with a really weird error message about having to use a frame to navigate.

If you need to pin it to a version of the runtime, use "peerDependencies".

Exception happened when running on device

Hi

This is a great plugin and it works great on emulators. But when I tried to deploy (the demo or a test app I created) and play (both remote and local tried) on a Samsung device (N-7100, Android 4.3), there's an exception:

com.tns.NativeScriptException:
Calling js method onInfo failed

TypeError: Cannot read property 'msg' of undefined
File: "/data/data/org.nativescript.audio/files/app/main-view-model.js, line: 119, column: 50

StackTrace:
Frame: function:'playerOptions.infoCallback', file:'/data/data/org.nativescript.audio/files/app/main-view-model.js', line: 119, column: 51
Frame: function:'_this.player.setOnInfoListener.MediaPlayer.OnInfoListener.onInfo', file:'/data/data/org.nativescript.audio/files/app/tns_modules/nativescript-audio/src/android/player.js', line: 45, column: 37

at com.tns.Runtime.callJSMethodNative(Native Method)
at com.tns.Runtime.dispatchCallJSMethodNative(Runtime.java:861)
at com.tns.Runtime.callJSMethodImpl(Runtime.java:726)
at com.tns.Runtime.callJSMethod(Runtime.java:712)
at com.tns.Runtime.callJSMethod(Runtime.java:693)
at com.tns.Runtime.callJSMethod(Runtime.java:683)
at com.tns.gen.android.media.MediaPlayer_OnInfoListener_ftns_modules_nativescript-audio_src_android_player_l43_c52__.onInfo(android.media.MediaPlayer$OnInfoListener.java)
at android.media.MediaPlayer$EventHandler.handleMessage(MediaPlayer.java:2521)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:5493)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1209)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1025)
at dalvik.system.NativeStart.main(Native Method)

Please kindly take a look and tell me if there's any work around...
Thank you very much!!

Loop on Android does not seem to work

Hi,
when using the loop option in the TNSPlayer, the loop works on iOS but not on the Android simulator.

Now I'm wondering whether the error is in my code or in the plugin.
There are no errors reported anywhere, and I get these console logs:
"Launching play"
"audio file complete"
DISPOSED

import { Component } from "@angular/core";
import { TNSPlayer} from 'nativescript-audio';
import * as dialogs from 'ui/dialogs';



@Component({
    selector: "my-app",
    templateUrl: "app.component.html",
})


export class AppComponent {
  public counter: number = 16;
  public player; 
  public isPlaying: boolean= false;
  public started:boolean=false
  constructor() {
    this.player=new TNSPlayer();
  } 
  public playerOptions = {
        audioFile: '~/audio/TestSound.wav',
        loop: true,
        completeCallback: () => {
            console.log("Audio file complete");
            this.isPlaying=false
            this.started=false
            this.player.dispose().then(() => {
            console.log('DISPOSED');
          }, (err) => {
            console.log('ERROR disposePlayer: ' + err);
          });
        },
        errorCallback: (errorObject) => {
          console.log('Error occurred during playback.');
          console.log(JSON.stringify(errorObject));
        },

        infoCallback: (info) => {
          console.log(JSON.stringify(info));

          dialogs.alert('Info callback: ' + info.msg);
          console.log(JSON.stringify(info));
        }
     }


    public get message(): string {
        if (this.counter > 0) {
            return this.counter + " taps left";
        } else {
            return "Hoorraaay! \nYou are ready to start building!";
        }
    }
    
    public onTap() {
        this.counter--;
    }
    public play() {
        if (this.isPlaying==false){      
            if(this.started==false){
                console.log("launching play");
                console.log(JSON.stringify(this.player));  
                this.player.playFromFile(this.playerOptions).then(() => {
                this.isPlaying= true;
                this.started=true;
                }, (err) => {
                console.log("THERE IS AN ERROR BUT I DON'T KNOW WHAT IT MEANS");
                console.log(this.isPlaying);
                console.log(err);
                this.isPlaying= false;
                })
            } else{
                this.player.resume();
                this.isPlaying=true;
            }
        } else {
                this.player.pause();
                this.isPlaying=false;
        }
}
}

App crashes with Segmentation fault: 11

Hi @bradmartin,

I am experiencing a crash on iOS (simulator and device) whilst using your plugin. I think my code is essentially the same as your demo.

var audioModule = require('nativescript-audio');

var player = new audioModule.TNSPlayer();
var audioSessionId;

exports.tapPlaySound = function(args) {
    playAudio('~/sound/alert_14_ascending.mp3');
}

function playAudio(filepath) {

    try {
        var playerOptions = {
            audioFile: filepath,

            completeCallback: () => {
                console.log('audio player completeCallback');

                player.dispose().then(() => {
                    console.log('DISPOSED');
                }, (err) => {
                    console.log('ERROR disposePlayer: ' + err);
                });
            },

            errorCallback: (errorObject) => {
                console.log('audio player errorCallback');
                console.log(JSON.stringify(errorObject));
            },

            infoCallback: (args) => {
                console.log('audio player infoCallback');
                console.log(JSON.stringify(args));
            }
        };

        player.playFromFile(playerOptions).then(() => {
        }, (err) => {
            console.log('audio player playFromFile error');
            console.log(err);
        });

    } catch (ex) {
        console.log(ex);
    }

}

Console output:

CONSOLE LOG file:///app/main-page.js:17:28: audio player completeCallback
CONSOLE LOG file:///app/main-page.js:20:32: DISPOSED
[aqme] 254: AQDefaultDevice (173): skipping input stream 0 0 0x0
[aqme] 254: AQDefaultDevice (173): skipping input stream 0 0 0x0
[aqme] 254: AQDefaultDevice (173): skipping input stream 0 0 0x0
[aqme] 254: AQDefaultDevice (173): skipping input stream 0 0 0x0
May 12 10:36:00 Dans-iMac com.apple.CoreSimulator.SimDevice.78DEE541-DC78-4569-990C-37D62E8F4C7B.launchd_sim[1285] (UIKitApplication:org.nativescript.templateblank[0x9a91][1308][8642]): Service exited due to Segmentation fault: 11

Here is a sample project that consistently reproduces the problem for me: https://github.com/3rror404/nativescript-audio-test

┌──────────────────┬─────────────────┬────────────────┬──────────────────┐
│ Component        │ Current version │ Latest version │ Information      │
│ nativescript     │ 2.5.5           │ 3.0.1          │ Update available │
│ tns-core-modules │ 2.5.2           │ 3.0.0          │ Update available │
│ tns-android      │                 │ 3.0.0          │ Not installed    │
│ tns-ios          │ 2.5.0           │ 3.0.0          │ Update available │
└──────────────────┴─────────────────┴────────────────┴──────────────────┘

┌────────────────────┬─────────┐
│ Plugin             │ Version │
│ nativescript-audio │ ^2.1.5  │
│ tns-core-modules   │ ^2.5.2  │
└────────────────────┴─────────┘

MacOS: 10.12.4
iOS: 10.3
Xcode: 8.3.2

Potential bug / issue: setAudioSource failed.

I installed the plugin and copied the demo-code.
Unfortunately I got an error every time I wanted to record audio:

Error: java.lang.RuntimeException: setAudioSource failed.
JS:     android.media.MediaRecorder.setAudioSource(Native Method)
JS:     com.tns.Runtime.callJSMethodNative(Native Method)
JS:     com.tns.Runtime.dispatchCallJSMethodNative(Runtime.java:1197)
JS:     com.tns.Runtime.callJSMethodImpl(Runtime.java:1061)
JS:     com.tns.Runtime.callJSMethod(Runtime.java:1047)
JS:     com.tns.Runtime.callJSMethod(Runtime.java:1028)
JS:     com.tns.Runtime.callJSMethod(Runtime.java:1018)
JS:     com.tns.gen.android.view.View_OnClickListener.onClick(android.view.View$OnClickListener.java)
JS:     android.view.View.performClick(View.java:5226)
JS:     android.view.View$PerformClick.run(View.java:21266)
JS:     android.os.Handler.handleCallback(Handler.java:739)
JS:     android.os.Handler.dispatchMessage(Handler.java:95)
JS:     android.os.Looper.loop(Looper.java:168)
JS:     android.app.ActivityThread.main(ActivityThread.java:5845)
JS:     java.lang.reflect.Method.invoke(Native Method)
JS:     com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:797)
JS:     com.android.internal.os.ZygoteInit.main(ZygoteInit.java:687)

No improvements after I added the permission to the AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" />

Then after changing the targetSdkVersion from "__APILEVEL__" to "22", it worked:

<uses-sdk
	android:minSdkVersion="17"
	android:targetSdkVersion="22"/>

If this isn't a bug, it could be helpful to mention it in the README?

However I'm glad, that it works now 👍

Feature Request: Add pause() method to recorder

It would be a useful enhancement for this plugin to have a 'pause()' feature available for the recorder, so as to allow pausing and resuming recording within the same configuration state. Incidentally, this is a feature available natively on both Android and iOS environments.

Get recorded file in main-page.js

I'm getting this error when trying to play recorded audio:

player error: Error: java.io.IOException: setDataSource failed

main-page.xml:

<StackLayout class="p-20">
        
    <ActivityIndicator color="#3489db" busy="{{ isRecording }}" />
    <Button text="Start Recording" tap="start" />
    <Button text="Stop Recording" tap="stop" />
    <Button text="Get Recorded File" tap="getFile" />
    <label text="{{ recordedAudioFile }}" color="#3489db" textWrap="true" />

   <Button text="Play" tap="playAudio" />

</StackLayout>

main-page.js:

var data = new observable.Observable({});
var recorder;
var audioName;

/* START RECORDING */
 
function start(args) {
    
      permissions.requestPermission(android.Manifest.permission.RECORD_AUDIO, "Let me hear your thoughts...")
    .then(function () {
    
      // you should check if the device has recording capabilities
      if (audio.TNSRecorder.CAN_RECORD()) {
    
        recorder = new audio.TNSRecorder();
    
        var audioFolder = fs.knownFolders.currentApp().getFolder("audio");
    
        var recorderOptions = {
    
          filename: audioFolder.path + '/recording.mp3',
          infoCallback: function () {
             console.log('infoCallback');
           },
          errorCallback: function () {
             console.log('errorCallback');
             alert('Error recording.');
           }
        };

        audioName = recorderOptions.filename;
    
       console.log('RECORDER OPTIONS: ' + recorderOptions);
    
       recorder.start(recorderOptions).then(function (res) {
          data.set('isRecording', true);
       }, function (err) {
           data.set('isRecording', false);
           console.log('ERROR: ' + err);
       });
    
      } else {
        alert('This device cannot record audio.');
      }
    
     })
      .catch(function () {
         console.log("Uh oh, no permissions - plan B time!");
      });
   }
   exports.start = start;
    
   /* STOP RECORDING */
    
   function stop(args) {
      if (recorder != undefined) {
        recorder.stop().then(function () {
        data.set('isRecording', false);
        alert('Audio Recorded Successfully.');
      }, function (err) {
        console.log(err);
        data.set('isRecording', false);
      });
     }
   }
   exports.stop = stop;
    
   function getFile(args) {
    try {
       var audioFolder = fs.knownFolders.currentApp().getFolder("audio");
       var recordedFile = audioFolder.getFile('recording.mp3');
       data.set("recordedAudioFile", recordedFile.path);
     } catch (ex) {
       console.log(ex);
     }
   }
   exports.getFile = getFile;

   /* PLAY */
   function playAudio() {
        
    var player = new audio.TNSPlayer();
    var playerOptions = {
     audioFile: audioName,
     loop: false,
     completeCallback: function () {
         alert('finished playing')
     },
     errorCallback: function (errorObject) {
         alert(JSON.stringify(errorObject));
     },
     infoCallback: function (args) {
         alert(JSON.stringify(args));
     }
    };

    
    player.playFromFile(playerOptions)
     .then(function (res) {
      alert(res);
    })
    .catch(function (errorObject) {
        alert("play error: "+ errorObject);
    });

   }

   exports.playAudio = playAudio;

How to add Seekbar?

Hi, I'm using this plugin for my application and it's working really well, but I want to make a seek bar where users can change the audio track position. Unfortunately I could not see this option in the provided documentation. Is there any way?
A quick reply would be really appreciated.
Thanks
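
For reference, a seek bar can be built from the documented seekTo(seconds) method, the currentTime property, and a core Slider. A hedged sketch follows; it assumes currentTime uses the platform's native unit (seconds on iOS, milliseconds on Android) and that the Slider emits a valueChange event, so verify both on your targets:

import { TNSPlayer } from 'nativescript-audio';
import { Slider, isIOS } from '@nativescript/core';

export async function bindSeekBar(player: TNSPlayer, slider: Slider) {
  const duration = await player.getAudioTrackDuration();
  // iOS reports seconds, Android milliseconds; seekTo expects seconds.
  const totalSeconds = isIOS ? +duration : +duration / 1000;

  slider.minValue = 0;
  slider.maxValue = totalSeconds;

  let syncing = false;

  // Seek when the user drags the slider.
  slider.on('valueChange', () => {
    if (!syncing) {
      player.seekTo(slider.value);
    }
  });

  // Keep the slider in sync with playback every half second.
  setInterval(() => {
    syncing = true;
    slider.value = isIOS ? player.currentTime : player.currentTime / 1000;
    syncing = false;
  }, 500);
}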
