LoopChat Client

freaky friend beats 🎼


What is LoopChat?

LoopChat is a real-time music collaboration application. Think Google Docs, but for making beats, noise loops, etc. with your friends!

Functional Requirements

  1. Real-Time Music Generation: players shall be able to generate sounds and apply treatments in real-time with session-mates.
  2. MIDI Generation: players shall be able to generate MIDI data from their browsers.
  3. MIDI Editor: players shall be able to edit MIDI sequences in real time.
  4. Record Audio: players shall be able to record audio from the computer microphone.
  5. Save/Export: players shall be able to save drafts and export to a variety of audio formats.
  6. Create Samples: players shall be able to upload audio snippets to create samples.
  7. Create MIDI Sequences: players shall be able to create MIDI sequences, which behave similarly to samples.
  8. Create Timelines: players shall be able to create multiple timelines of audio. Timelines can be converted into loops/samples.
  9. Create Tracks: players shall be able to create multiple tracks of audio using any input source (including loops/samples). Tracks can be converted to loops/samples.
  10. Create Loops: players shall be able to create loops composed of a number of tracks or other loops.
  11. Playback: players shall be able to Play, Pause, Stop, FF, RW, Reverse, Scrub timelines/samples.
  12. Track Edit: players shall be able to Cut, Copy, Move, Paste, Crop, Mute, Edit Volume of tracks.
  13. Tempo: players shall be able to edit the tempo and signature of individual timelines. This way, multiple timelines can be composed to generate polyrhythms. Additionally, players shall have fine grained control over quantizing MIDI, samples, and granularity of the visual tempo grid.
  14. Instruments: players shall be able to choose a variety of instruments to play including Sampler, Synths, Drums, Sequencer.
  15. Players: players can join sessions, projects, groups. Players can make a profile to show what groups/projects they work on. Players in a session can chat in real-time.
  16. Publish: players or groups can publish selected works, snippets, or sessions.

Ontology

  • Collaboration Context: Since collaboration is a primary focus of this application, each component must keep track of the players currently operating on it. The collaboration context is a basic way to describe player interactions at varying levels of granularity.
  • Timeline: Playback of all audio must take place in a timeline. Timelines have their own specified duration, tempo, and time signature. Timelines have their own collaboration contexts.
  • Sample: Every snippet of audio is a sample. A sample can be edited, treated with effects, and looped. A sample can be created by importing pre-existing audio, or by converting a timeline/track into a sample. Each sample has its own timeline.
  • MIDI Sequence: Like samples, MIDI Sequences have independent timelines and are the basic chunks of MIDI data. Each MIDI Sequence has an instrument that translates its MIDI data into audio.
  • Track: A track is a linear sequence of audio coming from an input source. Samples can be dragged onto tracks. A track can co-exist with parallel tracks on the same timeline. A single track or a collection of tracks on the same timeline can be converted to samples. MIDI data can be input directly on a track, or a MIDI Sequence can be dragged onto a track.
  • Instrument: For MIDI data (generated by MIDI sequences), an instrument must be assigned to convert the midi data into audio. The most important instruments are
    1. Sample Sequencer: Sequences a bank of audio samples. Generates a MIDI Sequence which is translated into playback of the supplied audio samples. The sample sequencer interface will allow players to generate MIDI Sequences using their specified sample bank.
    2. Synth Sequencer: Sequences MIDI data to play synths. (Could share an interface with the Sample Sequencer.)
    3. Mic/Line In: directly record from a mic or line in.
  • Effects: All audio can be treated with a daisy chain of effects.
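The ontology above can be sketched as plain data. This is a hypothetical shape (factory names and fields are assumptions, not taken from the actual codebase), showing how timelines, samples, and MIDI sequences nest:

```javascript
// Hypothetical factory sketch of the core ontology objects.
// Every playable entity owns a timeline with its own tempo/signature.
const createTimeline = ({duration = 0, tempo = 120, signature = [4, 4]} = {}) => ({
  duration,
  tempo,
  signature,
  collaborationContext: {activePlayers: []}, // who is currently operating on it
});

// A sample wraps a snippet of audio and owns its own timeline.
const createSample = (audioBuffer) => ({
  audioBuffer,
  effects: [], // daisy chain of effects, applied in order
  timeline: createTimeline(),
});

// A MIDI sequence owns a timeline plus the instrument that renders it to audio.
const createMidiSequence = (instrument) => ({
  events: [], // e.g. {note, velocity, time} records
  instrument,
  timeline: createTimeline(),
});
```

Because every sample and MIDI sequence carries its own timeline, per-entity tempo and signature (and hence polyrhythms) fall out of the structure directly.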

UX

  • Active Context: A given player can edit and play back only one active timeline context at a time. This timeline can be a Sequence, a Sample, or a Track Collection (called a Timeline in the UI). Since the active context can be composed of other timelines that may be edited simultaneously by other players, all sub-timelines are frozen at the moment the context is opened. Players should be able to see when sub-contexts have been edited by their collaborators and either merge the changes or maintain a separate branch of the sub-contexts. Additionally, each active context will have access to effects and be able to apply an effects daisy chain.
  • Timelines: this is a collection of Track Collections.
  • Samples: this is a collection of samples.
  • Sequences: this is a collection of sequences.
  • Current Session: List the collaborators (online/offline) and session details.
  • Chat: chat interface for player communication.
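The active-context freezing described above can be sketched with revision counters: snapshot each sub-timeline when the context opens, then compare revisions to detect collaborator edits. Function names and the `revision` field are assumptions for illustration:

```javascript
// Hypothetical sketch of sub-context freezing. When a player opens a
// context, every sub-timeline is snapshotted by (id, revision).
const openContext = (context) => ({
  ...context,
  frozen: context.subTimelines.map((t) => ({id: t.id, revision: t.revision})),
});

// Returns ids of sub-timelines edited by collaborators since the snapshot,
// i.e. the ones the player must choose to merge or branch.
const divergedSubTimelines = (openedContext, currentSubTimelines) =>
  openedContext.frozen
    .filter((snap) => {
      const current = currentSubTimelines.find((t) => t.id === snap.id);
      return current && current.revision > snap.revision;
    })
    .map((snap) => snap.id);
```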

Wireframes




Issues

Sweeping Algorithm To Process Overdubs into Master

Overview

Each recording can have an arbitrary number of overdubs, which are consolidated into a master timeline for the recording. Given that overdubs can either overwrite or overlay, the algorithm must handle all layers efficiently. Since overdubs must be layered over time, we rely on the server to order overdubs by creation time, which means the server is stateful with respect to which client created an overdub first. Since reliance on a connected server might be annoying, we can fall back to an offline mode when there is no connection.
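A minimal sweep over interval boundaries could look like the sketch below. The overdub shape (`{id, start, end, mode}`) is an assumption; the input array is taken to be server-ordered by creation time, oldest first. For each segment between boundaries we walk overdubs newest-first: overlays stack, and the first `overwrite` overdub masks everything created before it:

```javascript
// Sweep sketch: consolidate ordered overdubs into master-timeline segments.
const sweepOverdubs = (overdubs) => {
  // Collect and sort the distinct interval boundaries.
  const bounds = [...new Set(overdubs.flatMap((o) => [o.start, o.end]))]
    .sort((a, b) => a - b);
  const segments = [];
  for (let i = 0; i < bounds.length - 1; i++) {
    const [s, e] = [bounds[i], bounds[i + 1]];
    const layers = [];
    for (let j = overdubs.length - 1; j >= 0; j--) { // newest first
      const o = overdubs[j];
      if (o.start <= s && o.end >= e) {
        layers.push(o.id);
        if (o.mode === 'overwrite') break; // masks all older overdubs
      }
    }
    if (layers.length) segments.push({start: s, end: e, layers});
  }
  return segments;
};
```

For example, an overlay `a` over [0, 4] followed by an overwrite `b` over [2, 4] yields segment [0, 2] with layer `a` and segment [2, 4] with only layer `b`.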

Setup Redux Unit Tests

Overview

There need to be Unit Tests for the Redux logic. This way, if actions, reducers, or selectors are refactored, we know when something has been broken.

Read MIDI Data from USB MIDI Device

Overview

Players should be able to use their own USB connected MIDI devices to generate MIDI data.

Functional Requirements

  • players shall be able to configure which USB connected instrument they wish to use
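In the browser this would go through the Web MIDI API (`navigator.requestMIDIAccess()`); a sketch of the configuration step, where only the pure selection helper is testable outside the browser (the `midiEventBus.handle` wiring below is a hypothetical name, not an existing API):

```javascript
// Pick the input the player configured, falling back to the first available.
const selectInput = (inputs, preferredId) =>
  inputs.find((i) => i.id === preferredId) || inputs[0] || null;

// Browser-only wiring (not runnable in node; handler name is an assumption):
// navigator.requestMIDIAccess().then((access) => {
//   const inputs = [...access.inputs.values()];
//   const input = selectInput(inputs, playerConfiguredId);
//   if (input) input.onmidimessage = (msg) => midiEventBus.handle(msg);
// });
```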

Record User Generated MIDI Data To UI Timeline

Overview

Players shall be able to record MIDI data they generate onto the current timeline.

Functional Requirements

| given | when | then |
| --- | --- | --- |
| no MIDI source is active | MIDI source is selected | activate the selected MIDI source |
| MIDI source is active | MIDI source is selected | activate the selected MIDI source |
| no MIDI data recorded | record button is pressed | cursor moves along the timeline and records input MIDI data |
| MIDI data recorded | record button is pressed | cursor moves along the timeline and overwrites MIDI data in the record interval |

Read MIDI Data From Computer Keyboard

Overview

Players should be able to use their computer keyboards to generate MIDI data.

Functional Requirements

  • asdfghjkl shall be mapped to the white keys of a piano keyboard starting at C and ending at D 🎹
  • wetyuop1 shall be mapped to the black keys of a piano keyboard starting at C# and ending at D#
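The mapping above can be sketched as a lookup table. Anchoring the octave at middle C (MIDI note 60) is an assumption; the spec doesn't pin down the octave:

```javascript
// White keys: C D E F G A B C D, starting at middle C (MIDI 60, assumed).
const WHITE = {a: 60, s: 62, d: 64, f: 65, g: 67, h: 69, j: 71, k: 72, l: 74};
// Black keys: C# D# F# G# A# C# D# across the same span.
const BLACK = {w: 61, e: 63, t: 66, y: 68, u: 70, o: 73, p: 75};

// Returns a MIDI note number, or null for unmapped keys.
const keyToMidiNote = (key) => WHITE[key] ?? BLACK[key] ?? null;
```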

Playback Recorded MIDI Data

Overview

Players shall be able to play back the MIDI data they record, and playback should use the sounds they specify from the server.

Questions

Should playback rely on server-side rendering of audio?

Setup Redux Dev Tools Env

Overview

For debugging, it would be useful to use the Redux dev tools in the browser to look at the store.

Route Midi Events Appropriately With Midi Bus

Overview

Currently, all MIDI events are routed to the midi-middleware MidiEventBus, but we need to be able to:

  1. Process Events: enable/disable processing of MIDI events on a per-device basis through Redux actions
  2. Assign Instrument: assign/unassign processed output of MIDI events to a virtual instrument on a per-device basis through Redux actions
  3. Record: start/stop recording processed MIDI events on a per-device basis through Redux actions

Requirements

Processing Events

| given | when | then |
| --- | --- | --- |
| processing of device A events is disabled | device A events occur | events are ignored |
| processing of device A events is disabled | DEACTIVATE_MIDI_INPUT_DEVICE for device A is dispatched | processing of device A events remains disabled |
| processing of device A events is disabled | ACTIVATE_MIDI_INPUT_DEVICE for device A is dispatched | processing of device A events is enabled |
| processing of device A events is enabled | device A events occur | events are processed AND each event dispatches an action updating the status of the MIDI input in the Redux store |
| processing of device A events is enabled | ACTIVATE_MIDI_INPUT_DEVICE for device A is dispatched | processing of device A events remains enabled |
| processing of device A events is enabled | DEACTIVATE_MIDI_INPUT_DEVICE for device A is dispatched | processing of device A events is disabled |

Assigning Instrument
we assume that the processing of device A events is enabled

| given | when | then |
| --- | --- | --- |
| no instrument assigned to device A | device A events occur | events are sent to no instrument AND there is no audio output |
| no instrument assigned to device A | ASSIGN_INSTRUMENT_TO_MIDI_DEVICE for a device and an instrument is dispatched | a new instrument of the selected type is instantiated and assigned as output |
| instrument A is assigned to device A | ASSIGN_INSTRUMENT_TO_MIDI_DEVICE for device A and instrument B is dispatched | a new instrument B is instantiated and assigned to the output of device A (overwriting the instrument A assignment) |

Recording
we assume that the processing of device A events is enabled

| given | when | then |
| --- | --- | --- |
| recording is not in progress | MIDI_RECORDING_STOPPED for device A is dispatched | recording remains not in progress |
| recording is not in progress | MIDI_RECORDING_STARTED for device A is dispatched | a new recording object is created AND dispatched to the redux store with MIDI_RECORDING_CREATED AND recording is in progress |
| recording is in progress | a device A event occurs | the processed event is stored in the recording object AND the updated recording object is dispatched to the redux store via MIDI_RECORDING_UPDATED |

Notes

As a result of these requirements, and because we cannot store the MidiInput/MidiOutput objects returned from the MidiAccess object in the Redux store, we will have to track the state of the registered MIDI devices in both the Redux store (for the UI) and in the MidiEventBus (for event processing). This violates the single source of truth principle encouraged in Redux applications. It would be nice to find a better solution, but at this point I'm not sure there is one.
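The dual bookkeeping could be sketched as follows (method names and action payloads are assumptions, not the actual MidiEventBus API): the bus keeps the non-serializable device objects, and only plain serializable descriptors ever reach the Redux store.

```javascript
class MidiEventBus {
  constructor(dispatch) {
    this.dispatch = dispatch;        // Redux store's dispatch
    this.devices = new Map();        // id -> {input, enabled}
  }

  register(input) {
    // Keep the live (non-serializable) MidiInput object here only.
    this.devices.set(input.id, {input, enabled: false});
    // Mirror a serializable descriptor into the Redux store for the UI.
    this.dispatch({
      type: 'MIDI_DEVICE_REGISTERED',
      payload: {id: input.id, name: input.name},
    });
  }

  setEnabled(id, enabled) {
    const entry = this.devices.get(id);
    if (entry) entry.enabled = enabled;
  }

  handle(id, event) {
    const entry = this.devices.get(id);
    if (entry && entry.enabled) {
      // route to assigned instrument / active recording here
      return true;
    }
    return false; // events from disabled devices are ignored
  }
}
```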

Timeline With Playback Controls

Overview

Players should be able to interact with a timeline with playback controls.

Functional Requirements

  • play from any point in the timeline
  • pause at any point in the timeline
  • stop from any point in the timeline, bringing the cursor to the beginning of the timeline
  • skip to the beginning and end of the timeline
  • loop the timeline
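The transport behaviour above can be sketched as pure state transitions (field names are assumptions for illustration):

```javascript
// Minimal transport sketch for the playback controls listed above.
const createTransport = (length) => ({state: 'stopped', cursor: 0, length, loop: false});

const play = (t, from = t.cursor) => ({...t, state: 'playing', cursor: from});
const pause = (t) => ({...t, state: 'paused'});                 // cursor stays put
const stop = (t) => ({...t, state: 'stopped', cursor: 0});      // cursor returns to start
const skipToStart = (t) => ({...t, cursor: 0});
const skipToEnd = (t) => ({...t, cursor: t.length});
const toggleLoop = (t) => ({...t, loop: !t.loop});
```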

Design Sequence Timeline

Overview

Players should be able to record, edit, and playback a timeline with MIDI sequence data.

Functional Requirements

  • players shall be able to edit the time signature of the current timeline
  • players shall be able to edit the tempo of the current timeline
  • players shall be able to edit the granularity of the tempo grid
  • players shall be able to generate MIDI data using their keyboard
  • players shall be able to generate MIDI data using a USB MIDI device
  • players shall be able to record MIDI data in a timeline
  • players shall be able to start playback from any point within the timeline
  • players shall be able to pause playback within the timeline
  • players shall be able to skip to the beginning or end of a timeline
  • players shall be able to dynamically increase the timeline length
  • players shall be able to loop the timeline
  • players shall be able to set the start and end markers for the loop interval in the timeline

MIDI events not handled despite MIDI device being connected & active

Overview

When connecting and activating a MIDI device, events are often not handled, and it takes some unplugging and replugging of the USB MIDI device before events are read properly. This was not observed previously on OSX, but it is occurring on Arch Linux using Google Chrome.
