This app is meant to be, or to become, a serious alternative to proprietary VJing software, without requiring you to install anything other than a decent web browser (which you probably already have).

It focuses on live performance and interactivity: the VJ can either code the visuals or control them with MIDI controllers. It can also be extended to respond to social media posts.

Controller

Open the controller to start.

Layers

Every layer has a type which defines what can be done with it (canvas for 2D, threejs for 3D, ...).

Scripts

The scripts receive several *"pseudo globals"* which depend on the type of layer they are bound to. For instance, most layers have utility functions, and the canvas layer has additional functions based on the Canvas API.

Some common functions:

  • min() equivalent to Math.min().
  • max() equivalent to Math.max().
  • sin() equivalent to Math.sin().
  • tan() equivalent to Math.tan().
  • cos() equivalent to Math.cos().
  • random() equivalent to Math.random().
  • PI equivalent to Math.PI.
  • PI2 equivalent to Math.PI * 2.
  • and so forth
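
These shorthands keep trigonometry-heavy animation code terse. A minimal sketch of how they might be used (outside Fiha these pseudo globals do not exist, so the snippet falls back to their Math equivalents for illustration):

```js
// Fallbacks so the sketch also runs outside Fiha.
const sin = globalThis.sin || Math.sin;
const cos = globalThis.cos || Math.cos;
const PI2 = globalThis.PI2 || Math.PI * 2;

// A point on a circle, based on a 0..1 progress value.
function circlePoint(progress, radius = 100) {
  return {
    x: cos(progress * PI2) * radius,
    y: sin(progress * PI2) * radius,
  };
}

// At a quarter turn, x is ~0 and y is ~radius.
const p = circlePoint(0.25);
```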

Setup

The setup script of a layer is meant to prepare for, and simplify, the animation script. It is run when the layer is instantiated.
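
For instance, a setup script might prepare data once that the animation script then reuses on every frame. A hypothetical sketch (the particle fields are made up for illustration; `random` falls back to Math.random outside Fiha):

```js
// Fallback so the sketch also runs outside Fiha.
const random = globalThis.random || Math.random;

// Prepare 50 particles once; the animation script only moves and draws them.
const particles = [];
for (let i = 0; i < 50; i++) {
  particles.push({
    x: random(),                      // positions stored as 0..1 ratios
    y: random(),
    speed: 0.001 + random() * 0.004,  // per-frame vertical speed
  });
}
```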

Animation

The animation script will be called every time a frame can be rendered by the browser.
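
A sketch of the kind of per-frame logic an animation script might contain — the `step` helper is hypothetical, not part of Fiha's API, and coordinates are assumed to be 0..1 ratios:

```js
// Move a particle down by its speed and wrap it around
// when it leaves the bottom of the layer.
function step(particle) {
  const next = { ...particle, y: particle.y + particle.speed };
  if (next.y > 1) next.y -= 1; // wrap back to the top
  return next;
}
```

In a canvas layer, the animation script would then draw each particle with the Canvas API functions the layer provides.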

Signals

You can use signals in your scripts to add interactivity. For example, if you want to react to horizontal mouse movement, add a read() function call as follows:

```js
const mouseX = read('mouseX');
```

The first argument of the read() function is the name of the signal; the second (optional) argument is the default value returned by the function.
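
A minimal sketch of that call shape (outside Fiha `read` does not exist, so it is stubbed here to simply return the default value):

```js
// Stub for illustration: inside Fiha, `read` returns the live signal value.
const read = globalThis.read || ((name, defaultValue) => defaultValue);

const mouseX = read('mouseX', 0);  // 0 until the signal has a value
const custom = read('my-signal');  // undefined: no default was given
```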

Mouse

Mouse movement is only captured while the cursor hovers the controller display.

```js
const mouseX = read('mouseX', 0);
const mouseY = read('mouseY', 0);
```

Keyboard

Key presses are only captured when the editor is not focused.

```js
const keyA = read('a', 0);
const keyShiftA = read('a-s', 0);
```
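
For example, a key signal can be used to toggle between two states (`read` is stubbed here because it only exists inside Fiha; the color names are arbitrary):

```js
// Stub for illustration: inside Fiha, `read` returns the live signal value.
const read = globalThis.read || ((name, defaultValue) => defaultValue);

// Switch the fill color while the `a` key is pressed.
const keyA = read('a', 0);
const fillColor = keyA ? 'tomato' : 'teal';
```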

Sound

The controller has an audio tab in which you can tune your input

![image](https://user-images.githubusercontent.com/65971/34534096-3aff73d8-f0bd-11e7-9253-41ae0d9c25fb.png)
The audio settings tab

and use sound in the scripts as follows:

```js
const frq = read('frequencies', []);
const vol = read('volume', []);
```
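
For example, the frequencies signal could drive the bars of a simple equalizer. A sketch, assuming values in the 0..255 range (as a Web Audio AnalyserNode produces); `read` is stubbed outside Fiha:

```js
// Stub for illustration: inside Fiha, `read` returns the live signal value.
const read = globalThis.read || ((name, defaultValue) => defaultValue);

// Normalize each frequency bin to a 0..1 bar height.
const frequencies = read('frequencies', [0, 128, 255]);
const barHeights = frequencies.map((value) => value / 255);
```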

MIDI

WebMIDI is currently only supported in Google Chrome, but it seems that Firefox (with a plugin) could also support it (not tested yet).

For now the supported devices are:

  • KORG INC.
    • KP3
    • nanoKONTROL2
  • AKAI professional LLC
    • LPD8
  • Focusrite A.E. Ltd
    • Launchpad Mini

WebSocket

Using WebSocket as a signal source is in early development. It will allow external sources (like touch or orientation events on mobile phones or tablets) to be communicated as signals to the controller.
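
A speculative sketch of the idea (the actual message format is not documented, so the `{ name, value }` shape, helper, and URL below are assumptions):

```js
// Hypothetical: package an external event as a named signal message.
function toSignalMessage(name, value) {
  return JSON.stringify({ name, value });
}

// In a mobile browser you might then forward orientation events, e.g.:
// const ws = new WebSocket('wss://example.test/signals'); // hypothetical URL
// window.addEventListener('deviceorientation', (e) => {
//   ws.send(toSignalMessage('orientation-alpha', e.alpha));
// });
```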

App wide shortcuts

| Keys | Description |
| --- | --- |
| CTRL + S | Saves the current setup (the name of the current setup can be found on the right side of the app toolbar) |
| CTRL + SHIFT + S | Opens the storage dialog to save a setup |
| CTRL + O | Opens the storage dialog to load a setup |
| CTRL + R | Reloads the current setup |
| CTRL + P | Toggles the animation |
| CTRL + D | Opens the display window |

Embeddable

The demo at the top of this page is an example of how Fiha can be embedded in a page with an iframe.

```html
<iframe src="https://zeropaper.gitlab.io/fiha/embedded/" style="width:100%;min-height:70vh;margin:0;padding:0;border:none"></iframe>
```

Local installation

Clone the repository locally, install the dependencies and run the "live" script:

```sh
git clone https://github.com/zeropaper/fiha.git
cd fiha
npm i
npm run live
```

Development

Fork the project, clone your forked repository locally, install the dependencies and start the development server:

```sh
# replace with your fork's repository URL if necessary
git clone https://github.com/zeropaper/fiha.git
cd fiha
npm i
npm start
```