MIDI and music support

acorrias

Hi
I've been away for quite a long time.
Does Basic4Android now support the development of music and audio apps?
I would like to develop an app that plays notes (single notes and chords) and lets the user choose among various instruments (piano, accordion, synth, guitar).

Is it possible? Is there any library for this? I've searched but wasn't able to find anything.

thanks in advance
alex
 

stevel05

It is possible to write some musical apps, like those NJDude has listed. If you are looking to create more of a virtual instrument, it is much harder due to the lack of a real-time audio kernel in Android and other OS limitations.

With current hardware and OS, you will be lucky to get latency below 150ms, far too long for a serious virtual instrument.

The beauty of using Basic4Android, though, is that you can try things out in a very short space of time, so don't take my word for it. Have a look at SoundPool, which is likely to give the best results currently, but it requires samples for the instrument sounds. I looked briefly at the JET engine, which didn't look at all instrument-friendly.
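
For example, something like this should be enough to trigger one sample from a button (a rough, untested sketch using the Audio library's SoundPool object; the layout and sample file name are just placeholders for your own):

B4X:
Sub Globals
    Dim SP As SoundPool   'from the Audio library
    Dim NoteC4 As Int     'load id returned by SoundPool
End Sub

Sub Activity_Create(FirstTime As Boolean)
    Activity.LoadLayout("main")                        'layout with a button named btnPlay
    SP.Initialize(8)                                   'allow up to 8 simultaneous streams
    NoteC4 = SP.Load(File.DirAssets, "piano_c4.ogg")   'placeholder sample name
End Sub

Sub btnPlay_Click
    'Play(LoadId, LeftVolume, RightVolume, Priority, Loop, Rate)
    SP.Play(NoteC4, 1, 1, 1, 0, 1)
End Sub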

Unfortunately it is not possible to stream MIDI to the MIDI player; you have to send it a file (again an Android limitation). I have had reasonable success sending single notes as separate files, which would enable the creation of a monophonic instrument. But again, having to create or load a file for each note introduces latency.
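
Roughly along these lines (again an untested sketch; it assumes you have pre-rendered one single-note MIDI file per note, e.g. c4.mid, and uses the Audio library's MediaPlayer):

B4X:
Sub Globals
    Dim MP As MediaPlayer   'from the Audio library
End Sub

Sub Activity_Create(FirstTime As Boolean)
    Activity.LoadLayout("main")   'layout with a button named btnC4
    MP.Initialize
End Sub

Sub btnC4_Click
    PlayNote("c4.mid")   'placeholder name, one pre-rendered MIDI file per note
End Sub

'Loads and plays a single-note file - monophonic, and the Load adds latency
Sub PlayNote(FileName As String)
    MP.Load(File.DirAssets, FileName)
    MP.Play
End Sub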

There are several threads on the forum which may give some more insight if you search for them.
 

moster67

Being a home/hobby musician myself, this is also a subject which I am very keen on.

iOS has many products for musicians. For instance, IK Multimedia has a range of products, including amplifiers etc., which you can find here:

Products - Mobile

Recently they launched a product for Android in their iRig series, namely the iRig Recorder (see IK Multimedia | iRig Recorder for Android). I don't know, but maybe in this case latency is less important than it would be for a virtual amplifier. In any case, if you look at their website, you can see they are looking for Android developers, so maybe they are planning to introduce more products for Android. Here are some of the requirements for the job:

- A minimum of 2 years' experience with Java, C++ and OOP programming, with a solid background in computer science.
- Hands-on experience developing Android applications.
- Android NDK and OpenSL experience represents a plus.
- Basic/advanced knowledge of DSP, MIDI, audio plug-in architecture represents a plus.

Since the NDK is mentioned, maybe they are building their own framework/hardware support to overcome the latency problem and bypass some limitations of Android. Or maybe it is just wishful thinking on my part.
 

stevel05

Informatix said: The main latency is introduced by the screen

Unfortunately this is not the case; take a look here.

Touch does add its own latency, which differs across hardware; that is a problem iOS doesn't suffer from, since the latency of those devices is known as they all use more or less the same screens.

Using the NDK will make little or no difference, as it is the OS kernel that is not optimized for real-time audio.

There have been some improvements to playback and touch latency in Android 4.1 and 4.2, but they are hardware dependent, and recording is still behind.
 

Informatix

stevel05 said: Unfortunately this is not the case; take a look here.

In fact, there are two different problems. One is with the screen, and apart from in Microsoft's laboratories (source), no one has created, or plans to create, hardware that solves it. Even Apple screens suffer from this problem.
The second problem is more common (it's the same problem on a PC): the delay between an input signal and an output signal through the audio layers. Google has been working on it and, since Jelly Bean 4.1, this latency has been reduced to a usable level. These improvements need suitable hardware, which will only be available when such devices reach the market, but we can already start to create audio effect processors or audio recorders. There is no longer a major problem (it never was one under iOS).
 

stevel05

Unfortunately I don't have a 4.1+ device to try it on yet. I did see a quoted latency of 80-90ms, which is still quite high, although it depends on exactly what use it's being put to.

I did see one effects processor on the market which is currently getting bad reviews because of its latency. Until 4.1+ devices are the norm, it is probably not worth releasing anything, but that doesn't stop us testing and discussing these areas in preparation.
 

acorrias

So is there a way to load a soundfont (a piano, a guitar) and play one or more chords? Is there some library that simplifies the job?

I think it would be simple to create music apps without sound playback (chord tabs, chord keyboards), but I would like to have some GUI object that plays a sound.


thanks in advance
alex
 

stevel05

Have a look at SoundPool. You'll need to load a few samples for each instrument and play them through touch events from buttons, or through touch and the Gesture library.

No, there is not a simple way to do it, but it's not that difficult; it just takes a little reading and planning.
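
Something along these lines should get you started (a rough, untested sketch using the Audio library's SoundPool; the layout and sample file names are placeholders for your own recordings):

B4X:
Sub Globals
    Dim SP As SoundPool                  'from the Audio library
    Dim PianoC, PianoE, PianoG As Int    'load ids for one instrument's samples
End Sub

Sub Activity_Create(FirstTime As Boolean)
    Activity.LoadLayout("main")                        'layout with a button named btnCMajor
    SP.Initialize(10)                                  'allow several overlapping streams
    'placeholder sample names - one recording per note, added to the Files tab
    PianoC = SP.Load(File.DirAssets, "piano_c4.ogg")
    PianoE = SP.Load(File.DirAssets, "piano_e4.ogg")
    PianoG = SP.Load(File.DirAssets, "piano_g4.ogg")
End Sub

'A C major chord is just the three samples triggered from the same event
Sub btnCMajor_Click
    SP.Play(PianoC, 1, 1, 1, 0, 1)   'Play(LoadId, LeftVol, RightVol, Priority, Loop, Rate)
    SP.Play(PianoE, 1, 1, 1, 0, 1)
    SP.Play(PianoG, 1, 1, 1, 0, 1)
End Sub

A second instrument is then just another set of samples and load ids, so switching instruments simply means playing from a different set.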
 