Java Sound Resources: FAQ: MIDI Programming

This page presents Questions and Answers related to the Java Sound API.

MIDI Programming

1. MidiDevice general
1.1. Why do getMaxReceivers() and getMaxTransmitters() return -1?
1.2. Why are there MidiDevice instances that provide only Transmitter or only Receiver instances?
1.3. Why does getTransmitter() on the same MidiDevice return different Transmitter objects?
1.4. Can I make a MidiDevice implemented in Java available to other (maybe native) programs?
2. MIDI Input/Output
2.1. Why does MIDI input and output not work with the Sun JDK 1.3/1.4?
2.2. Can I bundle a MIDI IO implementation with my application to ease installation?
2.3. How do I send sysex events?
2.4. I get confusing behaviour when using MidiSystem.getReceiver() and MidiSystem.getTransmitter().
2.5. Why are there two instances of MidiDevice for each MIDI port?
2.6. How can I find out if a MidiDevice instance represents a MIDI IN or a MIDI OUT port?
2.7. Can a MIDI device used by Java Sound be shared with other applications?
2.8. Why are "all notes off", "all sounds off" or other events delivered when a MidiDevice is closed?
3. Sequencer
3.1. Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called?
3.2. Why is the timing of the Sequencer unstable with the Sun JRE, but not with the Sun JDK?
3.3. How can I get notified of note on/off events?
3.4. Can I use a Sequencer to trigger the playback of audio samples/clips?
3.5. How can I find out which Synthesizer is used by the Sequencer?
3.6. How can I find out the track a message originated from when I receive it in a Receiver, ControllerListener or MetaEventListener?
3.7. How can I get a Sequencer that does not play notes on the default Synthesizer (or default MIDI device)?
3.8. Why is the timestamp associated with events emitted by a Sequencer via a Transmitter always -1?
3.9. How can I schedule single MIDI events?
3.10. What is the difference between the "Java Sound Sequencer" and the "Real Time Sequencer"?
3.11. Is the "Real Time Sequencer" available in the JDK 1.4?
3.12. How can I synchronize a Sequencer to an external clock?
4. Sequence Playback
4.1. Why does adding or removing events from a Sequence while it is playing have no effect?
4.2. How can I play RMF files?
4.3. Which method should I use to change the playback tempo: setTempoInBPM(), setTempoInMPQ() or setTempoFactor()?
4.4. Why is the tempo reset at the beginning of the loop when looping with the realtime sequencer?
4.5. How can I control the playback volume of the Sequencer?
4.6. How can I change the instrument when playing a Sequence?
4.7. How can I loop a Sequence?
4.8. How can I find out when a Sequence has completed playing?
4.9. Why is a Sequence "clipped" when listening to end-of-track meta events?
5. Synthesizer
5.1. How can I implement a custom synthesizer or sound producer for Java Sound?
5.2. How can I replace the default synthesizer of Java Sound?
5.3. How can I use a hardware synthesizer?
5.4. Is it possible to use a hardware synthesizer as a Synthesizer object?
5.5. Why can I hear sound from my program that uses a Synthesizer (or Sequencer) if the program is executed by the JDK, but not if it is executed by the JRE?
5.6. Why is there no sound at all when I use the Synthesizer directly?
5.7. Is the "Java Sound Synthesizer" GM compatible?
5.8. How can I change the sound (instrument) with a Synthesizer object?
5.9. Why does calling Synthesizer.programChange(...) have no effect?
5.10. How can I determine whether the "Java Sound Synthesizer" uses its software synthesis engine or a hardware synthesizer?
5.11. Which controllers and sysex messages are implemented in the "Java Sound Synthesizer"?
5.12. Why does Synthesizer.getVoiceStatus() return an empty array?
5.13. Why is there a delay when playing notes on the "Java Sound Synthesizer"?
5.14. How can I reduce the latency of the "Java Sound Synthesizer"?
5.15. How can I control the playback volume of the synthesizer?
5.16. How can I change attack, decay, sustain and release of an instrument?
5.17. What is special about MIDI channel 9/10?
5.18. Is there support for microtonal?
6. Soundbanks
6.1. Can I create my own soundbank?
6.2. Can I use soundbanks in other formats than the Beatnik format with Java Sound?
6.3. How do I load a soundbank (other than the default soundbank)?
6.4. Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null?
6.5. Why does my custom soundbank only work if the default soundbank (soundbank.gm) is available?
6.6. How can I use instruments with a number > 127?
6.7. Which soundbank is loaded as default if there are more than one?
6.8. Why is there no sound for instrument on banks 2 and 3?
6.9. Why does Synthesizer.getAvailableInstruments() return an empty array?
6.10. Is there support for soundfont (sf2) or DLS files?
6.11. Can I obtain a list of instruments available in a soundbank?
7. MIDI files and events
7.1. How are MIDI channels related to the Track class in Java Sound?
7.2. Why do some MIDI files have only NOTE_ON events, but no NOTE_OFF events?
7.3. How do I represent "running status" in MIDI events with Java Sound?
7.4. What is the relation between status, command and channel in ShortMessage?
7.5. How do I create an end of track event for a Sequence constructed by a program?
7.6. How do I create a Sequence (including events) from scratch?
7.7. How can I edit individual events in a Sequence?
7.8. How do I convert a RMF file to a MIDI file?
7.9. How do I convert a type 1 MIDI file to a type 0 MIDI file?
7.10. How can I add a tempo change event to a Sequence?
7.11. How can I add a text/lyric/marker/copyright/trackname meta event to a Sequence?
7.12. How can I assign a name to a track?
7.13. Why do I get an InvalidMidiMessageException when constructing a realtime message?
7.14. How can I search the corresponding note off event for a note on event?
7.15. Does the length of a Sequence in ticks as returned by getTickLength() change as the tempo is changed?
8. Miscellaneous
8.1. Why is the timing of MIDI playback unstable?
8.2. Which MIDI cards can I use with Java Sound?
8.3. Can I connect Java Sound MIDI applications to native MIDI applications?
8.4. How can I convert from audio to MIDI?
8.5. How can I convert from MIDI to audio (or save the output of the synthesizer to a file)?
8.6. Where can I get documentation about RMF?
8.7. Should I use RMF or not?
8.8. Is there XMF support in Java Sound?
8.9. Where can I get a list of the standard MIDI instruments (names and patch numbers)?

1. MidiDevice general

1.1. Why do getMaxReceivers() and getMaxTransmitters() return -1?
1.2. Why are there MidiDevice instances that provide only Transmitter or only Receiver instances?
1.3. Why does getTransmitter() on the same MidiDevice return different Transmitter objects?
1.4. Can I make a MidiDevice implemented in Java available to other (maybe native) programs?
1.1.

Why do getMaxReceivers() and getMaxTransmitters() return -1?

This is correct, though currently undocumented, behaviour. It means that you can retrieve any number of Transmitters / Receivers, limited only by available memory. A MidiDevice that does not support Transmitters / Receivers returns 0 from the respective method. (Matthias)

1.2.

Why are there MidiDevice instances that provide only Transmitter or only Receiver instances?

For MIDI I/O devices, see Why are there two instances of MidiDevice for each MIDI port? for an explanation. Synthesizer implementations usually only have Receiver instances. The following table gives an overview:

MidiDevice type          Transmitters  Receivers
MIDI IN port             unlimited     ---
MIDI OUT port            ---           unlimited
Java Sound Synthesizer   ---           unlimited
Java Sound Sequencer     unlimited     unlimited
Real Time Sequencer      unlimited     unlimited

See also Why do getMaxReceivers() and getMaxTransmitters() return -1? and How can I find out if a MidiDevice instance represents a MIDI IN or a MIDI OUT port? (Matthias)

1.3.

Why does getTransmitter() on the same MidiDevice return different Transmitter objects?

You may want to route messages from the same MidiDevice to multiple destinations. In this case, you need multiple Transmitter instances to connect different Receiver instances to them. Therefore, returning different Transmitter objects is a necessity. (Matthias)
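As a sketch of this, the following (hypothetical) helper wires one device to two destinations; the class and method names are made up for illustration:

```java
import javax.sound.midi.*;

public class MultiTransmitterDemo {
    // Each destination needs its own Transmitter, which is why
    // getTransmitter() returns a new object on each call. 'device' could be
    // a MIDI IN port or a Sequencer.
    public static Transmitter[] wireTwoDestinations(MidiDevice device,
                                                    Receiver first,
                                                    Receiver second)
            throws MidiUnavailableException {
        Transmitter t1 = device.getTransmitter();
        Transmitter t2 = device.getTransmitter();
        t1.setReceiver(first);   // e.g. a Synthesizer's Receiver
        t2.setReceiver(second);  // e.g. a MIDI OUT port's Receiver
        return new Transmitter[] { t1, t2 };
    }
}
```

Remember to close both Transmitters when you are done with them.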

1.4.

Can I make a MidiDevice implemented in Java available to other (maybe native) programs?

The answer is no and yes. For a device to be available to other programs, you need to write a device driver for the operating system. If you manage to write a device driver, you're done. But making a device implemented in Java available from a device driver would mean starting a Java VM from inside the driver, which is hardly feasible.

On the other hand, you typically can restructure your implementation to make it work indirectly. Instead of just implementing a MidiDevice, write a Java program that, for instance, listens to a configurable MIDI IN device and passes the incoming events to your custom synthesizer or MIDI OUT device. Now you can use virtual MIDI ports to make this program's features available to other programs, even native ones. See also Can I connect Java Sound MIDI applications to native MIDI applications? (Matthias)

2. MIDI Input/Output

2.1. Why does MIDI input and output not work with the Sun JDK 1.3/1.4?
2.2. Can I bundle a MIDI IO implementation with my application to ease installation?
2.3. How do I send sysex events?
2.4. I get confusing behaviour when using MidiSystem.getReceiver() and MidiSystem.getTransmitter().
2.5. Why are there two instances of MidiDevice for each MIDI port?
2.6. How can I find out if a MidiDevice instance represents a MIDI IN or a MIDI OUT port?
2.7. Can a MIDI device used by Java Sound be shared with other applications?
2.8. Why are "all notes off", "all sounds off" or other events delivered when a MidiDevice is closed?
2.1.

Why does MIDI input and output not work with the Sun JDK 1.3/1.4?

The availability of MIDI I/O depends on the operating system and the JDK version:

  • Windows: available since 1.4.1. WireProvider provides MIDI I/O, but no Sequencer implementation.

  • Linux: available since 1.5.0. Tritonus has an almost complete MIDI implementation (including Sequencer implementations), based on the ALSA sequencer API. KfuenfMidiDeviceProvider provides MIDI I/O based on the OSS API.

  • Mac OS (X): not available in 1.4.2; plans for 1.5.0 are unknown (see Q: 11). The PLUM-STONE MIDI PROJECT provides MIDI I/O for Mac OS X. It does not provide a Sequencer implementation, but it can be used with Tritonus' pure-java sequencer.

  • Solaris: available since 1.5.0.

Note that availability as listed above means only that you can use MidiDevice instances to access MIDI ports. If you want a Sequencer to play to a MIDI port, you also need an appropriate Sequencer implementation. See Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called? (Matthias)

2.2.

Can I bundle a MIDI IO implementation with my application to ease installation?

In some cases, this may be appropriate. Especially in the case of PLUM-STONE, which is implemented in pure Java, this may be worth considering. See Bob's mail for details. (Matthias)

2.3.

How do I send sysex events?

See the example Deliver a system exclusive message to a MIDI device. (Matthias)
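A minimal sketch of constructing a system exclusive message (the device ID and payload bytes below are example values, here the GM System On message):

```java
import javax.sound.midi.*;

public class SysexDemo {
    // The byte array must start with the sysex status byte 0xF0 and end
    // with the end-of-exclusive byte 0xF7.
    public static SysexMessage buildSysex() throws InvalidMidiDataException {
        byte[] data = {
            (byte) 0xF0,   // start of exclusive
            0x7E,          // non-realtime universal sysex
            0x00,          // device ID (example value)
            0x09, 0x01,    // GM System On payload
            (byte) 0xF7    // end of exclusive
        };
        SysexMessage message = new SysexMessage();
        message.setMessage(data, data.length);
        return message;
    }
}
```

To deliver the message, obtain a Receiver from an opened MidiDevice and call receiver.send(message, -1), where -1 means "deliver immediately".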

2.4.

I get confusing behaviour when using MidiSystem.getReceiver() and MidiSystem.getTransmitter().

In the JDK 1.4.2 and earlier, the implementation of these methods is buggy and inconsistent. In 1.5.0, the semantics are well defined, but also quite complicated. Here are some suggestions for "best practice":

  • Do not use MidiSystem.getReceiver() and MidiSystem.getTransmitter(). Their semantics is too complicated to rely on.

  • Instead, obtain the MidiDevice.Info array, select your desired device (or let the user select it), obtain the device and obtain Receiver and Transmitter instances from it.

  • Open the device before using its transmitters and receivers. It is not necessary to open it before obtaining the Transmitters / Receivers. Close the Transmitters / Receivers when you are finished with them, then close the device. In 1.5, closing the device closes all of its Transmitters / Receivers.

  • If you use a closed receiver or a receiver from a closed device, you will get an exception.

(Matthias)
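The recommended practice above can be sketched as follows (index 0 merely stands in for the user's selection):

```java
import javax.sound.midi.*;

public class DeviceSelection {
    public static void main(String[] args) {
        // List all installed MIDI devices so your code (or the user) can pick one.
        MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
        for (int i = 0; i < infos.length; i++) {
            System.out.println(i + ": " + infos[i].getName()
                    + " (" + infos[i].getDescription() + ")");
        }
        if (infos.length == 0) {
            System.out.println("no MIDI devices installed");
            return;
        }
        try {
            // Obtain the chosen device, open it, then obtain a Receiver.
            MidiDevice device = MidiSystem.getMidiDevice(infos[0]);
            device.open();
            try {
                Receiver receiver = device.getReceiver();
                // ... use it, e.g. receiver.send(message, -1) ...
                receiver.close();
            } finally {
                device.close(); // in 1.5, this also closes remaining R/Ts
            }
        } catch (MidiUnavailableException e) {
            System.out.println("device not available: " + e.getMessage());
        }
    }
}
```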

2.5.

Why are there two instances of MidiDevice for each MIDI port?

In the Sun JDK, there is one instance for input and one for output. Whether to use one instance or two instances per physical port is an implementation decision. On Windows, it is the abstraction of the device drivers that makes it difficult to associate the input and output devices belonging to the same port. A portable application should be prepared to handle both cases.

See also Why are there MidiDevice instances that provide only Transmitter or only Receiver instances? and How can I find out if a MidiDevice instance represents a MIDI IN or a MIDI OUT port? (Matthias)

2.6.

How can I find out if a MidiDevice instance represents a MIDI IN or a MIDI OUT port?

You can use the methods getMaxReceivers() and getMaxTransmitters(). A MidiDevice instance that represents a MIDI IN port has Transmitters, but no Receivers. For MIDI OUT, it's vice versa: Receivers, but no Transmitters. For a code example, see how MidiPlayer handles the -l option.

See also Why are there MidiDevice instances that provide only Transmitter or only Receiver instances? and Why are there two instances of MidiDevice for each MIDI port? (Matthias)
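The check described above can be sketched like this (the class and method names are made up for illustration; note that Sequencers and Synthesizers also appear in the device list):

```java
import javax.sound.midi.*;

public class PortDirection {
    // A MIDI IN port has Transmitters but no Receivers; a MIDI OUT port is
    // the reverse. A return value of -1 means "unlimited".
    public static String classify(MidiDevice device) {
        boolean hasTransmitters = device.getMaxTransmitters() != 0;
        boolean hasReceivers = device.getMaxReceivers() != 0;
        if (hasTransmitters && !hasReceivers) return "MIDI IN port";
        if (!hasTransmitters && hasReceivers) return "MIDI OUT port";
        if (hasTransmitters && hasReceivers) return "bidirectional device";
        return "no I/O";
    }

    public static void main(String[] args) throws MidiUnavailableException {
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            MidiDevice device = MidiSystem.getMidiDevice(info);
            System.out.println(info.getName() + ": " + classify(device));
        }
    }
}
```

To skip sequencers and synthesizers, test the device with instanceof Sequencer and instanceof Synthesizer first.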

2.7.

Can a MIDI device used by Java Sound be shared with other applications?

The answer depends on the operating system, the driver of the soundcard and the Java Sound implementation.

On Windows, shared access is supported if the soundcard driver and the soundcard support it. WDM drivers typically support sharing the MIDI OUT device.

On Linux with the Sun JDK, the ALSA "rawmidi" interface is used, which allows only exclusive open of the device. The MIDI implementation of Tritonus is based on the ALSA "sequencer" interface, which allows shared access to the devices. (Matthias)

2.8.

Why are "all notes off", "all sounds off" or other events delivered when a MidiDevice is closed?

This seems to happen on Windows with certain soundcards. Soundblaster Audigy is reported to have this problem on Windows XP. The suspected reason is that either the device driver or the multimedia subsystem of Windows issues these events (on closing a device, the method midiOutReset() of the Windows API is called). (Matthias)

3. Sequencer

3.1. Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called?
3.2. Why is the timing of the Sequencer unstable with the Sun JRE, but not with the Sun JDK?
3.3. How can I get notified of note on/off events?
3.4. Can I use a Sequencer to trigger the playback of audio samples/clips?
3.5. How can I find out which Synthesizer is used by the Sequencer?
3.6. How can I find out the track a message originated from when I receive it in a Receiver, ControllerListener or MetaEventListener?
3.7. How can I get a Sequencer that does not play notes on the default Synthesizer (or default MIDI device)?
3.8. Why is the timestamp associated with events emitted by a Sequencer via a Transmitter always -1?
3.9. How can I schedule single MIDI events?
3.10. What is the difference between the "Java Sound Sequencer" and the "Real Time Sequencer"?
3.11. Is the "Real Time Sequencer" available in the JDK 1.4?
3.12. How can I synchronize a Sequencer to an external clock?
3.1.

Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called?

This is a bug in the JDK 1.3/1.4. It is fixed in the JDK 1.5.0 with the introduction of the "Real Time Sequencer".

Alternatively, you can use one of Tritonus' three Sequencer implementations:

  • The ALSA sequencer. It is based on a sequencer implemented in native code and running as a kernel module as part of ALSA. Currently, ALSA is only available for Linux. You can use this sequencer even if you have no hardware MIDI I/O: it is possible to use it with Timidity, a software synthesizer running in user space.

  • The MidiShare sequencer. It is based on a sequencer implemented in native code that is part of MidiShare. Since MidiShare is available for Windows, Mac and Linux, this sequencer should run on all of these platforms.

  • The pure-java sequencer. As the name says, this sequencer is implemented solely in Java. In theory, it runs on any platform. In practice, it depends heavily on the implementation of System.currentTimeMillis() (on some operating systems, this call actually has a resolution of only 30 ms) and on the quality of the thread scheduling. On Linux, this sequencer has quite good timing as long as system load is low, but even a running 'top' can kill the timing, and moving windows definitely does. On Windows, several experiments did not lead to satisfying results.

See also How can I get a Sequencer that does not play notes on the default Synthesizer (or default MIDI device)? (Matthias)

3.2.

Why is the timing of the Sequencer unstable with the Sun JRE, but not with the Sun JDK?

If there is no soundbank (default installation of the JRE), the Sequencer cannot use the Synthesizer that it uses by default. Instead, it falls back to using MIDI OUT. This results in bad timing for Java 1.4 and older. The solution is to install a soundbank for the JRE versions 1.4 and older or upgrade to J2SE 5.0. For more details, see Why can I hear sound from my program that uses a Synthesizer (or Sequencer) if the program is executed by the JDK, but not if it is executed by the JRE? and Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null? (Matthias)

3.3.

How can I get notified of note on/off events?

You should use a Sequencer implementation with correct Transmitter/Receiver behaviour (see Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called?). An alternative may be to insert controller or meta events into the sequence. They can be caught by registering a ControllerEventListener or MetaEventListener. For an example of using a Receiver as well as event listeners, see Playing a MIDI file (advanced).

A suggestion from Espen Riskedal: use a meta event such as the 'text' event (see the MIDI specification) and put a string like "NOTEON c-4 56" in its text (pitch c-4, or better just an int number; velocity 56). To prepare the sequence, get the Track objects from it, iterate over all events in the tracks, and add such meta events wherever there are note on/off events. See also How can I add a text/lyric/marker/copyright/trackname meta event to a Sequence? (Matthias)
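The Receiver approach can be sketched as follows (the class name is made up for illustration):

```java
import javax.sound.midi.*;

public class NoteListener implements Receiver {
    // Reports note on/off events. Connect it to one of the Sequencer's
    // Transmitters (in addition to the one feeding the Synthesizer).
    public void send(MidiMessage message, long timeStamp) {
        if (!(message instanceof ShortMessage)) return;
        ShortMessage sm = (ShortMessage) message;
        int command = sm.getCommand();
        // A NOTE_ON with velocity 0 is a note off by convention.
        if (command == ShortMessage.NOTE_ON && sm.getData2() > 0) {
            System.out.println("note on:  key " + sm.getData1()
                    + " velocity " + sm.getData2());
        } else if (command == ShortMessage.NOTE_OFF
                || (command == ShortMessage.NOTE_ON && sm.getData2() == 0)) {
            System.out.println("note off: key " + sm.getData1());
        }
    }

    public void close() {}
}
```

Connect it with sequencer.getTransmitter().setReceiver(new NoteListener());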

3.4.

Can I use a Sequencer to trigger the playback of audio samples/clips?

Yes, you can. Assuming a correct Sequencer implementation, you can add meta events to the sequence and register a MetaEventListener with the Sequencer. When constructing the MetaMessage objects, you can either use a text event (0x01) or a sequencer-specific event (0x7F) as the type of the message. Both allow arbitrary 8-bit data. Note that the text event is different from a lyrics event (0x05), so you can really put your stuff in there. The sequencer-specific events contain a manufacturer id in the first 3 bytes. There are ids intended for local or experimental work (0x7D). Anyway, this is only important if you want to save the sequence including these events to a file and exchange it with others.

An alternative idea is to build a soundbank with the samples you want to play. Then, you can select an instrument and just use ordinary MIDI events to play the samples. (Matthias)
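A sketch of the meta-event approach using sequencer-specific events (type 0x7F); the "PLAY:" payload convention and the class name are invented for this example:

```java
import javax.sound.midi.*;

public class SampleTrigger {
    static final int SEQUENCER_SPECIFIC = 0x7F;

    // Place a trigger event in the track at the given tick.
    public static void addTrigger(Track track, String sampleName, long tick)
            throws InvalidMidiDataException {
        byte[] data = ("PLAY:" + sampleName).getBytes();
        MetaMessage message = new MetaMessage();
        message.setMessage(SEQUENCER_SPECIFIC, data, data.length);
        track.add(new MidiEvent(message, tick));
    }

    // React to the trigger events during playback.
    public static void install(Sequencer sequencer) {
        sequencer.addMetaEventListener(new MetaEventListener() {
            public void meta(MetaMessage event) {
                if (event.getType() == SEQUENCER_SPECIFIC) {
                    String payload = new String(event.getData());
                    // look up the Clip registered under this name and start it
                    System.out.println("trigger: " + payload);
                }
            }
        });
    }
}
```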

3.5.

How can I find out which Synthesizer is used by the Sequencer?

The solution depends on the Java version:

  • In the Sun JDK 1.4.2 and earlier, Sequencer instances also implement the Synthesizer interface. So you can cast your Sequencer object to a Synthesizer and use the methods of Synthesizer. To do this in a portable way, you should check if Synthesizer is really implemented:

    Sequencer seq = ...;
    if (seq instanceof Synthesizer)
    {
        Synthesizer synth = (Synthesizer) seq;
        // now you can use Synthesizer methods safely.
    }
  • In the Sun JDK 1.5.0 and later, Sequencer and Synthesizer are decoupled. In your program you can obtain a Synthesizer instance yourself and connect it to a Sequencer explicitly using a Transmitter / Receiver pair. So you know which Synthesizer is used. In case you obtained a Sequencer that is pre-connected to a Synthesizer, you can find out the Synthesizer by using the method getTransmitters().

  • Tritonus has the same behaviour as the Sun JDK 1.5.0.

See also How can I get a Sequencer that does not play notes on the default Synthesizer (or default MIDI device)? (Matthias)

3.6.

How can I find out the track a message originated from when I receive it in a Receiver, ControllerListener or MetaEventListener?

Currently, there is no way to find this out. There was some discussion about making this information available in some way (see bug #4716399). However, the conclusion was that, first, there is no clean way to augment the API to provide this and, second, it may be difficult to implement. So it is not likely that this functionality will appear any time soon. (Matthias)

3.7.

How can I get a Sequencer that does not play notes on the default Synthesizer (or default MIDI device)?

With the JDK 1.5.0, you can use MidiSystem.getSequencer(false). The "Real Time Sequencer" of 1.5 is no longer an instance of Synthesizer, nor is it implicitly connected to a Synthesizer object. Instead, if you call MidiSystem.getSequencer() or MidiSystem.getSequencer(true), a Sequencer instance is returned that is explicitly connected to the default Synthesizer or default MIDI device using a Transmitter / Receiver pair. See also MidiSystem.getSequencer(boolean).

As an alternative, it is possible to disconnect the default synthesizer explicitly. This can be done by obtaining all Transmitter instances of the Sequencer (see MidiDevice.getTransmitters()) and closing the one that is used as the connection to the synthesizer. Usually, there is only one Transmitter instance, so it's clear which one to close.
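A minimal sketch of that disconnection (the class and method names are made up for illustration):

```java
import javax.sound.midi.*;
import java.util.ArrayList;

public class DisconnectSynth {
    // Close all of the sequencer's Transmitters, dropping the connection
    // to the default Synthesizer (or default MIDI device).
    public static void disconnect(Sequencer sequencer) {
        // Copy the list first: closing a Transmitter removes it from the
        // device's live transmitter list.
        for (Transmitter t : new ArrayList<Transmitter>(sequencer.getTransmitters())) {
            t.close();
        }
    }
}
```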

See also How can I find out which Synthesizer is used by the Sequencer? (Matthias)

3.8.

Why is the timestamp associated with events emitted by a Sequencer via a Transmitter always -1?

The timestamp passed to a Receiver is the information to the Receiver when the MIDI message should be delivered (some hardware MIDI ports have the facility to delay messages and deliver them at a specific time). The Sequencer implementations in the Sun JDK and in Tritonus are real-time sequencers: they emit a message at the time it is intended to be delivered. These Sequencer implementations expect a MIDI device to pass a message immediately. This is the condition expressed by a timestamp value of -1.

If you want to know the time of an event, you can use the methods getMicrosecondPosition() or getTickPosition() of the Sequencer instance that emitted the event. (Matthias)

3.9.

How can I schedule single MIDI events?

There are two common solutions:

  • Use a Sequencer instance with an empty Sequence object that has an end-of-track event far in the future. Then add events to the Sequence object. This approach has its pitfalls; see Why does adding or removing events from a Sequence while it is playing have no effect?

  • Write your own scheduling mechanism in Java, using one of the available high-precision clocks (See Q: 7). This is recommended if you need specific semantics (for instance, a guaranteed reaction time) or if you can't use one of the Sequencer implementations that are required for the approach above.

There is a third solution that is specific to the "Tritonus ALSA Sequencer": it has an undocumented public method that allows enqueuing MIDI events directly into its scheduling queue. (Matthias)
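The "own scheduling mechanism" approach can be sketched with a ScheduledExecutorService (the class name is invented for this example). Note that timing precision is at the mercy of the thread scheduler, so this is not suitable for tight musical timing:

```java
import javax.sound.midi.*;
import java.util.concurrent.*;

public class SingleEventScheduler {
    private final ScheduledExecutorService executor =
            Executors.newSingleThreadScheduledExecutor();
    private final Receiver receiver;

    public SingleEventScheduler(Receiver receiver) {
        this.receiver = receiver;
    }

    // Deliver a single message to the Receiver after the given delay.
    public ScheduledFuture<?> schedule(final MidiMessage message, long delayMillis) {
        return executor.schedule(new Runnable() {
            public void run() {
                receiver.send(message, -1); // -1 = deliver immediately
            }
        }, delayMillis, TimeUnit.MILLISECONDS);
    }

    public void shutdown() {
        executor.shutdown();
    }
}
```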

3.10.

What is the difference between the "Java Sound Sequencer" and the "Real Time Sequencer"?

(Matthias)

3.11.

Is the "Real Time Sequencer" available in the JDK 1.4?

No, it isn't. If possible, switching to the JDK 1.5 is highly recommended. If this is not possible, you can use Tritonus' "Java Sequencer". Alternatively, you can try to take the source code of the "Real Time Sequencer" and port it back to 1.4. Note that there are license issues that limit the redistribution of the results: the SCSL (Sun Community Source License) allows only in-house use; the JRL (Java Research License) allows distribution to members of the java.net community and "in the usually accepted academic manner", but still no general distribution. See also What is the difference between the "Java Sound Sequencer" and the "Real Time Sequencer"? (Matthias)

3.12.

How can I synchronize a Sequencer to an external clock?

This is currently not possible; no existing Sequencer implementation offers such a feature. See also Q: 6.3 (Matthias)

4. Sequence Playback

4.1. Why does adding or removing events from a Sequence while it is playing have no effect?
4.2. How can I play RMF files?
4.3. Which method should I use to change the playback tempo: setTempoInBPM(), setTempoInMPQ() or setTempoFactor()?
4.4. Why is the tempo reset at the beginning of the loop when looping with the realtime sequencer?
4.5. How can I control the playback volume of the Sequencer?
4.6. How can I change the instrument when playing a Sequence?
4.7. How can I loop a Sequence?
4.8. How can I find out when a Sequence has completed playing?
4.9. Why is a Sequence "clipped" when listening to end-of-track meta events?
4.1.

Why does adding or removing events from a Sequence while it is playing have no effect?

The "Java Sound Sequencer" (Sun JDK 1.4.2 and earlier) does not react to changes in the Sequence after it has been started. This is true even if you use Sequencer.setSequence(InputStream) -- this method reads the stream until EOF and then acts like Sequencer.setSequence(Sequence).

The "Real Time Sequencer" (Sun JDK 1.5.0 and later) and Tritonus' Sequencer implementations react to changes in the Sequence while playing. However, the semantics can be difficult:

For good reasons, the behaviour of altering a Sequence while it is played is unspecified. Virtually all sequencers are implemented by putting messages in a queue for scheduling. So removing an event from the Sequence only works if it has not been queued yet. Adding an event only works if the sequencer's queueing point hasn't already passed it. The "Java Sound Sequencer" in the Sun JDK 1.4.2 and earlier is even more restricted: it loads the complete Sequence before starting to play, so it doesn't react to changes in the Sequence at all. The "Tritonus ALSA Sequencer", the "Tritonus Java Sequencer" and the "Real Time Sequencer" in the Sun JDK 1.5.0 have the more typical behaviour described above.

To write a portable program, you should make no assumptions about the behaviour of the sequencer. According to Florian: "Better the evil you know than that which you don't know." (Matthias)

4.2.

How can I play RMF files?

You have to use the following technique:

File file = new File("your_file.rmf");
InputStream inputStream = new BufferedInputStream(
    new FileInputStream(file), 1024);
Sequencer sequencer = ...
sequencer.open();
sequencer.setSequence(inputStream);

Note that the following does not work:

File file = new File("your_file.rmf");
Sequence sequence = MidiSystem.getSequence(file);
Sequencer sequencer = ...
sequencer.open();
sequencer.setSequence(sequence);

The reason for this is that RMF is a proprietary format. That's why Java Sound implementations can't expose the content in a Sequence object.

Also note that the new "Real Time Sequencer" of the JDK 1.5 cannot deal with RMF files. The old "Java Sound Sequencer" is still present in the JDK and used automatically if you try to load a RMF Sequence. See also Should I use RMF or not? and What is the difference between the "Java Sound Sequencer" and the "Real Time Sequencer"? (Matthias)

4.3.

Which method should I use to change the playback tempo: setTempoInBPM(), setTempoInMPQ() or setTempoFactor()?

setTempoInBPM() and setTempoInMPQ() both set an absolute tempo value. The problem with this is that many MIDI files contain tempo change events. If a sequencer encounters such a tempo change event, it sets the tempo to the absolute value given in this event. So if you set the tempo with setTempoInBPM() or setTempoInMPQ(), and then the sequencer processes a tempo change event, your tempo is gone. Therefore, it is recommended to use setTempoFactor(). If you use this method, the tempo is always scaled correctly, even for tempo change events. See also Why is the tempo reset at the beginning of the loop when looping with the realtime sequencer? (Matthias)

4.4.

Why is the tempo reset at the beginning of the loop when looping with the realtime sequencer?

This seems to be a bug. As a workaround, you can change the tempo factor. The tempo factor is not reset on loop iterations. You can use code similar to this:

Sequencer sequencer = ...;

public void setTempo(float fBPM)
{
    float fCurrent = sequencer.getTempoInBPM();
    float fFactor = fBPM / fCurrent;
    sequencer.setTempoFactor(fFactor);
}

See also Which method should I use to change the playback tempo: setTempoInBPM(), setTempoInMPQ() or setTempoFactor()? (Matthias)

4.5.

How can I control the playback volume of the Sequencer?

A Sequencer itself doesn't produce sound, so it doesn't maintain a volume value. What you can do is to control the volume of the Synthesizer that the Sequencer uses to produce sound. This is detailed in How can I control the playback volume of the synthesizer? See also How can I find out which Synthesizer is used by the Sequencer? (Matthias)

4.6.

How can I change the instrument when playing a Sequence?

There are two possibilities:

  • Include program change (and maybe bank change) events into the Sequence. This makes sure the events are sent to whatever output device the Sequencer is using.

  • Detect the output device the Sequencer uses and control it directly. For the first step, see How can I find out which Synthesizer is used by the Sequencer? Then, you can either send MIDI events to the MidiDevice (in any case) or use the methods of MidiChannel (if the output device is a Synthesizer).

(Matthias)
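Both possibilities can be sketched as follows (the class and method names are made up for illustration; program 40 is just an example patch number):

```java
import javax.sound.midi.*;

public class InstrumentChange {
    // First possibility: insert a program change event into the Sequence,
    // so it reaches whatever output device the Sequencer uses.
    public static void addProgramChange(Track track, int channel,
                                        int program, long tick)
            throws InvalidMidiDataException {
        ShortMessage message = new ShortMessage();
        message.setMessage(ShortMessage.PROGRAM_CHANGE, channel, program, 0);
        track.add(new MidiEvent(message, tick));
    }

    // Second possibility: if the output device is a Synthesizer, change the
    // instrument directly via its MidiChannel objects.
    public static void changeDirectly(Synthesizer synth, int channel, int program) {
        MidiChannel[] channels = synth.getChannels();
        channels[channel].programChange(program);
    }
}
```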

4.7.

How can I loop a Sequence?

There are several ways to do this:

  • The obvious approach is to register a MetaEventListener with the Sequencer and wait for the end of track message (meta message 47). Once you receive this message, set the position to 0 and call start() again. The problem with this approach is that there is a small delay each time the Sequencer is restarted.

  • An advanced version of the above is using two Sequencer instances in an alternating fashion. For details, see Swapping of Sequencers in Java Sound. This technique reduces the restarting delay, but does not remove it completely. It also allows changing the content of the Sequence objects easily.

  • With the "Java Sound Sequencer", you can add special marker meta events ("loopstart" and "loopend", type 6) to create an endless loop:

    Track track = ...;
    final int MARKER = 6;
    long loopStartTick = ...;
    long loopEndTick = ...;
    addEvent(track, MARKER, "loopstart".getBytes(), loopStartTick);
    addEvent(track, MARKER, "loopend".getBytes(), loopEndTick);
    
    // ...
    
    private void addEvent(Track track, int type, byte[] data, long tick)
    {
        MetaMessage message = new MetaMessage();
        try
        {
            message.setMessage(type, data, data.length);
            MidiEvent event = new MidiEvent( message, tick );
            track.add(event);
        }
        catch (InvalidMidiDataException e)
        {
            e.printStackTrace();
        }
    }

    The timing of this method is good. However, it does not allow changing the content of the Sequence. Also note that this method no longer works with the new Sequencer implementation in the Sun JDK 1.5.0. The Tritonus Sequencer implementations will support it in the future.

  • Starting with 1.5.0, there are new methods in the Sequencer interface to do looping cleanly. See the methods setLoopStartPoint(), getLoopStartPoint(), setLoopEndPoint(), getLoopEndPoint(), setLoopCount() and getLoopCount().

See also Why is the tempo reset at the beginning of the loop when looping with the realtime sequencer? (Matthias)
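As an illustration of the 1.5.0 looping methods, the following sketch loops a whole MIDI file endlessly (the file name "song.mid" is only a placeholder):

```java
import java.io.File;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequencer;

public class LoopWithSequencer {
    public static void main(String[] args) throws Exception {
        Sequencer sequencer = MidiSystem.getSequencer();
        sequencer.open();
        sequencer.setSequence(MidiSystem.getSequence(new File("song.mid")));
        // Loop the whole sequence; pass tick values to loop only a part of it.
        sequencer.setLoopStartPoint(0);
        sequencer.setLoopEndPoint(-1); // -1 stands for "end of the sequence"
        sequencer.setLoopCount(Sequencer.LOOP_CONTINUOUSLY);
        sequencer.start();
    }
}
```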

4.8.

How can I find out when a Sequence has completed playing?

There is an end-of-track meta event. It has the meta event number 47. To catch it, you can use the following technique:

Sequencer sequencer = ...;
MetaEventListener listener = new MetaEventListener()
{
    public void meta(MetaMessage event)
    {
        if (event.getType() == 47)
        {
            // end of track action here
        }
    }
};
sequencer.addMetaEventListener(listener);

(Matthias)

4.9.

Why is a Sequence "clipped" when listening to end-of-track meta events?

Actually, the Sequence is not "clipped"; rather, the end-of-track event may be sent too early. Java Sound's Track class automatically manages an end-of-track event: such an event is inserted after the last event inserted by the user. Now, if the last (user) event happens to be significantly earlier than the end of the measure, the end-of-track event will occur before the end in musical terms. The solution is to insert a dummy event at the very end of the Track. Alternatively, you can manually insert an end-of-track event at the correct position. In this case, the automatic end-of-track event will be placed behind the one inserted manually, which does no harm. (Matthias)
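A minimal sketch of the dummy-event solution (assuming a resolution of 480 PPQ, so tick 1920 marks the end of one 4/4 bar):

```java
import javax.sound.midi.*;

public class PadTrack {
    /** Adds a harmless text meta event (type 0x01, no data) at endTick, so
        that the automatically managed end-of-track event moves to that tick. */
    public static void padToTick(Track track, long endTick)
            throws InvalidMidiDataException {
        MetaMessage dummy = new MetaMessage();
        dummy.setMessage(0x01, new byte[0], 0);
        track.add(new MidiEvent(dummy, endTick));
    }

    public static void main(String[] args) throws Exception {
        Sequence sequence = new Sequence(Sequence.PPQ, 480);
        Track track = sequence.createTrack();
        // ... add the actual musical events here ...
        padToTick(track, 4 * 480); // pad to the end of one 4/4 bar
        System.out.println(sequence.getTickLength());
    }
}
```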

5. Synthesizer

5.1. How can I implement a custom synthesizer or sound producer for Java Sound?
5.2. How can I replace the default synthesizer of Java Sound?
5.3. How can I use a hardware synthesizer?
5.4. Is it possible to use a hardware synthesizer as a Synthesizer object?
5.5. Why can I hear sound from my program that uses a Synthesizer (or Sequencer) if the program is executed by the JDK, but not if it is executed by the JRE?
5.6. Why is there no sound at all when I use the Synthesizer directly?
5.7. Is the "Java Sound Synthesizer" GM compatible?
5.8. How can I change the sound (instrument) with a Synthesizer object?
5.9. Why does calling Synthesizer.programChange(...) have no effect?
5.10. How can I determine whether the "Java Sound Synthesizer" uses its software synthesis engine or a hardware synthesizer?
5.11. Which controllers and sysex messages are implemented in the "Java Sound Synthesizer"?
5.12. Why does Synthesizer.getVoiceStatus() return an empty array?
5.13. Why is there a delay when playing notes on the "Java Sound Synthesizer"?
5.14. How can I reduce the latency of the "Java Sound Synthesizer"?
5.15. How can I control the playback volume of the synthesizer?
5.16. How can I change attack, decay, sustain and release of an instrument?
5.17. What is special about MIDI channel 9/10?
5.18. Is there support for microtonal tuning?
5.1.

How can I implement a custom synthesizer or sound producer for Java Sound?

You have to implement the interface javax.sound.midi.Synthesizer. In theory, you can use any means you want in this implementation. Often it's convenient to get a SourceDataLine from AudioSystem and use this line to output the generated sound data.

To create instances of your class, there are two possibilities: either you instantiate the class directly in your program, or you have MidiSystem instantiate it. In the latter case, you have to implement an additional class derived from javax.sound.midi.spi.MidiDeviceProvider. Then your application program can just construct an appropriate MidiDevice.Info and call MidiSystem.getMidiDevice(). For details, see the Java Sound Programmer's Guide. (Matthias)

5.2.

How can I replace the default synthesizer of Java Sound?

With the Sun JDK 1.3/1.4, you can't. Typically, what you want to do is instantiate or retrieve a Synthesizer instance and link it to a Sequencer instance with a Transmitter/Receiver pair. In the JDK 1.3/1.4, the sequencer is implicitly linked to a synthesizer. In theory, it should at least be possible to link another synthesizer to the default sequencer. However, since the Transmitter/Receiver mechanism doesn't work as expected (see Why is a Receiver registered to the Sequencer of the JDK 1.3/1.4 never called?), this is not possible.

In the Sun JDK 1.5.0, there is no such implicit link, and the Transmitter/Receiver mechanism works. So you can link your own synthesizers to the sequencers. Additionally, there are system properties to change the default Sequencer, Synthesizer, Receiver and Transmitter. So you can make old programs that just call MidiSystem.getSequencer() or MidiSystem.getSynthesizer() work with an alternative Synthesizer implementation.

Tritonus behaves similarly to the JDK 1.5.0: the Transmitter/Receiver mechanism works. Default provider properties are currently not implemented, but are likely to be implemented soon. (Matthias)
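For illustration, selecting an alternative default Synthesizer via the 1.5.0 system properties could look like the following sketch ("My External Port" is only a placeholder; use a device name as reported by MidiSystem.getMidiDeviceInfo()):

```java
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Synthesizer;

public class DefaultSynthesizerSelection {
    public static void main(String[] args) throws Exception {
        // The value format is "classname#device name"; either part may be
        // omitted. "My External Port" is a placeholder device name.
        System.setProperty("javax.sound.midi.Synthesizer", "#My External Port");
        // MidiSystem consults this property (and the sound.properties file)
        // when resolving the default; if nothing matches, it falls back.
        Synthesizer synth = MidiSystem.getSynthesizer();
        System.out.println(synth.getDeviceInfo().getName());
    }
}
```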

5.3.

How can I use a hardware synthesizer?

Hardware synthesizers on the soundcard typically appear as ordinary MidiDevice instances in the list of devices retrieved via MidiSystem.getMidiDeviceInfo(). See also How can I replace the default synthesizer of Java Sound? and Is it possible to use a hardware synthesizer as a Synthesizer object? (Matthias)
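A short sketch that lists all devices, which should include any hardware synthesizer ports:

```java
import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiSystem;

public class ListMidiDevices {
    public static void main(String[] args) throws Exception {
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            MidiDevice device = MidiSystem.getMidiDevice(info);
            System.out.println(info.getName() + " -- " + info.getDescription()
                + " (max receivers: " + device.getMaxReceivers()
                + ", max transmitters: " + device.getMaxTransmitters() + ")");
        }
    }
}
```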

5.4.

Is it possible to use a hardware synthesizer as a Synthesizer object?

No, using a hardware synthesizer as a Synthesizer object is currently not possible. To understand why this is difficult to implement, think of the soundcard's synthesizer as being the same as an external synthesizer connected to a MIDI port with a MIDI cable, except that there is no cable. It is no problem to send information from the computer to the synthesizer with ordinary MIDI messages (note on, note off, ...). However, requesting information from the synth requires the synthesizer to respond by sending back specific sysex messages (if that is possible at all; I'm not too familiar with these issues). Therefore, it is easy to emulate MidiChannel methods like noteOn(), noteOff() and programChange() by sending the respective MIDI event, but quite difficult to emulate methods like getMaxPolyphony() and getAvailableInstruments().

In the case of the external synthesizer, you would need another cable connecting the synth's MIDI OUT to the computer's MIDI IN to be able to receive such responses. With the synthesizer on the sound card, it is a bit easier because communication is possible over the computer's bus. However, there seems to be no standardized API for this. In ALSA on Linux, there is a hwdep ("hardware dependent") device that abstracts the communication channel, but the protocol is specific to the soundcard chip. And of course nobody is eager to write code for each soundcard.

However, note that it is no problem to access internal as well as external hardware synthesizers as ordinary MIDI devices or ports.

See also How can I replace the default synthesizer of Java Sound? (Matthias)

5.5.

Why can I hear sound from my program that uses a Synthesizer (or Sequencer) if the program is executed by the JDK, but not if it is executed by the JRE?

On Windows, the JRE does not install a soundbank by default. Without a soundbank, there will be no sound from the software synthesizer of the JRE. On the other hand, the JDK comes with a soundbank. The solution is to install a soundbank for the JRE. Download one of the soundbanks from Java Sound Soundbanks and install it as described on this page.

On Linux, the JRE installation is typically a symbolic link into the JDK installation, so this problem does not occur. See also Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null? (Matthias)

5.6.

Why is there no sound at all when I use the Synthesizer directly?

A common pitfall is that the Synthesizer needs to be opened. Not calling open() will result in no sound at all. (Matthias)

5.7.

Is the "Java Sound Synthesizer" GM compatible?

Yes. At least the instrument mapping in the soundbank (soundbank.gm) follows GM System Level 1. (Matthias)

5.8.

How can I change the sound (instrument) with a Synthesizer object?

There are two ways:

  • Send the synth a program change (maybe also a bank select) message via one of its Receiver instances.

  • On a MidiChannel object, call programChange(int program) or programChange(int bank, int program).

(Matthias)
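Both ways can be sketched as follows (program 19 is Church Organ in General MIDI):

```java
import javax.sound.midi.*;

public class ChangeInstrument {
    /** Builds a program change message for the given channel (0-15)
        and program number (0-127). */
    public static ShortMessage programChange(int channel, int program)
            throws InvalidMidiDataException {
        ShortMessage message = new ShortMessage();
        message.setMessage(ShortMessage.PROGRAM_CHANGE, channel, program, 0);
        return message;
    }

    public static void main(String[] args) throws Exception {
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        // Way 1: send a program change message via a Receiver.
        synth.getReceiver().send(programChange(0, 19), -1);
        // Way 2: call programChange() on a MidiChannel.
        synth.getChannels()[0].programChange(19);
        synth.close();
    }
}
```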

5.9.

Why does calling Synthesizer.programChange(...) have no effect?

There are two cases when this can happen:

  • If the Synthesizer instance is not initialized by calling open(), it does not maintain program and bank settings.

  • If the soundbank has no instrument at the requested bank/program position, the current program is not changed.

(Matthias)

5.10.

How can I determine whether the "Java Sound Synthesizer" uses its software synthesis engine or a hardware synthesizer?

If Synthesizer.getDefaultSoundbank() returns null, the software synthesis engine wasn't initialized because no soundbank was available. In this case, the hardware synthesizer (or an external MIDI port) is used. If the return value is not null, you can assume that the software synthesis engine is used. (Matthias)

5.11.

Which controllers and sysex messages are implemented in the "Java Sound Synthesizer"?

It seems that nobody is able to come up with a list of implemented controllers, not even beatnik (from which Sun licensed the engine). (Matthias)

5.12.

Why does Synthesizer.getVoiceStatus() return an empty array?

VoiceStatus is an unimplemented feature. The design of this class makes it hard to implement it in an efficient manner, so it's unlikely it will ever be implemented. (Matthias)

5.13.

Why is there a delay when playing notes on the "Java Sound Synthesizer"?

The "Java Sound Synthesizer" is a software synthesizer. All software synthesizers have such a delay. This is caused by the design of software synthesizers: they have a buffer of audio data, in which the samples of different notes are added up. The length of this buffer determines the delay. So while other software synthesizers have a smaller delay than the "Java Sound Synthesizer", you won't get one without delay. See also How can I reduce the latency of the "Java Sound Synthesizer"? (Matthias)

5.14.

How can I reduce the latency of the "Java Sound Synthesizer"?

Latency is a fundamental problem of all software synthesizers, see Why is there a delay when playing notes on the "Java Sound Synthesizer"?

In the JDK 1.5.0, the latency of the "Java Sound Synthesizer" has been reduced dramatically (from 88 ms to 11 ms). Furthermore, the latency is now constant and can be queried with Synthesizer.getLatency(). So it is possible to compensate for this delay in your program.

If you need a synthesizer without (audible) delay, you can use a hardware synthesizer, either one on the soundcard or an external one connected via a MIDI port. See also How can I use a hardware synthesizer? (Matthias)

5.15.

How can I control the playback volume of the synthesizer?

There are several ways to do this:

  • You can change the volume of each individual note by using a different value for the velocity in the note on message. Note that this might affect the brightness of the sound as well as the volume.

  • You can change the per-channel volume of the synthesizer by setting the respective MIDI controller values (controller 7 for coarse volume and controller 39 for fine volume). See Setting the Volume of Playing Midi for a code example.

  • If the Java Sound implementation you use supports Port lines (see Q: 14.2), you can change the general playback volume of the soundcard. Note that this usually affects the synthesizer as well as audio playback. It does so in any case if you use the "Java Sound Synthesizer", because it plays its sounds via the "Java Sound Audio Engine". If you use a hardware synthesizer of the soundcard, it depends on the soundcard. Some models have a separate volume control for the synthesizer, others don't.

    For external devices, you may be able to use a Universal SysEx Master Volume message.

(Matthias)
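The per-channel volume approach (controller 7) can be sketched like this:

```java
import javax.sound.midi.*;

public class ChannelVolume {
    /** Builds a control change message for controller 7 (coarse channel
        volume); volume must be in the range 0-127. */
    public static ShortMessage volumeMessage(int channel, int volume)
            throws InvalidMidiDataException {
        ShortMessage message = new ShortMessage();
        message.setMessage(ShortMessage.CONTROL_CHANGE, channel, 7, volume);
        return message;
    }

    public static void main(String[] args) throws Exception {
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        Receiver receiver = synth.getReceiver();
        for (int channel = 0; channel < 16; channel++) {
            receiver.send(volumeMessage(channel, 40), -1); // quieter on all channels
        }
        synth.close();
    }
}
```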

5.16.

How can I change attack, decay, sustain and release of an instrument?

Some external MIDI devices react to the controllers 72 (release time) and 73 (attack time). The beatnik engine doesn't implement these controllers. So the only way is to create a custom soundbank. See Can I create my own soundbank? and How do I load a soundbank (other than the default soundbank)? (Matthias)

5.17.

What is special about MIDI channel 9/10?

In the General MIDI specification, MIDI channel 10 (if channels are counted starting with 1) or 9 (if channels are counted starting with 0) is the percussion channel. The percussion channel behaves differently from the other MIDI channels: other channels produce sounds of the same instrument at different pitches for different keys, whereas the percussion channel produces sounds of different percussion instruments for different keys. See General Midi Percussion Key Map.

The key map in the above link specifies the percussion instruments that are available on almost all synthesizers. The "Java Sound Synthesizer" has some additional percussion instruments. Its key range of percussion instruments is 27 to 95 (opposed to 35 to 81 in General Midi). (Matthias)

5.18.

Is there support for microtonal tuning?

The "Java Sound Synthesizer" currently does not support the "MIDI Tuning Standard" (see RFE # 4705306). However, it is possible to use pitch bend events to change the tuning. You can either insert such messages into a MIDI sequence or send them to the synthesizer directly using the MidiChannel interface. (Matthias)
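A sketch of the pitch bend approach: the pitch bend value is a 14-bit number with 8192 as the center (no bend). Assuming the default bend range of +/- 2 semitones, adding 2048 raises the pitch by roughly a quarter tone:

```java
import javax.sound.midi.*;

public class QuarterToneExample {
    /** Builds a pitch bend message; bend is the 14-bit value, 8192 = center. */
    public static ShortMessage pitchBend(int channel, int bend)
            throws InvalidMidiDataException {
        ShortMessage message = new ShortMessage();
        // data1 holds the lower 7 bits, data2 the upper 7 bits of the value.
        message.setMessage(ShortMessage.PITCH_BEND, channel,
                           bend & 0x7F, (bend >> 7) & 0x7F);
        return message;
    }

    public static void main(String[] args) throws Exception {
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        MidiChannel channel = synth.getChannels()[0];
        channel.setPitchBend(8192 + 2048); // about a quarter tone up
        channel.noteOn(60, 93);            // middle C, detuned
        Thread.sleep(1000);
        channel.noteOff(60);
        synth.close();
    }
}
```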

6. Soundbanks

6.1. Can I create my own soundbank?
6.2. Can I use soundbanks in other formats than the Beatnik format with Java Sound?
6.3. How do I load a soundbank (other than the default soundbank)?
6.4. Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null?
6.5. Why does my custom soundbank only work if the default soundbank (soundbank.gm) is available?
6.6. How can I use instruments with a number > 127?
6.7. Which soundbank is loaded as default if there are more than one?
6.8. Why is there no sound for instruments on banks 2 and 3?
6.9. Why does Synthesizer.getAvailableInstruments() return an empty array?
6.10. Is there support for soundfont (sf2) or DLS files?
6.11. Can I obtain a list of instruments available in a soundbank?
6.1.

Can I create my own soundbank?

You can do this with the Beatnik Editor (a commercial product). (Matthias)

6.2.

Can I use soundbanks in other formats than the Beatnik format with Java Sound?

The "Java Sound Synthesizer" of the Sun JDK only supports Beatnik soundbanks. It is possible to write a plug-in in order to support other soundbank formats. However, the plug-in would need to provide a Synthesizer implementation, too. (Matthias)

6.3.

How do I load a soundbank (other than the default soundbank)?

See the example Using custom Soundbanks. (Matthias)

6.4.

Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null?

The default Synthesizer of the Sun JDK has the following behaviour: If no soundbank can be loaded on initialization of the Synthesizer (happening on the call to open()), the internal software synth cannot be used. Therefore, the synth falls back to using a MIDI device. If that MIDI device is the hardware synthesizer of the soundcard, you will hear sounds from that synthesizer. On the other hand, if there is no hardware synthesizer, but a MIDI port, or if the default MIDI out device (as configured in the operating system, "Midi Mapper" on Windows) is a MIDI port, messages will be sent through that MIDI port. If you connect an external synthesizer, you will be able to hear sound from it. (Matthias)

6.5.

Why does my custom soundbank only work if the default soundbank (soundbank.gm) is available?

The reason for this behaviour is similar to the one described in Why am I still able to hear sound although Synthesizer.getDefaultSoundbank() returns null?: since the software synthesizer is not used at all, it cannot load a custom soundbank. For the Sun JDK/JRE 1.4.2, the only solution is to rename the custom soundbank to soundbank.gm and put it into the same directory as your application. Unfortunately, this does not work for applets packaged in a .jar file. (Matthias)

6.6.

How can I use instruments with a number > 127?

The "instrument number" is the index in the list of all instruments of a soundbank (you can get this list with Soundbank.getInstruments() or Synthesizer.getAvailableInstruments()). This number is different from the MIDI program number. The beatnik soundbank is organized like this:

program/bank    bank 0           bank 1
program 0       instrument 0     ---
...             ...              ...
program 27      instrument 27    instrument 128
...             ...              ...
program 87      instrument 87    instrument 188
program 88      instrument 88    ---
...             ...              ...
program 127     instrument 127   ---

As you can see, for instruments < 128, the program number is the same as the instrument number. However, this is no general property; it just happens because instruments on bank 0 appear first in the list of all instruments, and in ascending order (it might as well have been the other way round). The "instrument number" has no meaning in MIDI; it is just an ordinal number in the context of a soundbank. To change the instrument, you have to find out the MIDI program number: from an Instrument object, obtain a Patch object with getPatch(). The Patch object contains the MIDI bank and program number. (Matthias)
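The mapping from instrument numbers to bank/program numbers can be inspected like this:

```java
import javax.sound.midi.*;

public class InstrumentNumbers {
    public static void main(String[] args) throws Exception {
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open(); // instruments are only available after open()
        Instrument[] instruments = synth.getAvailableInstruments();
        for (int i = 0; i < instruments.length; i++) {
            Patch patch = instruments[i].getPatch();
            System.out.println("instrument " + i + ": " + instruments[i].getName()
                + " -> bank " + patch.getBank()
                + ", program " + patch.getProgram());
        }
        synth.close();
    }
}
```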

6.7.

Which soundbank is loaded as default if there are more than one?

The directories $JDKHOME/jre/lib/audio (J2SDK) or $JREHOME/lib/audio (J2RE) are searched for soundbank files.

The following filenames are tried (in this order): soundbank-deluxe.gm, soundbank-mid.gm, soundbank.gm and soundbank-min.gm.

If no soundbank is found, the current directory is searched for the same filenames. (Matthias)

6.8.

Why is there no sound for instruments on banks 2 and 3?

This seems to be a bug. (Matthias)

6.9.

Why does Synthesizer.getAvailableInstruments() return an empty array?

Soundbanks (and therefore instruments) are loaded when the Synthesizer is opened. So you will only get useful information from getAvailableInstruments() after calling open(). (Matthias)

6.10.

Is there support for soundfont (sf2) or DLS files?

Not yet, but it is planned for the future (see RFE #4666912). Note that the hard part is not supporting the file format, but coding a Synthesizer that works with soundbank data in a specific format. A SoundbankReader for these formats could easily be implemented in pure Java and added as an SPI plug-in, but to take advantage of it, a new Synthesizer implementation would be necessary, too. (Matthias)

6.11.

Can I obtain a list of instruments available in a soundbank?

Yes, see the example Displaying instruments in Soundbanks. (Matthias)

7. MIDI files and events

7.1. How are MIDI channels related to the Track class in Java Sound?
7.2. Why do some MIDI files have only NOTE_ON events, but no NOTE_OFF events?
7.3. How do I represent "running status" in MIDI events with Java Sound?
7.4. What is the relation between status, command and channel in ShortMessage?
7.5. How do I create an end of track event for a Sequence constructed by a program?
7.6. How do I create a Sequence (including events) from scratch?
7.7. How can I edit individual events in a Sequence?
7.8. How do I convert a RMF file to a MIDI file?
7.9. How do I convert a type 1 MIDI file to a type 0 MIDI file?
7.10. How can I add a tempo change event to a Sequence?
7.11. How can I add a text/lyric/marker/copyright/trackname meta event to a Sequence?
7.12. How can I assign a name to a track?
7.13. Why do I get an InvalidMidiDataException when constructing a realtime message?
7.14. How can I search the corresponding note off event for a note on event?
7.15. Does the length of a Sequence in ticks as returned by getTickLength() change as the tempo is changed?
7.1.

How are MIDI channels related to the Track class in Java Sound?

They are completely unrelated. MIDI channels are used to group notes that should be played with the same sound, because synthesizers allow changing the instrument on a per-channel basis. Since it is necessary to transfer this information over the MIDI wire (e.g. from a hardware sequencer to a hardware synthesizer), the channel is coded in the MIDI message itself. Tracks, on the other hand, are just a means of logical grouping. The concept originated from MIDI files. Tracks can be used to group MIDI events in arbitrary ways, whatever is convenient to the user. You may decide to have one track for all messages in the same channel, or you may decide to have only one track in total, for all messages. (Matthias)

7.2.

Why do some MIDI files have only NOTE_ON events, but no NOTE_OFF events?

There actually are two ways of saying 'note off' in MIDI. One is a NOTE_OFF event. The other is a NOTE_ON event with a velocity of 0. Both methods are legal by the MIDI specification and are widely used. (Matthias)
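Code that processes note events should therefore handle both forms. A small helper for this:

```java
import javax.sound.midi.ShortMessage;

public class NoteOffCheck {
    /** Returns true if the message ends a note: either a real NOTE_OFF
        message or a NOTE_ON message with velocity 0. */
    public static boolean isNoteOff(ShortMessage message) {
        return message.getCommand() == ShortMessage.NOTE_OFF
            || (message.getCommand() == ShortMessage.NOTE_ON
                && message.getData2() == 0);
    }
}
```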

7.3.

How do I represent "running status" in MIDI events with Java Sound?

You don't need to care about this. In Java Sound, a ShortMessage always has a status byte. Running status is handled at the device driver level for MIDI ports, and in the file readers and writers for MIDI files. (Matthias)

7.4.

What is the relation between status, command and channel in ShortMessage?

Status is the first byte of the MIDI message. For channel messages (note off, note on, poly pressure, control change, program change, channel pressure, pitch bend), it contains the command in the upper 4 bits and the channel in the lower 4 bits. ShortMessage.getChannel() will always return the channel number zero-based (range 0 to 15).

When using ShortMessage.setMessage(command, channel, data1, data2) make sure that command is one of

  • ShortMessage.NOTE_OFF

  • ShortMessage.NOTE_ON

  • ShortMessage.POLY_PRESSURE

  • ShortMessage.CONTROL_CHANGE

  • ShortMessage.PROGRAM_CHANGE

  • ShortMessage.CHANNEL_PRESSURE

  • ShortMessage.PITCH_BEND

...and that channel is between 0 and 15. (Matthias)
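The relation can be verified in code: for a note on message on channel 9, the status byte is the command (0x90) with the channel (9) in the lower four bits:

```java
import javax.sound.midi.ShortMessage;

public class StatusCommandChannel {
    public static void main(String[] args) throws Exception {
        ShortMessage message = new ShortMessage();
        message.setMessage(ShortMessage.NOTE_ON, 9, 60, 93);
        System.out.println(Integer.toHexString(message.getStatus()));  // "99"
        System.out.println(Integer.toHexString(message.getCommand())); // "90"
        System.out.println(message.getChannel());                      // 9
    }
}
```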

7.5.

How do I create an end of track event for a Sequence constructed by a program?

You don't need to take care of the end of track event -- this is automatically handled by the class Track. It creates such an event, and adapts its tick value as new events are added to the track. You only have to make sure that you don't remove this event. (Matthias)

7.6.

How do I create a Sequence (including events) from scratch?

Have a look at the example CreateSequence. (Matthias)

7.7.

How can I edit individual events in a Sequence?

From a Sequence object, you can obtain an array of Track objects. In each track, you can obtain the number of events and each individual MidiEvent object. The MidiEvent references you obtain refer to the original objects in the track, not to copies. So you can directly manipulate the tick value of the event and the content of the MidiMessage object.

(Matthias)

7.8.

How do I convert a RMF file to a MIDI file?

This is not possible with Java Sound. The main reason for this is that the RMF format is not open (see Where can I get documentation about RMF?). However, it should be possible to extract the MIDI part of a RMF file using beatnik's RMF editor. See also How can I play RMF files? and Should I use RMF or not? (Matthias)

7.9.

How do I convert a type 1 MIDI file to a type 0 MIDI file?

See the example Converting MIDI type 1 files to MIDI type 0 files. (Matthias)

7.10.

How can I add a tempo change event to a Sequence?

Tempo change is a meta event (type 0x51, decimal 81). The new tempo value has to be given in microseconds per quarter (MPQ). You can calculate MPQ from BPM: mpq = 60000000 / bpm.

Track track = ...;
final int TEMPO = 0x51;
long tick = ...;
int tempoInMPQ = ...;
byte[] data = new byte[3];
data[0] = (byte)((tempoInMPQ >> 16) & 0xFF);
data[1] = (byte)((tempoInMPQ >> 8) & 0xFF);
data[2] = (byte)(tempoInMPQ & 0xFF);
addEvent(track, TEMPO, data, tick);

// ...

private void addEvent(Track track, int type, byte[] data, long tick)
{
    MetaMessage message = new MetaMessage();
    try
    {
        message.setMessage(type, data, data.length);
        MidiEvent event = new MidiEvent( message, tick );
        track.add(event);
    }
    catch (InvalidMidiDataException e)
    {
        e.printStackTrace();
    }
}

(Matthias)

7.11.

How can I add a text/lyric/marker/copyright/trackname meta event to a Sequence?

Text is a meta event (type 1). Similarly structured events are copyright (type 2), sequence or track name (type 3), track instrument name (type 4), lyric (type 5) and marker (type 6).

Track track = ...;
final int TEXT = 0x01;
long tick = ...;
String text = "...";
addEvent(track, TEXT, text.getBytes(), tick);

// ...

private void addEvent(Track track, int type, byte[] data, long tick)
{
    MetaMessage message = new MetaMessage();
    try
    {
        message.setMessage(type, data, data.length);
        MidiEvent event = new MidiEvent( message, tick );
        track.add(event);
    }
    catch (InvalidMidiDataException e)
    {
        e.printStackTrace();
    }
}

(Matthias)

7.12.

How can I assign a name to a track?

Track names are set by inserting a trackname meta event (type 3) at the beginning of the track. See also How can I add a text/lyric/marker/copyright/trackname meta event to a Sequence? (Matthias)

7.13.

Why do I get an InvalidMidiDataException when constructing a realtime message?

Since realtime messages are not channel messages, it is not intended to use ShortMessage.setMessage(int, int, int, int) for them. This is the case where you get an exception "javax.sound.midi.InvalidMidiDataException: command out of range: 0xfe" (or similar). Use ShortMessage.setMessage(int) to prevent this exception. (Matthias)

7.14.

How can I search the corresponding note off event for a note on event?

Think of the way a synthesizer works: it receives NOTE ON and NOTE OFF messages. A note with a specific pitch value is 'on' until it is turned off by a NOTE OFF message with the same pitch value. So you can find the corresponding NOTE OFF message by searching for the next one with the corresponding pitch value. Keep in mind that it is legal (and common practice) to use NOTE ON messages with velocity 0 instead of NOTE OFF messages. (Matthias)
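The search described above can be sketched as follows (assuming the events live in a Track):

```java
import javax.sound.midi.*;

public class NoteOffSearch {
    /** Returns the index of the note off event (or note on with velocity 0)
        matching the note on at noteOnIndex, or -1 if none is found. */
    public static int findNoteOff(Track track, int noteOnIndex) {
        MidiMessage m = track.get(noteOnIndex).getMessage();
        if (!(m instanceof ShortMessage)) return -1;
        ShortMessage noteOn = (ShortMessage) m;
        for (int i = noteOnIndex + 1; i < track.size(); i++) {
            MidiMessage candidate = track.get(i).getMessage();
            if (!(candidate instanceof ShortMessage)) continue;
            ShortMessage sm = (ShortMessage) candidate;
            boolean samePitch = sm.getChannel() == noteOn.getChannel()
                && sm.getData1() == noteOn.getData1();
            boolean isOff = sm.getCommand() == ShortMessage.NOTE_OFF
                || (sm.getCommand() == ShortMessage.NOTE_ON && sm.getData2() == 0);
            if (samePitch && isOff) return i;
        }
        return -1;
    }
}
```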

7.15.

Does the length of a Sequence in ticks as returned by getTickLength() change as the tempo is changed?

No. The length in ticks does not express absolute time, but time in "musical" terms: ticks are directly related to musical units like beats and bars. The absolute duration of one tick is defined by the sequence's PPQ resolution (or SMPTE setting) and the tempo at that specific tick. So if you increase the tempo, the absolute duration of each tick becomes smaller. Given this, it should be obvious that changing the tempo (or resolution) of a Sequence will not alter its tick length; only adding or removing notes or other events will. However, increasing the tempo will change the absolute length as returned by getMicrosecondLength().

Note also that Sequencer.getTickLength() always returns the same value as Sequence.getTickLength() of the sequencer's sequence (there is a bug in the sequencer implementation of 1.4.2 and earlier that results in slightly different values, but for the new sequencer implementation of 1.5, the values are exactly identical). However, the same is not true for the absolute length in microseconds. The value returned by Sequencer.getMicrosecondLength() is affected by the sequencer's current values of tempo and tempo factor. (Matthias)

8. Miscellaneous

8.1. Why is the timing of MIDI playback unstable?
8.2. Which MIDI cards can I use with Java Sound?
8.3. Can I connect Java Sound MIDI applications to native MIDI applications?
8.4. How can I convert from audio to MIDI?
8.5. How can I convert from MIDI to audio (or save the output of the synthesizer to a file).
8.6. Where can I get documentation about RMF?
8.7. Should I use RMF or not?
8.8. Is there XMF support in Java Sound?
8.9. Where can I get a list of the standard MIDI instruments (names and patch numbers)?
8.1.

Why is the timing of MIDI playback unstable?

Java Sound's synthesizer is a virtual synthesizer: it uses your soundcard's audio output and creates the sounds in software. That is why there is a delay: audio data is quite large compared to MIDI data (e.g. 5 bytes of MIDI data versus 88200 bytes of audio data for one note). Audio data is delivered to the soundcard in blocks of a number of samples. When the blocks are 23 ms long, you have at least 23 ms of delay. Another source of delay is moving these data to the soundcard. Better soundcards, especially PCI soundcards, can do the memory transfer very fast. There is also more software overhead for audio data: obviously, it takes more time to calculate and process audio data than MIDI data.

I cannot give an authoritative answer to the question why the delay changes. It may be due to a block-wise implementation of the internal Java Sound synthesizer: as the audio data is always pushed to the soundcard in blocks, it may be that MIDI events are always synthesized from the start of a block. (I just wrote, for testing purposes, a drum machine that works like this.) This would result in unequal delay. I am not sure about this "theory", as playing a MIDI file seems to have perfect timing (i.e. MIDI notes may start inside a block). Let's see whether the programmers of the synthesizer engine respond. (Florian)

8.2.

Which MIDI cards can I use with Java Sound?

If you are using the Sun JDK on Windows, you can use any card that has a driver for your version of Windows. If you are using the Sun JDK 1.5.0 on Linux or Tritonus/ALSA, you need a soundcard that is supported by the Advanced Linux Sound Architecture (ALSA). Have a look at the ALSA soundcard matrix for supported cards. (Matthias)

8.3.

Can I connect Java Sound MIDI applications to native MIDI applications?

Yes, this is possible if the feature is provided by a device driver at the operating system level. If you have installed such a driver, the devices should appear as MidiDevice instances in Java Sound. On Windows, solutions include MIDI Yoke Junction and the Hubi virtual Midi driver. On Linux, this is possible with the virmidi driver of ALSA. (Matthias)

8.4.

How can I convert from audio to MIDI?

This is not possible at all. There is research on this issue, but it is estimated that it will take some five to ten years until there are results that are useful in practice. (Matthias)

8.5.

How can I convert from MIDI to audio (or save the output of the synthesizer to a file).

There is no built-in way to capture the output of the synthesizer. A workaround may be to connect line out to line in on your soundcard. Then, you can use a TargetDataLine to capture the output.

An alternative may be to call external programs like Timidity. (Matthias)

8.6.

Where can I get documentation about RMF?

You can't. RMF (Rich Music Format) is a proprietary format designed by beatnik. They don't release any specs to the public. See also Should I use RMF or not? (Matthias)

8.7.

Should I use RMF or not?

For certain tasks, RMF is currently the only way to accomplish them in Java Sound. However, RMF has been superseded by the eXtensible Music Format (XMF), an open standard. RMF will still be supported in the Sun JDK 1.5.0, but support may be dropped in future versions. Also note that open source Java Sound implementations will never be able to support RMF, but can support XMF or even more advanced formats like MPEG-4 Structured Audio. See also Is there XMF support in Java Sound? (Matthias)

8.8.

Is there XMF support in Java Sound?

No, not in the JDK 1.5.0 and earlier versions. Newer versions of the beatnik engine support XMF. The Sun JDK uses an older version of the beatnik engine which only supports RMF. Sun plans to add XMF support in the future (see bug # 4666912). See also Should I use RMF or not?

Also note that it is possible to add support for additional file and soundbank formats by implementing a plug-in ("Service Provider") that handles the respective format. (Matthias)

8.9.

Where can I get a list of the standard MIDI instruments (names and patch numbers)?

The official source for the standard MIDI instrument map ("General MIDI") is the MIDI standard as published by the MIDI Manufacturers Association. There are many unofficial (but usually correct) lists on the net. Some links to such resources are in the links section.

It is also possible to obtain a list of instruments available in a soundbank. See Can I obtain a list of instruments available in a soundbank? (Matthias)