Saturday, June 19, 2010

Using your new Bluecherry MPEG-4 codec card and driver...

Now that the dust has settled and people are taking notice of the new driver for Bluecherry's MPEG-4 codec cards, here's a quick How-To for using it.

You will notice that there are two types of v4l2 devices created for each card: one device for the display port, which produces uncompressed YUV, and one device for each input, which produces compressed video in either MPEG-4 or MJPEG.

We'll start with the display port device. When the driver loads, a display port is created as the first device for each card. You will see something like this in the dmesg output:

solo6010 0000:03:01.0: PCI INT A -> GSI 22 (level, low) -> IRQ 22
solo6010 0000:03:01.0: Enabled 2 i2c adapters
solo6010 0000:03:01.0: Initialized 4 tw28xx chips: tw2864[4]
solo6010 0000:03:01.0: Display as /dev/video0 with 16 inputs (5 extended)
solo6010 0000:03:01.0: Encoders as /dev/video1-16
solo6010 0000:03:01.0: Alsa sound card as Softlogic0

This is for a 16-port card. The output for a 4-port card would show "Encoders as /dev/video1-4", and an 8-port card would show /dev/video1-8.

The display port allows you to view and configure what is shown on the video out port of the card. The device has several inputs, depending on which card you have installed:

  • 4-port: 1 input per port and 1 virtual input for all 4 inputs in 4-up mode.
  • 8-port: 1 input per port and 2 virtual inputs for 4-up on inputs 1-4 and 5-8 respectively.
  • 16-port: 1 input per port, 4 virtual inputs for 4-up on inputs 1-4, 5-8, 9-12 and 13-16, and 1 virtual input for 16-up on all inputs (5 virtual inputs total).

You do not have to open this device for the video output of the card to work. If you open the device, set the input to input 2, and close it (without viewing any of the video), it will continue to show that input on the video out of the card. So you can change inputs simply by using v4l2's ioctls.

This is useful if you want a live display on a CRT and a simple program that rotates through the inputs (or multi-up virtual inputs) at a few-second intervals.
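
Here is a minimal sketch of such a rotator, assuming the display node is /dev/video0. It uses only the standard VIDIOC_ENUMINPUT and VIDIOC_S_INPUT ioctls and never reads any video:

/* rotate-inputs.c: cycle the display port through its inputs every
 * few seconds. A sketch only; assumes the display node is /dev/video0.
 * Build: gcc -o rotate-inputs rotate-inputs.c */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    struct v4l2_input inp;
    int fd, count = 0;

    fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video0");
        return 1;
    }

    /* Count the inputs (physical + virtual multi-up) the card exposes */
    for (;;) {
        memset(&inp, 0, sizeof(inp));
        inp.index = count;
        if (ioctl(fd, VIDIOC_ENUMINPUT, &inp) < 0)
            break;
        printf("input %u: %s\n", inp.index, (char *)inp.name);
        count++;
    }
    if (count == 0)
        return 1;

    /* Rotate; the selection sticks even though we never read any video */
    for (int i = 0; ; i = (i + 1) % count) {
        if (ioctl(fd, VIDIOC_S_INPUT, &i) < 0)
            perror("VIDIOC_S_INPUT");
        sleep(5);
    }
}

Since the selection sticks after close(), you could just as well run something like this once with a fixed input instead of looping.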

You can still use vlc, mplayer or whatever to view this device (you can open it multiple times).

Now for the encoder devices. There's obviously one device for each physical input on the card. The driver will allow you to record MPEG-4 and MJPEG from the same device (but you must open it twice, once for each feed). The video format cannot be changed once recording has started, so if you open the device for MPEG-4 at full D1 resolution and 30fps, that's also what you're going to get if you open a simultaneous record for MJPEG.
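
As a rough sketch of how that setup looks on one encoder node (the fourcc names here are my assumption; check VIDIOC_ENUM_FMT on your driver version to see what it actually reports):

/* set-codec.c: pick the codec and size on one encoder node before
 * reading. Sketch only: whether the MPEG-4 feed is reported as
 * V4L2_PIX_FMT_MPEG or V4L2_PIX_FMT_MPEG4 may depend on the driver
 * version, so check VIDIOC_ENUM_FMT on yours. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int open_encoder(const char *dev, unsigned int fourcc)
{
    struct v4l2_format fmt;
    int fd = open(dev, O_RDWR);

    if (fd < 0)
        return -1;

    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.pixelformat = fourcc;       /* MPEG-4 or MJPEG */
    fmt.fmt.pix.width = 704;                /* full D1 (NTSC) */
    fmt.fmt.pix.height = 480;
    fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;

    /* This must happen before the first read(); once recording has
     * started, the resolution and frame rate are locked for both feeds. */
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
        perror("VIDIOC_S_FMT");

    return fd;
}

int main(void)
{
    int mpeg_fd  = open_encoder("/dev/video1", V4L2_PIX_FMT_MPEG);
    int mjpeg_fd = open_encoder("/dev/video1", V4L2_PIX_FMT_MJPEG);

    /* ... read() compressed frames from each descriptor ... */
    return (mpeg_fd < 0 || mjpeg_fd < 0);
}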

However, it's good to note here that the MJPEG feed will automatically skip frames if you are reading it too slowly. This allows you to pipe the output to a network connection (e.g. MJPEG over HTTP) with no worry of the remote connection being overloaded on bandwidth.
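
For illustration, a bare-bones single-client version of MJPEG over HTTP could look like this. The device path, the port and the assumption that each read() returns exactly one complete JPEG image are all mine, so treat it as a starting point rather than a finished server:

/* mjpeg-http.c: serve one MJPEG encoder node to a single HTTP client
 * as multipart/x-mixed-replace. Sketch only; because the MJPEG feed
 * drops frames for a slow reader, a slow link just sees a lower
 * frame rate. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    static unsigned char frame[256 * 1024];
    char head[128];
    int one = 1;

    int vfd = open("/dev/video2", O_RDONLY);  /* an encoder set up for MJPEG */
    if (vfd < 0) {
        perror("open encoder");
        return 1;
    }

    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(8080);
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &one, sizeof(one));
    if (bind(srv, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
        listen(srv, 1) < 0) {
        perror("bind/listen");
        return 1;
    }

    int cli = accept(srv, NULL, NULL);
    if (cli < 0) {
        perror("accept");
        return 1;
    }

    /* Standard MJPEG-over-HTTP response header */
    const char *hdr = "HTTP/1.0 200 OK\r\n"
                      "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n\r\n";
    write(cli, hdr, strlen(hdr));

    for (;;) {
        /* Assumption: one read() == one complete JPEG image */
        ssize_t n = read(vfd, frame, sizeof(frame));
        if (n <= 0)
            break;
        int h = snprintf(head, sizeof(head),
                         "--frame\r\nContent-Type: image/jpeg\r\n"
                         "Content-Length: %zd\r\n\r\n", n);
        if (write(cli, head, h) < 0 || write(cli, frame, n) < 0 ||
            write(cli, "\r\n", 2) < 0)
            break;
    }
    close(cli);
    return 0;
}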

However, this isn't so for MPEG-4. If you are too slow at recording (not likely), it is possible to fall behind the card's internal buffer. I was not able to make this happen while writing full frames to disk across 44 records (4 cards of 16, 16, 8 and 4 ports).

Unlike any card supported by v4l2 before this one, the Bluecherry cards produce containerless MPEG-4 frames. Most v4l2 applications expect some sort of MPEG-2 stream, such as program or transport. Since these programs do not expect raw MPEG-4 frames, I don't know of any that are capable of playing the encoders directly (much less recording from them). You can do something simple like 'cat /dev/video1' and somehow pipe it to vlc (I haven't tested this), or write a program that just writes the frames to disk (I have tested this; most programs can play the raw m4v files produced by the driver).

However, since most people will want to record, the easiest approach is to write the video frames straight out to disk.
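
A minimal dump loop is all it takes; the device path and output name below are placeholders:

/* dump-m4v.c: write the raw MPEG-4 elementary stream from one encoder
 * straight to disk. No container; just the frames as the card
 * produced them. */
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>

int main(int argc, char **argv)
{
    const char *dev = argc > 1 ? argv[1] : "/dev/video1";
    const char *out = argc > 2 ? argv[2] : "capture.m4v";
    static unsigned char buf[256 * 1024];

    int vfd = open(dev, O_RDONLY);
    int ofd = open(out, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (vfd < 0 || ofd < 0) {
        perror("open");
        return 1;
    }

    for (;;) {
        /* Each read hands back the next chunk of compressed video;
         * appending the chunks to a file yields a playable .m4v */
        ssize_t n = read(vfd, buf, sizeof(buf));
        if (n <= 0)
            break;
        if (write(ofd, buf, n) != n) {
            perror("write");
            break;
        }
    }
    close(vfd);
    close(ofd);
    return 0;
}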

Now on to the audio. The cards produce what is known as G.723, which is a voice codec typically found on phone systems (especially VoIP).

Since ALSA currently doesn't have a format for G.723, the driver shows it as unsigned 8-bit PCM audio. However, I can assure you that it isn't. I have sent a patch that was included in alsa-kernel (hopefully getting synced to mainline soon). But this only defines the correct format; it doesn't change the way you handle it at all.

You must convert G.723-24 (3-bit samples at 8 kHz) yourself. The example program I provide in my next post will show you how to do this, as well as how to convert it to MP2 audio, and record all of this to a container format on disk for later playback.
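
In the meantime, here is a sketch of just the capture side using alsa-lib. The card name, the subdevice numbering and the U8/8 kHz parameters are assumptions based on the description above, and decode_g723_24() is only a placeholder for a real G.723-24 decoder:

/* g723-capture.c: pull the raw G.723 byte stream for one input off the
 * card's ALSA capture device.
 * Build: gcc -o g723-capture g723-capture.c -lasound */
#include <stdio.h>
#include <alsa/asoundlib.h>

/* Placeholder: expand packed 3-bit G.723-24 samples to 16-bit PCM */
static void decode_g723_24(const unsigned char *in, size_t len)
{
    /* ... */
}

int main(void)
{
    snd_pcm_t *pcm;
    unsigned char buf[960];             /* raw G.723 bytes from the card */

    /* hw:Softlogic0,0,0 = card Softlogic0, device 0, subdevice 0
     * (assumes one capture subdevice per input, so subdevice 0 = input 1) */
    if (snd_pcm_open(&pcm, "hw:Softlogic0,0,0",
                     SND_PCM_STREAM_CAPTURE, 0) < 0) {
        fprintf(stderr, "cannot open capture device\n");
        return 1;
    }

    /* The driver advertises U8 mono at 8 kHz even though the bytes
     * are really G.723; we just take them as-is. */
    if (snd_pcm_set_params(pcm, SND_PCM_FORMAT_U8,
                           SND_PCM_ACCESS_RW_INTERLEAVED,
                           1, 8000, 1, 500000) < 0) {
        fprintf(stderr, "cannot set hw params\n");
        return 1;
    }

    for (;;) {
        snd_pcm_sframes_t n = snd_pcm_readi(pcm, buf, sizeof(buf));
        if (n < 0)
            n = snd_pcm_recover(pcm, n, 0);
        if (n < 0)
            break;
        decode_g723_24(buf, n);         /* one "frame" is one byte here */
    }
    snd_pcm_close(pcm);
    return 0;
}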

7 comments:

  1. What profile is being used for compression, or is this tweakable?

    Will this card work in gstreamer if we add the container at that time?:

    v4l2src device=X ! decodebin ! ffmpegcolorspace ! ffenc_mpeg4 ! mp4mux ! filesink location=output.mp4

  2. Or even without the decoding?:

    gst-launch v4l2src device=X ! mp4mux ! filesink location=output.mpg

  3. It's MPEG4 Simple profile @L1/2/3, so it's pretty basic (not tweakable though).

    I'm not sure how gstreamer works, but if it can accept raw MPEG4 video frames, then it should be able to play the compressed stream. It would be in m4v format (no container).

  4. This is awesome!
    Do you know how hard it is to find these things which even remotely work under Linux? I am incredibly impressed.

  5. You say "The example program I provide in my next post will show you how to do this, as well as how to convert it to MP2 audio, and record all of this to a container format on disk for later playback."

    I can't seem to find this post? - Can you please link me?

  6. Really great stuff Ben. Also please note the following syntax that I lifted from the ffmpeg-devel mailing list for decoding the audio. Can you provide some samples of using the ioctl to set the inputs and also how you access the mp4 stream directly?

    # Thanks to Zalewa PL for the code
    # Creates a named pipe
    mkfifo /tmp/audio_raw_stream

    # Extract audio data from the ALSA device. Decode it from g723 to PCM.
    # NOTE: Make sure this runs in background. You can't close this
    # pipeline or the streaming from ffmpeg will stop.
    arecord -Dplughw:1,0,1 -traw | decode-g72x -l -3 > /tmp/audio_raw_stream

    # ffmpeg now reads from the named pipe instead of the ALSA device
    # and decodes audio properly.
    ffmpeg -f s16le -acodec pcm_s16le -ac 1 -ar 8000 -i /tmp/audio_raw_stream -f v4l2 -i /dev/video1 -r 29.97 -b:v 800 -qmax 51 -async 1 -ar 11025 -ac 1 -ab 24k -acodec libmp3lame -vcodec libx264 -f flv rtmp://localhost/live/**

  7. Is this link the one you're looking for? lizard.bluecherry.net/~bcollins/bc-record.c
