IMediaRecorder

interface NatML.Recorders.IMediaRecorder

This interface defines all common functionality for media recorders. NatCorder includes several implementations of this interface, each of which can record video, audio, or both. Below are the methods provided by this interface.

All recorder methods are thread safe, so they can be used from any thread.

All recorders are designed with a push architecture: the client (you) pushes frames to the recorder as needed. This is different from, say, a screen recorder, which automatically pulls frames for you, whether from the screen or from the microphone.

Frame Size

/// <summary>
/// Recording frame size.
/// </summary>
(int width, int height) frameSize { get; }

NatCorder is mainly designed for recording video. As a result, most recorders will be created with a frame size (width and height) which defines the pixel size of the output video. With this in mind, the IMediaRecorder interface exposes the frameSize property.
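
For illustration, here is a minimal sketch that creates a recorder and reads its frame size, assuming the MP4Recorder implementation included with NatCorder:

using NatML.Recorders;
using UnityEngine;

// Create a recorder for 1280x720 video at 30 frames per second
IMediaRecorder recorder = new MP4Recorder(1280, 720, 30);
// The recorder reports the frame size it was created with
var (width, height) = recorder.frameSize;
Debug.Log($"Recording video at {width}x{height}");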

Committing Video Frames

/// <summary>
/// Commit a video pixel buffer for encoding.
/// The pixel buffer MUST have an RGBA8888 pixel layout.
/// </summary>
/// <param name="pixelBuffer">Pixel buffer to commit.</param>
/// <param name="timestamp">Pixel buffer timestamp in nanoseconds.</param>
void CommitFrame<T> (T[] pixelBuffer, long timestamp) where T : unmanaged;

The recorder accepts video frames as RGBA8888 pixel buffers. This could be a Color32[] provided by Unity's Texture2D.GetPixels32 or WebCamTexture.GetPixels32 methods; a managed byte[] provided by the NatDevice camera preview or an OpenCV matrix; or any other managed numeric array containing data that can be interpreted as an RGBA8888 pixel buffer.

Note that the dimensions of the committed pixel buffer must match the frame size that was used to create the recorder. In other words, the byte size of the pixel buffer must be equal to frameSize.width * frameSize.height * 4.
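
For example, here is a sketch that commits frames from a readable Texture2D, assuming the RealtimeClock implementation included with NatCorder and a texture that matches the recorder's frame size:

using NatML.Recorders.Clocks;
using UnityEngine;

// Clock used to generate zero-based frame timestamps in nanoseconds
IClock clock = new RealtimeClock();
// `texture` must be readable and match the recorder's frame size
Color32[] pixelBuffer = texture.GetPixels32();
recorder.CommitFrame(pixelBuffer, clock.timestamp);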

The CommitFrame method has overloads that take an RGBA8888 pixel buffer in native memory:

/// <summary>
/// Commit a video pixel buffer for encoding.
/// The pixel buffer MUST have an RGBA8888 pixel layout.
/// </summary>
/// <param name="pixelBuffer">Pixel buffer to commit.</param>
/// <param name="timestamp">Pixel buffer timestamp in nanoseconds.</param>
void CommitFrame<T> (NativeArray<T> pixelBuffer, long timestamp) where T : unmanaged;

And:

/// <summary>
/// Commit a video pixel buffer for encoding.
/// The pixel buffer MUST have an RGBA8888 pixel layout.
/// </summary>
/// <param name="nativeBuffer">Pixel buffer in native memory to commit.</param>
/// <param name="timestamp">Pixel buffer timestamp in nanoseconds.</param>
void CommitFrame (void* nativeBuffer, long timestamp);

These overloads are useful for applications that want to avoid garbage collection when recording video frames.
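
For example, a readable Texture2D exposes its raw texture data as a NativeArray, which can be committed without allocating a managed array. A sketch, assuming the texture uses the RGBA32 format and matches the recorder's frame size:

using Unity.Collections;
using UnityEngine;

// Raw texture data for an RGBA32 texture has an RGBA8888 layout
NativeArray<byte> pixelBuffer = texture.GetRawTextureData<byte>();
recorder.CommitFrame(pixelBuffer, clock.timestamp);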

Do not commit raw pointers unless you know what you are doing, as these are much more likely to result in a hard crash if something goes wrong.

Committing Audio Frames

/// <summary>
/// Commit an audio sample buffer for encoding.
/// </summary>
/// <param name="sampleBuffer">Linear PCM audio sample buffer, interleaved by channel.</param>
/// <param name="timestamp">Sample buffer timestamp in nanoseconds.</param>
void CommitSamples (float[] sampleBuffer, long timestamp);

The recorder accepts audio frames as floating-point linear PCM sample buffers.

When the audio frames contain more than one channel (for example, stereo audio), the samples are expected to be interleaved by channel.
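
For example, Unity's OnAudioFilterRead callback delivers floating-point sample buffers that are already interleaved by channel, so they can be committed directly. A sketch, assuming recorder and clock are fields on the behaviour:

// Unity invokes this on the audio thread with interleaved linear PCM samples
void OnAudioFilterRead (float[] data, int channels) {
    recorder.CommitSamples(data, clock.timestamp);
}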

The CommitSamples method has overloads that take sample buffers in native memory:

/// <summary>
/// Commit an audio sample buffer for encoding.
/// </summary>
/// <param name="sampleBuffer">Sample buffer to commit.</param>
/// <param name="timestamp">Sample buffer timestamp in nanoseconds.</param>
void CommitSamples (NativeArray<float> sampleBuffer, long timestamp);

And:

/// <summary>
/// Commit an audio sample buffer for encoding.
/// The sample buffer MUST be a linear PCM floating point buffer interleaved by channel.
/// </summary>
/// <param name="nativeBuffer">Sample buffer in native memory to commit.</param>
/// <param name="sampleCount">Total number of samples in the buffer.</param>
/// <param name="timestamp">Sample buffer timestamp in nanoseconds.</param>
void CommitSamples (float* nativeBuffer, int sampleCount, long timestamp);

The sampleCount parameter should account for the multiple channels of audio present within the buffer. In other words, the byte size of the nativeBuffer must be equal to sampleCount * sizeof(float).

These overloads are useful for applications that want to avoid garbage collection and extra allocations for high performance recording.
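
For example, committing 1024 frames of stereo audio from native memory requires a sampleCount of 1024 * 2 = 2048. A sketch within an unsafe context, where nativeBuffer is assumed to be a float* received from native code:

const int frameCount = 1024;
const int channelCount = 2;
// sampleCount counts individual samples across all channels, not frames
recorder.CommitSamples(nativeBuffer, frameCount * channelCount, clock.timestamp);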

Specifying Frame Timestamps

All frames are committed with a corresponding timestamp. This timestamp must be in nanoseconds. You can either compute the timestamps manually or use an IClock instance. All timestamps are expected to be zero-based, meaning that the very first timestamp for either a video or audio frame must be zero. Failing to meet this requirement will not raise an exception or cause a crash, but it will likely cause drift and other synchronization issues in the resulting media file.
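
For example, here is a sketch of manually computing zero-based timestamps for video recorded at a fixed frame rate (pixelBuffers is a hypothetical array of frames):

const long NanosecondsPerSecond = 1_000_000_000L;
const int frameRate = 30;
for (var i = 0; i < pixelBuffers.Length; i++) {
    // Zero-based: frame 0 lands at timestamp 0, as required
    var timestamp = i * NanosecondsPerSecond / frameRate;
    recorder.CommitFrame(pixelBuffers[i], timestamp);
}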

Some recorders do not need frame timestamps.

Finishing Recording

/// <summary>
/// Finish writing and return the path to the recorded media file.
/// </summary>
Task<string> FinishWriting ();

When you are done committing frames, you can end the recording session with this method. When the method is called, the recorder will complete its recording operations, finalize the media file, then release any resources. The method returns a path to the recorded media file once the recorder is finished with its cleanup operations.

If the recording fails for any reason, the task will raise an exception. That said, a recorder will rarely fail to finish writing successfully.

All recorders write the media file to the application's private documents directory. There is no way to change this behaviour, so if you want the video in a specific place, use the System.IO APIs to move the file where you want it.
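
For example, a sketch that finishes the recording and moves the file to a hypothetical destination:

using System.IO;
using UnityEngine;

var path = await recorder.FinishWriting();
// Hypothetical destination; any location your app can write to works
var destination = Path.Combine(Application.persistentDataPath, "recordings", "recording.mp4");
Directory.CreateDirectory(Path.GetDirectoryName(destination));
File.Move(path, destination);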

On WebGL, the returned path will point to a blob in the browser's memory. As such, it will look something like "blob:http://...".

Do not commit any further frames once FinishWriting has been called. Doing so will typically result in a hard crash.
