ffme.common
Enumerates the different Closed-Captioning Colors
No color
The white color
The white transparent color
The green color
The green transparent color
The blue color
The blue transparent color
The cyan color
The cyan transparent color
The red color
The red transparent color
The yellow color
The yellow transparent color
The magenta color
The magenta transparent color
The white italics color
The white italics transparent color
The background transparent color
The foreground black color
The foreground black underline color
Enumerates the Closed-Captioning misc commands
No command
The resume command
The backspace command
The alarm off command
The alarm on command
The clear line command
The roll up2 command
The roll up3 command
The roll up4 command
The start caption command
The start non caption command
The resume non caption command
The clear screen command
The new line command
The clear buffer command
The end caption command
Defines Closed-Captioning Packet types
The unrecognized packet type
The null pad packet type
The XDS class packet type
The misc command packet type
The text packet type
The mid row packet type
The preamble packet type
The color packet type
The charset packet type
The tabs packet type
Enumerates the different Closed-Captioning Styles
The none style
The white style
The white underline style
The green style
The green underline style
The blue style
The blue underline style
The cyan style
The cyan underline style
The red style
The red underline style
The yellow style
The yellow underline style
The magenta style
The magenta underline style
The white italics style
The white italics underline style
The white indent0 style
The white indent0 underline style
The white indent4 style
The white indent4 underline style
The white indent8 style
The white indent8 underline style
The white indent12 style
The white indent12 underline style
The white indent16 style
The white indent16 underline style
The white indent20 style
The white indent20 underline style
The white indent24 style
The white indent24 underline style
The white indent28 style
The white indent28 underline style
Defines Closed-Captioning XDS Packet Classes
The none XDS Class
The current start XDS Class
The current continue XDS Class
The future start XDS Class
The future continue XDS Class
The channel start XDS Class
The channel continue XDS Class
The misc start XDS Class
The misc continue XDS Class
The public service start XDS Class
The public service continue XDS Class
The reserved start XDS Class
The reserved continue XDS Class
The private start XDS Class
The private continue XDS Class
The end all XDS Class
Represents a set of Closed Captioning Tracks
in a stream of CC packets.
Initializes a new instance of the class.
The closed captions.
Gets all the CC packets as originally provided in the constructor.
The CC1 Track Packets
The CC2 Track Packets
The CC3 Track Packets
The CC4 Track Packets
Represents a 3-byte packet of closed-captioning data in EIA-608 format.
See: http://jackyjung.tistory.com/attachment/499e14e28c347DB.pdf
Initializes a new instance of the class.
The timestamp.
The source.
The offset.
Initializes a new instance of the class.
The timestamp.
The header.
The d0.
The d1.
Gets the original packet data.
Gets the first of the two-byte packet data
Gets the second of the two-byte packet data
Gets the timestamp this packet applies to.
Gets the NTSC field (1 or 2).
0 for unknown/null packet
Gets the channel. 0 for any, 1 or 2 for specific channel toggle.
0 just means to use what a prior packet had specified.
Gets the type of the packet.
Gets the number of tabs, if the packet type is of Tabs
Gets the Misc Command, if the packet type is of Misc Command
Gets the Color, if the packet type is of Color
Gets the Style, if the packet type is of Mid Row Style
Gets the XDS Class, if the packet type is of XDS
Gets the Preamble Row Number (1 through 15), if the packet type is of Preamble
Gets the Style, if the packet type is of Preamble
Gets the text, if the packet type is of text.
Returns a string that represents this instance.
A string that represents this instance.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. The return value has these meanings: less than zero, this instance precedes the other object in the sort order; zero, this instance occurs in the same position in the sort order as the other object; greater than zero, this instance follows the other object in the sort order.
Checks that the header byte starts with 11111b (5 ones binary)
The data.
True if the header has markers; otherwise, false.
Determines whether the valid flag of the header byte is set.
The data.
true if [is header valid flag set] [the specified data]; otherwise, false.
Gets the NTSC field type (1 or 2).
Returns 0 for unknown.
The data.
The field type
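The three header checks above all inspect bits of the first cc_data byte. A minimal sketch of that bit layout, assuming the common CEA-708 cc_data framing (five marker bits, a valid flag, and a two-bit field/type code); the function names here are illustrative, not the library's API:

```python
def header_has_markers(data: int) -> bool:
    # The top five bits of the header byte must all be ones (11111b).
    return (data & 0b1111_1000) == 0b1111_1000

def is_valid_flag_set(data: int) -> bool:
    # The cc_valid flag sits just below the marker bits.
    return (data & 0b0000_0100) != 0

def ntsc_field(data: int) -> int:
    # The two lowest bits encode the type: 00 -> field 1, 01 -> field 2.
    # Anything else is not an NTSC caption field; return 0 for unknown.
    cc_type = data & 0b0000_0011
    return {0b00: 1, 0b01: 2}.get(cc_type, 0)
```

Packets failing the marker or valid-flag checks are treated as null/unknown rather than parsed further.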
Determines whether the data is null padding
The d0.
The d1.
true if [is empty channel data] [the specified d0]; otherwise, false.
Drops the parity bit from the data byte.
The input.
The byte without a parity bit.
Converts an ASCII character code to an EIA-608 char (in Unicode)
The input.
The charset char.
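The two helpers above are simple byte transforms. A sketch, assuming the standard parity encoding of EIA-608 data bytes (parity in the most significant bit) and a handful of the well-known basic-charset substitutions; the real substitution table is larger and the names here are illustrative:

```python
def drop_parity_bit(value: int) -> int:
    # EIA-608 data bytes carry a parity bit in the most significant bit;
    # masking it off yields the 7-bit character code.
    return value & 0x7F

# A few code points where the EIA-608 basic charset deviates from ASCII
# (illustrative subset; the full table has more entries).
EIA608_OVERRIDES = {0x2A: 'á', 0x5C: 'é', 0x5E: 'í', 0x5F: 'ó', 0x60: 'ú'}

def to_eia608_char(code: int) -> str:
    # Strip parity first, then substitute where the charset differs.
    code = drop_parity_bit(code)
    return EIA608_OVERRIDES.get(code, chr(code))
```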
Implements the logic to close a media stream.
Initializes a new instance of the class.
The media element.
Executes this command.
Represents a command to be executed against an instance of the MediaElement
Initializes a new instance of the class.
The command manager.
Type of the command.
Gets the associated parent command manager
Gets the type of the command.
Gets a value indicating whether this command is marked as completed.
Gets the task that this command will run.
Gets a value indicating whether this instance is running.
true if this instance is running; otherwise, false.
Marks the command as completed.
Executes the code for the command asynchronously
The awaitable task
Executes the command Synchronously.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Returns a string that represents this instance.
A string that represents this instance.
Performs the actions that this command implements.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a single point of contact for media command execution.
Initializes a new instance of the class.
The media element.
Gets the number of commands pending execution.
Gets the core platform independent player component.
Gets a value indicating whether commands can be executed.
Returns false if an Opening or Closing Command is in progress.
true if this instance can execute commands; otherwise, false.
Opens the specified URI.
This command gets processed in a threadpool thread asynchronously.
The URI.
The asynchronous task
Closes the specified media.
This command gets processed in a threadpool thread asynchronously.
Returns the background task.
Starts playing the open media URI.
The awaitable command
Pauses the media.
The awaitable command
Pauses and rewinds the media
This command invalidates all queued commands
The awaitable command
Seeks to the specified position within the media.
This command is a queued command
The position.
Sets the playback speed ratio.
This command is a queued command
The target speed ratio.
Processes the next command in the command queue.
This method is called in every block rendering cycle.
Gets the pending count of the given command type.
The t.
The amount of commands of the given type
Enqueues the command for execution.
The command.
Outputs the state of the queue
The operation.
if set to true [output empty].
Clears the command queue.
Enumerates the different available Media Command Types
The open command
The seek command
The play command
The pause command
The stop command
The close command
The set speed ratio command
Implements the logic to open a media stream.
Initializes a new instance of the class.
The manager.
The source.
Gets the source uri of the media stream.
Performs the actions that this command implements.
Implements the logic to pause the media stream
Initializes a new instance of the class.
The manager.
Performs the actions that this command implements.
Implements the logic to start or resume media playback
Initializes a new instance of the class.
The media element.
Performs the actions that this command implements.
Implements the logic to seek on the media stream
Initializes a new instance of the class.
The media element.
The target position.
Gets or sets the target position.
The target position.
Performs the actions that this command implements.
A command to change speed ratio asynchronously
Initializes a new instance of the class.
The manager.
The speed ratio.
The target speed ratio
Performs the actions that this command implements.
Implements the logic to pause and rewind the media stream
Initializes a new instance of the class.
The media element.
Performs the actions that this command implements.
Contains audio format properties essential
to audio processing and resampling in FFmpeg libraries
The standard output audio spec
Initializes static members of the class.
Prevents a default instance of the class from being created.
Initializes a new instance of the class.
The frame.
Gets the channel count.
Gets the channel layout.
Gets the samples per channel.
Gets the audio sampling rate.
Gets the sample format.
Gets the length of the buffer required to store
the samples in the current format.
Creates a source audio spec based on the info in the given audio frame
The frame.
The audio parameters
Creates a target audio spec using the sample quantities provided
by the given source audio frame
The frame.
The audio parameters
Determines if the audio specs are compatible between them.
They must share format, channel count, layout and sample rate
a.
The b.
True if the parameters are compatible, false otherwise.
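The compatibility rule above (same format, channel count, layout and sample rate) amounts to a four-field value comparison; the field names below are illustrative, not the actual AudioParams members:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AudioSpec:
    sample_format: str
    channel_count: int
    channel_layout: int
    sample_rate: int

def are_compatible(a: AudioSpec, b: AudioSpec) -> bool:
    # All four properties must match; if any differs, the resampler
    # parameters need to be recreated before processing more samples.
    return (a.sample_format == b.sample_format
            and a.channel_count == b.channel_count
            and a.channel_layout == b.channel_layout
            and a.sample_rate == b.sample_rate)
```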
An AVDictionary management class
To detect redundant Dispose calls
Initializes a new instance of the class.
Initializes a new instance of the class.
The other.
Gets the number of elements in the dictionary
The count.
Gets or sets the value with the specified key.
The key.
The entry
Converts the AVDictionary to a regular dictionary.
The dictionary to convert from.
The converted dictionary
A wrapper for the av_dict_get method
The dictionary.
The key.
if set to true [match case].
The Entry
Fills this dictionary with a set of options
The other dictionary (source)
Gets the first entry. Null if no entries.
The entry
Gets the next entry based on the provided prior entry.
The prior entry.
The entry
Determines if the given key exists in the dictionary
The key.
if set to true [match case].
True or False
Gets the entry given the key.
The key.
if set to true [match case].
The entry
Gets the value with specified key.
The key.
The value
Sets the value for the specified key.
The key.
The value.
Sets the value for the specified key.
The key.
The value.
if set to true [don't overwrite].
Removes the entry with the specified key.
The key.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
An AVDictionaryEntry wrapper
Initializes a new instance of the class.
The entry pointer.
Gets the key.
Gets the value.
Provides a set of utilities to perform logging, text formatting,
conversion and other handy calculations.
True when libraries were initialized correctly.
Gets the libraries path. Only filled when initialized correctly.
Gets the bitwise FFmpeg library identifiers that were loaded.
Registers FFmpeg library and initializes its components.
It only needs to be called once but calling it more than once
has no effect. Returns the path that FFmpeg was registered from.
This method is thread-safe.
The override path.
The bitwise flag identifiers corresponding to the libraries.
Returns true if it was a new initialization and it succeeded. False if there was no need to initialize
as there is already a valid initialization.
When ffmpeg libraries are not found
Gets the FFmpeg error message based on the error code
The code.
The decoded error message
Converts a byte pointer to a UTF8 encoded string.
The pointer to the starting character
The string
Defines FFmpeg library metadata and access.
It allows for the loading of individual libraries.
The load lock preventing libraries to load at the same time.
Initializes static members of the class.
Initializes a new instance of the class.
The name.
The version.
The flag identifier.
Gets all the libraries as a collection.
Gets the AVCodec library.
Gets the AVFormat library.
Gets the AVUtil library.
Gets the SWResample library.
Gets the SWScale library.
Gets the AVDevice library.
Gets the AVFilter library.
Gets the flag identifier.
Gets the name of the library.
Gets the base path from where the library was loaded.
Returns null if it has not been loaded.
Gets the library version.
Gets the pointer reference to the library.
Gets a value indicating whether the library has already been loaded.
Gets the load error code. 0 for success.
Loads the library from the specified path.
The base path.
True if the registration was successful
When library has already been loaded.
Defines the library names as constants
A lock manager for FFmpeg libraries
The register lock
Keeps track of the unmanaged and managed locking structures for the FFmpeg libraries to use.
The registration state
Gets a value indicating whether the lock manager has registered.
Gets the FFmpeg lock manager callback.
Example: ffmpeg.av_lockmgr_register(FFLockManager.LockOpCallback);
Registers the lock manager. If it has been registered it does not do it again.
This method is thread-safe.
Manages FFmpeg Multithreaded locking
The mutex.
The op.
0 for success, 1 for error
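FFmpeg's legacy av_lockmgr_register callback receives an operation code (create, obtain, release, destroy) and a mutex reference, and must return 0 on success. A minimal sketch of that dispatch, using Python's threading lock in place of the unmanaged mutex; the constant names mirror FFmpeg's AVLockOp values but this is an illustration, not the library's code:

```python
import threading

# Mirrors FFmpeg's AVLockOp enumeration order.
AV_LOCK_CREATE, AV_LOCK_OBTAIN, AV_LOCK_RELEASE, AV_LOCK_DESTROY = range(4)

def lock_op_callback(mutex: list, op: int) -> int:
    # mutex is a one-element container standing in for the double
    # pointer FFmpeg passes to the native callback.
    if op == AV_LOCK_CREATE:
        mutex[0] = threading.Lock()
    elif op == AV_LOCK_OBTAIN:
        mutex[0].acquire()
    elif op == AV_LOCK_RELEASE:
        mutex[0].release()
    elif op == AV_LOCK_DESTROY:
        mutex[0] = None
    else:
        return 1  # unknown operation: signal an error
    return 0
```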
A queue-based logger that automatically starts a background timer that
constantly empties the queue at low priority.
Initializes static members of the class.
Gets the FFmpeg log callback method.
Example: ffmpeg.av_log_set_callback(LoggingWorker.FFmpegLogCallback);
Starts to listen to FFmpeg logging messages.
This method is not thread-safe.
Logs the specified message. This is the generic logging mechanism available to all classes.
The sender.
Type of the message.
The message.
sender
When sender is null
Logs a block rendering operation as a Trace Message
if the debugger is attached.
The media engine.
The block.
The clock position.
Index of the render.
Logs the specified message. This is the way ffmpeg messages are logged.
Type of the message.
The message.
Log message callback from ffmpeg library.
The p0.
The level.
The format.
The vl.
A reference counter to keep track of unmanaged objects
The synchronization lock
The current reference counter instance
The instances
The types of tracked unmanaged types
The packet
The frame
The filter graph
The SWR context
The codec context
The SWS context
Gets the singleton instance of the reference counter
Gets the number of instances by location.
Adds the specified unmanaged object reference.
The t.
The r.
The location.
Removes the specified unmanaged object reference
The PTR.
Removes the specified unmanaged object reference.
The unmanaged object reference.
Adds the specified packet.
The packet.
The location.
Adds the specified context.
The context.
The location.
Adds the specified context.
The context.
The location.
Adds the specified codec.
The codec.
The location.
Adds the specified frame.
The frame.
The location.
Adds the specified filtergraph.
The filtergraph.
The location.
A reference entry
A time measurement artifact.
Initializes a new instance of the class.
The clock starts paused and at the 0 position.
Gets or sets the clock position.
Gets a value indicating whether the clock is running.
Gets or sets the speed ratio at which the clock runs.
Sets a new position value atomically
The new value that the position property will hold.
Starts or resumes the clock.
Pauses the clock.
Sets the clock position to 0 and stops it.
The speed ratio is not modified.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
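The clock described above can be modeled as a stopwatch plus an accumulated offset, where the reported position advances at the speed ratio while running. A minimal sketch of that behavior (an illustration of the idea, not the actual implementation):

```python
import time

class RealTimeClock:
    def __init__(self):
        # The clock starts paused and at the 0 position.
        self._offset = 0.0        # seconds accumulated before the last pause
        self._started_at = None   # monotonic timestamp; None while paused
        self.speed_ratio = 1.0

    @property
    def position(self) -> float:
        if self._started_at is None:
            return self._offset
        elapsed = time.monotonic() - self._started_at
        return self._offset + elapsed * self.speed_ratio

    def play(self):
        # Starts or resumes the clock.
        if self._started_at is None:
            self._started_at = time.monotonic()

    def pause(self):
        # Freezes the position by folding elapsed time into the offset.
        if self._started_at is not None:
            self._offset = self.position
            self._started_at = None

    def reset(self):
        # Sets the position to 0 and stops; the speed ratio is not modified.
        self._offset = 0.0
        self._started_at = None
```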
Provides audio sample extraction, decoding and scaling functionality.
Holds a reference to the audio resampler
This resampler gets disposed upon disposal of this object.
Used to determine if we have to reset the scaler parameters
Initializes a new instance of the class.
The container.
Index of the stream.
Gets the number of audio channels.
Gets the audio sample rate.
Gets the bits per sample.
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Return the updated output frame
input
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
The media frame
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Destroys the filtergraph releasing unmanaged resources.
Computes the frame filter arguments that are appropriate for the audio filtering chain.
The frame.
The base filter arguments
If necessary, disposes the existing filtergraph and creates a new one based on the frame arguments.
The frame.
avfilter_graph_create_filter
or
avfilter_graph_create_filter
or
avfilter_link
or
avfilter_graph_parse
or
avfilter_graph_config
Represents a wrapper for an unmanaged FFmpeg audio frame
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets the pointer to the unmanaged frame.
Releases unmanaged and - optionally - managed resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A single codec option along with a stream specifier.
Initializes a new instance of the class.
The spec.
The key.
The value.
Gets or sets the stream specifier.
Gets or sets the option name
Gets or sets the option value.
The get format callback
Initializes static members of the class.
Prevents a default instance of the class from being created.
A dictionary containing all Accelerators by pixel format
Gets the dxva2 accelerator.
Gets the CUDA video accelerator.
Gets the name of the HW accelerator.
Gets a value indicating whether the frame requires the transfer from
the hardware to RAM
Gets the hardware output pixel format.
Gets the type of the hardware device.
Attaches a hardware device context to the specified video component.
The component.
Throws when unable to initialize the hardware device
Detaches and disposes the hardware device context from the specified video component
The component.
Downloads the frame from the hardware into a software frame if possible.
The input hardware frame gets freed and the return value will point to the new software frame
The codec context.
The input.
if set to true [comes from hardware]; otherwise, hardware decoding was not performed.
The frame downloaded from the device into RAM
Failed to transfer data to output frame
Gets the pixel format.
Port of (get_format) method in ffmpeg.c
The codec context.
The pixel formats.
The real pixel format that the codec will be using
Represents a media component of a given media type within a
media container. Derived classes must implement frame handling
logic.
Holds a reference to the Codec Context.
Holds a reference to the associated input context stream
Related to issue 94, looks like FFmpeg requires exclusive access when calling avcodec_open2()
Contains the packets pending to be sent to the decoder
The packets that have been sent to the decoder. We keep track of them in order to dispose them
once a frame has been decoded.
Detects redundant, unmanaged calls to the Dispose method.
Holds total bytes read in the lifetime of this object
Initializes a new instance of the class.
The container.
Index of the stream.
container
The container exception.
Finalizes an instance of the class.
Gets the media container associated with this component.
Gets the type of the media.
Gets the index of the associated stream.
Gets the component's stream start timestamp as reported
by the start time of the stream.
Gets the duration of this stream component.
If there is no such information it will return TimeSpan.MinValue
Gets the current length in bytes of the
packet buffer. Limit your Reads to something reasonable before
this becomes too large.
Gets the number of packets in the queue.
Decode packets until this number becomes 0.
Gets the total amount of bytes read by this component in the lifetime of this component.
Gets the ID of the codec for this component.
Gets the name of the codec for this component.
Gets the bitrate of this component as reported by the codec context.
Returns 0 for unknown.
Gets the stream information.
Clears the pending and sent Packet Queues releasing all memory held by those packets.
Additionally it flushes the codec buffered packets.
Sends a special kind of packet (an empty packet)
that tells the decoder to enter draining mode.
Pushes a packet into the decoding Packet Queue
and processes the packet in order to try to decode
1 or more frames.
The packet.
Decodes the next packet in the packet queue in this media component.
Returns the decoded frames.
The received Media Frames
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Return the updated output frame
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Determines whether the specified packet is a Null Packet (data = null, size = 0)
These null packets are used to read multiple frames from a single packet.
The packet.
true if [is empty packet] [the specified packet]; otherwise, false.
Creates a frame source object given the raw FFmpeg subtitle reference.
The raw FFmpeg subtitle pointer.
The media frame
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
The media frame
Releases the existing codec context and clears and disposes the packet queues.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Receives 0 or more frames from the next available packet in the Queue.
This sends the first available packet in the queue to the decoder
and passes the decoded frames (if any) to their corresponding
ProcessFrame method.
The list of frames
Represents a set of Audio, Video and Subtitle components.
This class is useful in order to group all components into
a single set. Sending packets is automatically handled by
this class. This class is thread safe.
The internal Components
The synchronize lock
Provides a cached array to the components backing the All property.
To detect redundant Dispose calls
Initializes a new instance of the class.
Gets the available component media types.
Gets all the components in a read-only collection.
Gets the main media component of the stream to which time is synchronized.
By order of priority, first Audio, then Video
Gets the video component.
Returns null when there is no such stream component.
Gets the audio component.
Returns null when there is no such stream component.
Gets the subtitles component.
Returns null when there is no such stream component.
Gets the current length in bytes of the packet buffer.
These packets are the ones that have not yet been decoded.
Gets the number of packets that have not been
fed to the decoders.
Gets the total bytes read by all components in the lifetime of this object.
Gets a value indicating whether this instance has a video component.
Gets a value indicating whether this instance has an audio component.
Gets a value indicating whether this instance has a subtitles component.
Gets or sets the with the specified media type.
Setting a new component on an existing media type component will throw.
Getting a non-existing media component for the given media type will return null.
Type of the media.
The media component
When the media type is invalid
MediaComponent
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Sends the specified packet to the correct component by reading the stream index
of the packet that is being sent. No packet is sent if the provided packet is set to null.
Returns the media type of the component that accepted the packet.
The packet.
The media type
Sends an empty packet to all media components.
When an EOF/EOS situation is encountered, this forces
the decoders to enter draining mode until all frames are decoded.
Clears the packet queues for all components.
Additionally it flushes the codec buffered packets.
This is useful after a seek operation is performed or a stream
index is changed.
Removes the component of specified media type (if registered).
It calls the dispose method of the media component too.
Type of the media.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A container capable of opening an input url,
reading packets from it, decoding frames, seeking, and pausing and resuming network streams
Code based on https://raw.githubusercontent.com/FFmpeg/FFmpeg/release/3.2/ffplay.c
The method pipeline should be:
1. Set Options (or don't, for automatic options) and Initialize,
2. Perform continuous packet reads,
3. Perform continuous frame decodes, and
4. Perform continuous block materialization
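The four-step pipeline above maps onto a simple loop: read packets until the components have enough buffered, decode those packets into frames, then convert each frame into a usable block in ascending StartTime order, releasing the unmanaged frame after conversion. A hedged sketch of that loop against a hypothetical container object (the method names here mirror the summaries but are illustrative, not the actual API):

```python
def run_pipeline(container):
    """One round of the read -> decode -> convert pipeline."""
    # 2. Continuous packet reads: fill the internal packet buffers.
    while container.should_read_more():
        container.read()

    # 3. Continuous frame decodes: drain the packet buffers into frames.
    frames = []
    while container.packet_buffer_count > 0:
        frames.extend(container.decode())

    # 4. Block materialization: convert each decoded frame into a block
    #    in StartTime order, releasing the unmanaged frame afterwards.
    return [container.convert(f, None, release_input=True)
            for f in sorted(frames, key=lambda f: f.start_time)]
```

In the real component, conversion also supports reusing a previously allocated output block instead of passing None, which avoids repeated buffer allocations.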
The exception message no input context
The read synchronize root
The decode synchronize root
The convert synchronize root
Determines if the stream seeks by bytes always
Holds the value for the internal property with the same name.
Picture attachments are required when video streams support them
and these attached packets must be read before reading the first frame
of the stream and after seeking.
The stream read interrupt callback.
Used to detect read timeouts.
The stream read interrupt start time.
When a read operation is started, this is set to the ticks of UTC now.
The signal to request the abortion of the following read operation
If set to true, it will reset the abort requested flag to false.
Initializes a new instance of the class.
The media URL.
The stream options.
The logger.
mediaUrl
To detect redundant Dispose calls
Logging Messages will be sent to this parent object.
Gets the media URL. This is the input url, file or device that is read
by this container.
The stream initialization options.
Options are applied when creating the container.
After initialization, changing the options has no effect.
Represents options that are applied before initializing media components and their corresponding
codecs. Once the container has created the media components, changing these options will have no effect.
Provides stream, chapter and program info held by this container.
This property is null if the stream has not been opened.
Gets the name of the media format.
Gets the media bitrate (bits per second). Returns 0 if not available.
Holds the metadata of the media file when the stream is initialized.
Gets a value indicating whether an Input Context has been initialized.
Gets a value indicating whether this instance is open.
Gets the duration of the media.
If this information is not available (i.e. realtime media) it will
be set to TimeSpan.MinValue
Will be set to true whenever an End Of File situation is reached.
true if this instance is at end of stream; otherwise, false.
Gets the byte position at which the stream is being read.
Please note that this property gets updated after every Read.
Gets a value indicating whether the underlying media is seekable.
Gets a value indicating whether this container represents live media.
If the stream is classified as a network stream and it is not seekable, then this property will return true.
Gets a value indicating whether the input stream is a network stream.
If the format name is rtp, rtsp, or sdp or if the url starts with udp:, http:, https:, tcp:, or rtp:
then this property will be set to true.
Provides direct access to the individual Media components of the input stream.
Gets a value indicating whether reads are in the aborted state.
Gets the media start time by which all component streams are offset.
Typically 0 but it could be something other than 0.
Holds a reference to the input context.
Gets the seek start timestamp.
Gets the time the last packet was read from the input
Gets a value indicating whether a packet read delay will be enforced.
RTSP formats or MMSH URLs will have this property set to true.
Reading packets will block for at most 10 milliseconds depending on the last read time.
This is a hack according to the source code in ffplay.c
Picture attachments are required when video streams support them
and these attached packets must be read before reading the first frame
of the stream and after seeking. This property is not part of the public API
and is meant more for internal purposes
Opens the individual stream components on the existing input context in order to start reading packets.
Any Media Options must be set before this method is called.
Seeks to the specified position in the stream. This method attempts to do so as
precisely as possible, returning decoded frames of all available media type components
just before or right on the requested position. The position must be given in 0-based time,
so it converts component stream start time offset to absolute, 0-based time.
Pass TimeSpan.Zero to seek to the beginning of the stream.
The position.
The list of media frames
No input context initialized
Reads the next available packet, sending the packet to the corresponding
internal media component. It also sets IsAtEndOfStream property.
Returns the media type if the packet was accepted by any of the media components.
Returns None if the packet was not accepted by any of the media components
or if reading failed (i.e. End of stream already or read error).
Packets are queued internally. To dequeue them you need to call the receive frames
method of each component until the packet buffer count becomes 0.
The media type of the packet that was read
No input context initialized
When a read error occurs
Decodes the next available packet in the packet queue for each of the components.
Returns the list of decoded frames. You can call this method until the Components.PacketBufferCount
becomes 0; The list of 0 or more decoded frames is returned in ascending StartTime order.
A Packet may contain 0 or more frames. Once the frame source objects are returned, you
are responsible for calling the Dispose method on them to free the underlying FFmpeg frame.
Note that even after releasing them you can still use the managed properties.
If you intend to convert the frames to usable media frames (with Convert), do not
release them yourself. Specify the release input argument as true and the frame will be
freed from memory automatically after conversion.
The list of media frames
Performs audio, video and subtitle conversions on the decoded input frame so the data
can be used as a Frame. Please note that the output is passed by reference.
This works as follows: if the output reference is null it will be automatically instantiated
and returned by this function. This enables you to either instantiate or reuse a previously allocated Frame.
This is important because buffer allocations are expensive operations, and this allows you
to perform the allocation once and keep reusing the same buffer.
The raw frame source. Has to be compatible with the target (e.g. use VideoFrameSource to convert to VideoFrame).
The target frame. Has to be compatible with the source.
The siblings that may help guess additional output parameters.
if set to true releases the raw frame source from unmanaged memory.
The media block
No input context initialized
MediaType
input
input
or
input
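The allocate-or-reuse output contract described above can be sketched as follows. This is a minimal illustration, not the library's actual implementation; the type names MediaBlock, VideoBlock and the method Materialize are hypothetical stand-ins.

```csharp
using System;

// Hypothetical stand-ins for the library's block types.
public abstract class MediaBlock { public TimeSpan StartTime; }
public class VideoBlock : MediaBlock { }

public static class ConvertSketch
{
    // Mirrors the contract: if the caller passes a null output reference,
    // a new block is allocated and returned; otherwise the existing buffer
    // is reused, so the expensive allocation happens only once.
    public static MediaBlock Materialize(ref MediaBlock output)
    {
        if (output == null)
            output = new VideoBlock(); // allocate once

        // ... fill output with converted frame data here ...
        return output;
    }
}
```

The first call allocates the block; every subsequent call that passes the same reference reuses it.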
Signals the packet reading operations to abort immediately.
if set to true, the read interrupt will reset the aborted state automatically
Signals read operations to exit the aborted state.
Closes the input context immediately releasing all resources.
This method is equivalent to calling the dispose method.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Initializes the input context to start read operations.
This does NOT create the stream components and therefore, there needs to be a call
to the Open method.
The input context has already been initialized.
When an error initializing the stream occurs.
Initializes the InputContext and applies format options.
https://www.ffmpeg.org/ffmpeg-formats.html#Format-Options
Opens the individual stream components to start reading packets.
Creates the stream components by first finding the best available streams.
It then initializes a component of the correct type for each stream.
The exception information
Reads the next packet in the underlying stream and enqueues in the corresponding media component.
Returns None if no packet was read.
The type of media packet that was read
Initialize
Raised when an error reading from the stream occurs.
The interrupt callback to handle stream reading timeouts
A pointer to the format input context
0 for OK, 1 for error (timeout)
Seeks to the exact or prior frame of the main stream.
Supports byte seeking.
The target time.
The list of media frames
Seeks to the position at the start of the stream.
Reads and decodes packets until the required media components have frames on or right before the target time.
The list of frames that is currently being processed. Frames will be added here.
The target time in absolute 0-based time.
The requirement.
The number of decoded frames
Drops the seek frames that are no longer needed.
Target time should be provided in absolute, 0-based time
The frames.
The target time.
The number of dropped frames
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a wrapper for an unmanaged frame.
Derived classes implement the specifics of each media type.
Initializes a new instance of the class.
The pointer.
The component.
Gets the type of the media.
The type of the media.
Gets the start time of the frame.
Gets the end time of the frame
Gets the index of the stream from which this frame was decoded.
Gets the amount of time this data has to be presented
Gets or sets a value indicating whether this frame obtained its start time
from a valid frame PTS value
When the unmanaged frame is released (freed from unmanaged memory)
this property will return true.
Gets the time base of the stream that generated this frame.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. Less than zero: this instance precedes the other object in the sort order. Zero: this instance occurs in the same position as the other object. Greater than zero: this instance follows the other object.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
A data structure containing a queue of packets to process.
This class is thread safe and disposable.
Enqueued, unmanaged packets are disposed automatically by this queue.
Dequeued packets are the responsibility of the calling code.
Gets the packet count.
Gets the sum of all the packet sizes contained
by this queue.
Gets the total duration in stream TimeBase units.
Gets or sets the packet at the specified index.
The packet.
The index.
The packet reference
Peeks the next available packet in the queue without removing it.
If no packets are available, null is returned.
The packet
Pushes the specified packet into the queue.
In other words, enqueues the packet.
The packet.
Dequeues a packet from this queue.
The dequeued packet
Clears and frees all the unmanaged packets from this queue.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Enumerates the seek target requirement levels.
Seek requirement is satisfied when
the main component has frames in the seek range.
This is the fastest option.
Seek requirement is satisfied when
both the audio and video components have frames in the seek range.
This is the recommended option.
Seek requirement is satisfied when
ALL components have frames in the seek range
This is NOT recommended, as it forces large numbers of
frames to be decoded in subtitle files.
A managed representation of an FFmpeg stream specifier
Initializes a new instance of the class.
Initializes a new instance of the class.
The stream identifier.
streamId
Initializes a new instance of the class.
Type of the media.
streamType
Initializes a new instance of the class.
Type of the media.
The stream identifier.
streamType
or
streamId
Provides suffixes for the different media types.
Gets the stream identifier.
Gets the stream suffix.
Returns a string that represents this stream specifier.
A string that represents this instance.
Performs subtitle stream extraction, decoding and text conversion.
Initializes a new instance of the class.
The container.
Index of the stream.
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Return the updated output frame
input cannot be null
Strips the SRT format and returns plain text.
The input.
The formatted string
Strips a line of text from the ASS format.
The input.
The formatted string
Creates a frame source object given the raw FFmpeg subtitle reference.
The raw FFmpeg subtitle pointer.
The managed frame
Represents a wrapper for an unmanaged Subtitle frame.
TODO: Only text (ASS and SRT) subtitles are supported currently.
There is no support for bitmap subtitles.
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets lines of text that the subtitle frame contains.
Gets the type of the text.
The type of the text.
Gets the pointer to the unmanaged subtitle struct
Releases unmanaged and - optionally - managed resources.
Allocates an AVSubtitle struct in unmanaged memory.
The subtitle struct pointer
Deallocates the subtitle struct that was created in unmanaged memory.
The frame.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Performs video picture decoding, scaling and extraction logic.
Holds a reference to the video scaler
Initializes a new instance of the class.
The container.
Index of the stream.
Gets the video scaler flags used to perform colorspace conversion (if needed).
Gets the base frame rate as reported by the stream component.
All discrete timestamps can be represented in this framerate.
Gets the current frame rate as guessed by the last processed frame.
Variable framerate might report different values at different times.
Gets the width of the picture frame.
Gets the height of the picture frame.
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The siblings to help guess additional frame parameters.
Return the updated output frame
input
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
Creates a managed frame from an unmanaged one.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Gets the pixel format replacing deprecated pixel formats.
AV_PIX_FMT_YUVJ
The frame.
A normalized pixel format
Computes the frame filter arguments that are appropriate for the video filtering chain.
The frame.
The base filter arguments
If necessary, disposes the existing filtergraph and creates a new one based on the frame arguments.
The frame.
avfilter_graph_create_filter
or
avfilter_graph_create_filter
or
avfilter_link
or
avfilter_graph_parse
or
avfilter_graph_config
Destroys the filtergraph releasing unmanaged resources.
Represents a wrapper for an unmanaged ffmpeg video frame.
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets the closed caption data collected from the frame in CEA-708/EIA-608 format.
Gets the display picture number (frame number).
If not set by the decoder, this attempts to obtain it by dividing the start time by the
frame duration
Gets the coded picture number set by the decoder.
Gets the SMPTE time code.
Gets the pointer to the unmanaged frame.
Releases unmanaged and - optionally - managed resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a Media Engine that contains underlying streams of audio and/or video.
It uses the fantastic FFmpeg library to perform reading and decoding of media streams.
Raises the MessageLogged event
The instance containing the message.
Raises the media failed event.
The ex.
Raises the media closed event.
Raises the media opened event.
Raises the media initializing event.
The options.
The URL.
Raises the media opening event.
Raises the buffering started event.
Raises the buffering ended event.
Raises the Seeking started event.
Raises the Seeking ended event.
Raises the media ended event.
Sends the on position changed.
The old value.
The new value.
Sends the on media state changed.
The old value.
The new value.
The open or close command done signalling object.
Open and close are synchronous commands.
The command queue to be executed in the order they were sent.
Represents a real-time time measuring device.
Rendering media should occur as requested by the clock.
The underlying media container that provides access to
individual media component streams
Opens the specified URI.
The URI.
The awaitable task
Source
Closes the currently loaded media.
The awaitable task
Begins or resumes playback of the currently loaded media.
The awaitable command
Pauses playback of the currently loaded media.
The awaitable command
Pauses and rewinds the currently loaded media.
The awaitable command
Seeks to the specified position.
New position for the player.
Sets the specified playback speed ratio.
New playback speed ratio.
Begins a synchronous command by locking the internal wait handle.
True if successful, false if unsuccessful
Ends a synchronous command by releasing the internal wait handle.
To detect redundant calls
Initializes a new instance of the class.
The associated parent object.
The parent implementing connector methods.
Thrown when the static Initialize method has not been called.
Contains the Media Status
Gets the internal real time clock position.
This is different from the position property and it is useful
in computing things like real-time latency in a render cycle.
Provides stream, chapter and program info of the underlying media.
Returns null when no media is loaded.
Gets a value indicating whether this instance is disposed.
Gets the associated parent object.
Gets the event connector (platform specific).
Logs the specified message into the logger queue.
Type of the message.
The message.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
Please note that this call is non-blocking/asynchronous.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
The initialize lock
The has initialized flag
The ffmpeg directory
Stores the load mode flags
Gets the platform-specific implementation requirements.
Gets or sets the FFmpeg path from which to load the FFmpeg binaries.
You must set this path before setting the Source property for the first time on any instance of this control.
Setting this property when FFmpeg binaries have been registered will have no effect.
Gets or sets the bitwise library identifiers to load.
If FFmpeg is already loaded, the value cannot be changed.
Initializes the MediaElementCore.
The platform-specific implementation.
This partial class implements:
1. Packet reading from the Container
2. Frame Decoding from packet buffer and Block buffering
3. Block Rendering from block buffer
Holds the materialized block cache for each media type.
Gets the packet reading cycle control event.
Gets the frame decoding cycle control event.
Gets the block rendering cycle control event.
Gets the seeking done control event.
Gets or sets a value indicating whether the workers have been requested
an exit.
Gets or sets a value indicating whether the decoder has moved its byte position
to something other than the normal continuous reads in the last read cycle.
Holds the block renderers
Holds the last rendered StartTime for each of the media block types
Gets a value indicating whether more packets can be read from the stream.
This does not check if the packet queue is full.
Gets a value indicating whether room is available in the download cache.
Gets a value indicating whether more frames can be decoded from the packet queue.
That is, if we have packets in the packet buffer or if we are not at the end of the stream.
Initializes the media block buffers and
starts packet reader, frame decoder, and block rendering workers.
Stops the packet reader, frame decoder, and block renderers
Returns the value of a discrete video position if possible
The position.
The snapped position
Gets a value indicating whether more frames can be converted into blocks of the given type.
The t.
true if this instance [can read more frames of] the specified t; otherwise, false.
Sends the given block to its corresponding media renderer.
The block.
The clock position.
The number of blocks sent to the renderer
Adds the blocks of the given media type.
The t.
The number of blocks that were added
Continually decodes the available packet buffer to have as
many frames as possible in each frame queue and
up to the MaxFrames on each component
Runs the read task which keeps a packet buffer as full as possible.
It reports on DownloadProgress by enqueueing an update to the property
in order to avoid any kind of disruption to this thread caused by the UI thread.
Starts the block rendering worker.
Stops the block rendering worker.
Fast, atomic boolean combining interlocked to write value and volatile to read values
Idea taken from Memory model and .NET operations in article:
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Initializes a new instance of the class.
if set to true [initial value].
Gets the latest value written by any of the processors in the machine
Setting
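The interlocked-write/volatile-read pattern the atomic types above describe can be sketched like this. A minimal illustration only; the class and member names are hypothetical, not the library's actual types.

```csharp
using System.Threading;

// Sketch of the pattern: Interlocked publishes writes atomically,
// Volatile.Read observes the latest value written by any processor.
public sealed class AtomicBooleanSketch
{
    private int backingValue; // 0 = false, 1 = true

    public AtomicBooleanSketch(bool initialValue = false)
        => backingValue = initialValue ? 1 : 0;

    public bool Value
    {
        get => Volatile.Read(ref backingValue) != 0;
        set => Interlocked.Exchange(ref backingValue, value ? 1 : 0);
    }
}
```

The backing field is an int because Interlocked does not operate on bool.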
Fast, atomic double combining interlocked to write value and volatile to read values
Idea taken from Memory model and .NET operations in article:
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Initializes a new instance of the class.
The initial value.
Gets or sets the latest value written by any of the processors in the machine
Fast, atomic long combining interlocked to write value and volatile to read values
Idea taken from Memory model and .NET operations in article:
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Gets or sets the latest value written by any of the processors in the machine
A simple benchmarking class.
Starts measuring with the given identifier.
The identifier.
A disposable object that when disposed, adds a benchmark result.
Outputs the benchmark statistics.
A string containing human-readable statistics
Adds the specified result to the given identifier.
The identifier.
The elapsed.
Represents a disposable benchmark unit.
Initializes a new instance of the class.
The identifier.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A fixed-size buffer that acts as an infinite length one.
This buffer is backed by unmanaged, very fast memory, so ensure you call
the dispose method when you are done using it.
The locking object to perform synchronization.
To detect redundant calls
The unmanaged buffer
Initializes a new instance of the class.
Length of the buffer.
Finalizes an instance of the class.
Gets the capacity of this buffer.
Gets the current, 0-based read index
Gets the maximum rewindable amount of bytes.
Gets the current, 0-based write index.
Gets the object associated with the last write.
Gets the available bytes to read.
Gets the number of bytes that can be written.
Gets the percentage of used bytes (readable/available, from 0.0 to 1.0).
Skips the specified number of bytes, advancing the read position.
The requested bytes.
When the requested bytes are greater than the readable count
Rewinds the read position by the specified number of bytes.
The requested bytes.
When the requested bytes are greater than the rewindable count
Reads the specified number of bytes into the target array.
The requested bytes.
The target.
The target offset.
When the requested bytes are greater than the readable count
Writes data to the backing buffer using the specified pointer and length.
and associating a write tag for this operation.
The source.
The length.
The write tag.
if set to true, overwrites the data even if it has not been read.
When unread data would be overwritten (read needs to be called more often)
Resets all states as if this buffer had just been created.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
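The read/write index arithmetic behind such a circular buffer can be sketched as follows. This simplified version is backed by a managed array rather than unmanaged memory, and all names are illustrative; it only demonstrates the wrap-around indexing and the readable-count bookkeeping described above.

```csharp
using System;

// Simplified managed sketch of a circular (ring) buffer.
public sealed class RingBufferSketch
{
    private readonly byte[] buffer;
    private int readIndex, writeIndex;

    public int ReadableCount { get; private set; }

    public RingBufferSketch(int length) => buffer = new byte[length];

    public void Write(byte[] source)
    {
        if (source.Length > buffer.Length - ReadableCount)
            throw new InvalidOperationException("Unread data would be overwritten.");

        foreach (var b in source)
        {
            buffer[writeIndex] = b;
            writeIndex = (writeIndex + 1) % buffer.Length; // wrap around
        }

        ReadableCount += source.Length;
    }

    public byte[] Read(int requestedBytes)
    {
        if (requestedBytes > ReadableCount)
            throw new InvalidOperationException("Not enough readable bytes.");

        var target = new byte[requestedBytes];
        for (var i = 0; i < requestedBytes; i++)
        {
            target[i] = buffer[readIndex];
            readIndex = (readIndex + 1) % buffer.Length; // wrap around
        }

        ReadableCount -= requestedBytes;
        return target;
    }
}
```

Because both indices wrap modulo the capacity, the fixed-size buffer behaves as if it had infinite length as long as data is read before it is overwritten.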
Defines a generic interface for synchronized locking mechanisms
Acquires a writer lock.
The lock is released when the returned locking object is disposed.
A disposable locking object.
Acquires a reader lock.
The lock is released when the returned locking object is disposed.
A disposable locking object.
Represents a set of preallocated media blocks of the same media type.
A block buffer contains playback and pool blocks. Pool blocks are blocks that
can be reused. Playback blocks are blocks that have been filled.
This class is thread safe.
The blocks that are available to be filled.
The blocks that are available for rendering.
Controls multiple reads and exclusive writes
Initializes a new instance of the class.
The capacity.
Type of the media.
Gets the media type of the block buffer.
Gets the start time of the first block.
Gets the end time of the last block.
Gets the range of time between the first block and the end time of the last block.
Gets the average duration of the currently available playback blocks.
Gets a value indicating whether all the durations of the blocks are equal
Gets the number of available playback blocks.
Gets the maximum count of this buffer.
Gets the usage percent from 0.0 to 1.0
Gets a value indicating whether the playback blocks are all allocated.
Holds the duration of all the blocks that have been added in the lifetime of this object.
Gets the media block at the specified index.
The media block.
The index.
The media block
Gets the media block at the specified timestamp.
The media block.
At time.
The media block
Gets the percentage of the range for the given time position.
The position.
The percent of the range
Retrieves the block following the provided current block
The current block.
The next media block
Determines whether the given render time is within the range of playback blocks.
The render time.
true if [is in range] [the specified render time]; otherwise, false.
Retrieves the index of the playback block corresponding to the specified
render time. This uses very fast binary and linear search combinations.
If there are no playback blocks it returns -1.
If the render time is greater than the range end time, it returns the last playback block index.
If the render time is less than the range start time, it returns the first playback block index.
The render time.
The media block's index
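The lookup contract described above (clamp out-of-range times to the first or last block, otherwise search the sorted start times) can be sketched as a binary search. A simplified illustration over an array of start times; the real buffer searches its block objects, and the names here are hypothetical.

```csharp
using System;

public static class BlockIndexSketch
{
    // Returns the index of the last block starting at or before renderTime,
    // clamped to the first/last index when out of range, or -1 when empty.
    public static int IndexOf(TimeSpan[] startTimes, TimeSpan renderTime)
    {
        if (startTimes.Length == 0) return -1;
        if (renderTime <= startTimes[0]) return 0;
        if (renderTime >= startTimes[startTimes.Length - 1]) return startTimes.Length - 1;

        var lo = 0;
        var hi = startTimes.Length - 1;
        while (hi - lo > 1)
        {
            var mid = (lo + hi) / 2;
            if (startTimes[mid] <= renderTime) lo = mid;
            else hi = mid;
        }

        return lo; // last block starting at or before the render time
    }
}
```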
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Adds a block to the playback blocks by converting the given frame.
If there are no more blocks in the pool, the oldest block is returned to the pool
and reused for the new block. The source frame is automatically disposed.
The source.
The container.
The filled block.
Clears all the playback blocks returning them to the
block pool.
Returns a formatted string with information about this buffer
The formatted string
Block factory method.
Type of the media.
MediaBlock does not have a valid type
An instance of the block of the specified type
Represents a very simple dictionary for MediaType keys
The type of the value.
Initializes a new instance of the class.
Gets or sets the item with the specified key.
Returns the default value of the value type when the key does not exist.
The key.
The item
Provides factory methods to create synchronized reader-writer locks
that support a generalized locking and releasing api and syntax.
Enumerates the locking operations
Creates a reader-writer lock backed by a standard ReaderWriterLock
The synchronized locker
Creates a reader-writer lock backed by a ReaderWriterLockSlim
The synchronized locker
A scaled, preallocated audio frame container.
The buffer is in 16-bit signed, interleaved sample data
Finalizes an instance of the class.
Gets a pointer to the first byte of the data buffer.
The format signed 16-bits per sample, channel interleaved
Gets the length of the buffer in bytes.
Gets the sample rate.
Gets the channel count.
Gets the available samples per channel.
Gets the media type of the data
The picture buffer length of the last allocated buffer
Holds a reference to the last allocated buffer
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Defaults and constants of the Media Engine
Gets the assembly location.
Defines Controller Value Defaults
The default speed ratio
The default balance
The default volume
The minimum speed ratio
The maximum speed ratio
The minimum balance
The maximum balance
The maximum volume
The minimum volume
Defines decoder output constants for audio streams
The audio buffer padding
The audio bits per sample (1 channel only)
The audio bytes per sample
The audio sample format
The audio channel count
The audio sample rate (per channel)
Defines decoder output constants for video streams
The video bits per component
The video bits per pixel
The video bytes per pixel
The video pixel format. BGRX, 32bit
Defines timespans of different priority intervals
The timer high priority interval for stuff like rendering
The timer medium priority interval for stuff like property updates
The timer low priority interval for stuff like logging
Provides various helpers and extension methods.
Puts a short value in the target buffer as bytes
The target.
The offset.
The value.
Gets the signed 16-bit integer at the given offset.
The buffer.
The offset.
The signed integer.
Gets the audio sample amplitude (absolute value of the sample).
The buffer.
The offset.
The sample amplitude
Gets the audio sample level for 0 to 1.
The buffer.
The offset.
The amplitude level
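The sample helpers above can be sketched as follows, assuming 16-bit signed, little-endian interleaved samples (the buffer layout stated for the audio frames). Method names are illustrative.

```csharp
using System;

public static class SampleSketch
{
    // Reads a signed 16-bit sample from the buffer (little-endian assumed).
    public static short GetInt16(byte[] buffer, int offset)
        => (short)(buffer[offset] | (buffer[offset + 1] << 8));

    // Normalizes the absolute sample amplitude to the 0.0 to 1.0 range.
    public static double GetLevel(byte[] buffer, int offset)
        => Math.Abs((int)GetInt16(buffer, offset)) / (double)short.MaxValue;
}
```

The amplitude is cast to int before Math.Abs so that short.MinValue (-32768) does not overflow.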
Returns a formatted timestamp string in Seconds
The ts.
The formatted string
Returns a formatted string with elapsed milliseconds between now and
the specified date.
The dt.
The formatted string
Returns a formatted string, dividing by the specified
factor. Useful for debugging longs with byte positions or sizes.
The ts.
The divide by.
The formatted string
Converts the given value to a value that is of the given multiple.
The value.
The multiple.
The value
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
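The conversion these overloads describe follows from the definition of a stream time base: a PTS expressed in time base units maps to seconds as pts * (numerator / denominator). A minimal sketch, with the rational passed as two ints instead of FFmpeg's AVRational:

```csharp
using System;

public static class TimestampSketch
{
    // seconds = pts * (num / den); e.g. with the common 1/90000 video
    // time base, a PTS of 90000 corresponds to exactly one second.
    public static TimeSpan ToTimeSpan(long pts, int timeBaseNum, int timeBaseDen)
    {
        var seconds = (double)pts * timeBaseNum / timeBaseDen;
        return TimeSpan.FromTicks((long)(seconds * TimeSpan.TicksPerSecond));
    }
}
```

Converting via ticks rather than TimeSpan.FromSeconds preserves sub-millisecond precision.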
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
Gets a timespan given a timestamp and a timebase.
The PTS in seconds.
The time base.
The TimeSpan
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
Gets a timespan given a timestamp (in AV_TIME_BASE units)
The PTS.
The TimeSpan
Gets a timespan given a timestamp (in AV_TIME_BASE units)
The PTS.
The TimeSpan
Converts a fraction to a double
The rational.
The value
Normalizes precision of the TimeSpan to the nearest whole millisecond.
The source.
The normalized, whole-millisecond timespan
Clamps the specified value between the minimum and the maximum
The type of value to clamp
The value.
The minimum.
The maximum.
The clamped value
Determines whether the event is in its set state.
The event.
true if the specified m is set; otherwise, false.
Gets the fundamental (audio or video only) auxiliary media types.
All.
The main.
The non-main audio or video media types
Excludes the type of the media.
All.
The main.
An array without the media type
Joins the media types.
The main.
The with.
An array of the media types
Determines whether the array contains the media type
All.
The t.
True if it exists in the array
Deep-copies the array
All.
The copy of the array
Verifies all fundamental (audio and video) components are greater than zero
All.
The value.
True if all components are greater than the value
Gets the sum of all the values in the keyed dictionary.
All.
The sum of all values.
Gets the block count.
The blocks.
The block count for all components.
Gets the minimum start time.
The blocks.
The minimum Range Start Time
Computes the picture number.
The start time.
The duration.
The start number.
The serial picture number
Computes the smtpe time code.
The start time offset.
The duration.
The time base.
The display picture number.
The FFmpeg computed SMTPE Timecode
The load mode of FFmpeg Libraries
The full features. Tries to load everything
Loads everything except for AVDevice and AVFilter
Loads the minimum set for Audio-only programs
Loads the minimum set for Video-only programs
Connects handlers between the Media Engine event signals and a platform-specific implementation
Called when [media initializing].
The sender.
The stream options.
The media URL.
Called when [media opening].
The sender.
The media options.
The media information.
Called when [media opened].
The sender.
Called when [media closed].
The sender.
Called when [media failed].
The sender.
The e.
Called when [media ended].
The sender.
Called when [buffering started].
The sender.
Called when [buffering ended].
The sender.
Called when [seeking started].
The sender.
Called when [seeking ended].
The sender.
Called when [message logged].
The sender.
The instance containing the event data.
Called when [position changed].
The sender.
The old value.
The new value.
Called when [media state changed].
The sender.
The old value.
The new value.
A very simple and standard interface for message logging
Logs the specified message of the given type.
Type of the message.
The message.
Provides a unified API for media rendering classes
Gets the parent media engine.
Waits for the renderer to be ready to render.
This is called only once before all Render calls are made
Executed when the Play method is called on the parent MediaElement
Executed when the Pause method is called on the parent MediaElement
Executed when the Stop method is called on the parent MediaElement
Executed when the Close method is called on the parent MediaElement
Executed after a Seek operation is performed on the parent MediaElement
Called when a media block is due rendering.
This needs to return immediately so the calling thread is not disturbed.
The media block.
The clock position.
Called on every block rendering clock cycle just in case some update operation needs to be performed.
This needs to return immediately so the calling thread is not disturbed.
The clock position.
Defines platform-specific methods
Sets the DLL directory in which external dependencies can be located.
The path.
True for success. False for failure
Fast pointer memory block copy function
The target address.
The source address.
Length of the copy.
Fills the memory with the specified value.
The start address.
The length.
The value.
Contains factory methods and properties containing platform-specific implementations
of the functionality that is required by an instance of the Media Engine
Retrieves the platform-specific Native methods
Gets a value indicating whether this instance is in debug mode.
Gets a value indicating whether this instance is in design time.
Creates a renderer of the specified media type.
Type of the media.
The media engine.
The renderer
Handles global FFmpeg library messages
The message.
A base class for blocks of the different MediaTypes.
Blocks are the result of decoding and scaling a frame.
Blocks have preallocated buffers which makes them memory and CPU efficient.
Reuse blocks as much as possible. Once you create a block from a frame,
you don't need the frame anymore so make sure you dispose the frame.
Gets the media type of the data
Gets a value indicating whether the start time was guessed from siblings
or the source frame PTS comes from a NO PTS value
Gets the time at which this data should be presented (PTS)
Gets the amount of time this data has to be presented
Gets the end time.
Gets the index of the stream.
Gets a safe timestamp at which the block can be displayed.
Returns StartTime if the duration is Zero or negative.
Determines whether this media block holds the specified position.
Returns false if it does not have a valid duration.
The position.
true if [contains] [the specified position]; otherwise, false.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. The return value has these meanings: Less than zero: this instance precedes the other object in the sort order. Zero: this instance occurs in the same position in the sort order as the other object. Greater than zero: this instance follows the other object in the sort order.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Represents a set of codec options associated with a stream specifier.
Holds the internal list of option items
Initializes a new instance of the class.
Adds an option
The key.
The value.
Type of the stream.
Adds an option
The key.
The value.
Index of the stream.
Adds an option
The key.
The value.
Type of the stream.
Index of the stream.
Retrieves a dictionary with the options for the specified codec.
Port of filter_codec_opts
The codec identifier.
The format.
The stream.
The codec.
The filtered options
Retrieves an array of dictionaries, one for each stream index
https://ffmpeg.org/ffplay.html#toc-Options
Port of setup_find_stream_info_opts.
The format.
The options per stream
Converts a character to a media type.
The c.
The media type
Well-known codec option names
The threads
The reference counted frames
The low resolution (lowres)
A Media Container Exception
Initializes a new instance of the class.
The message that describes the error.
Contains all the status properties of the stream being handled by the media engine.
Gets the guessed buffered bytes in the packet queue per second.
If bitrate information is available, then it returns the bitrate converted to byte rate.
Returns null if it has not been guessed.
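The bitrate-to-byte-rate guess above amounts to a division by 8 bits per byte; a hedged sketch of that logic (hypothetical names, not the library's C# code):

```python
def guess_byte_rate(bitrate_bps):
    """Convert a stream bitrate (bits/s) to a byte rate (bytes/s).

    Returns None when no bitrate information is available, matching
    the documented "returns null if it has not been guessed" behavior.
    """
    if bitrate_bps is None or bitrate_bps <= 0:
        return None
    return bitrate_bps // 8

# A 1 Mbit/s stream buffers roughly 125,000 bytes per second.
```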
Initializes static members of the class.
Initializes a new instance of the class.
The parent.
Gets or Sets the SpeedRatio property of the media.
Gets or Sets the Position property on the MediaElement.
Gets or Sets the Source on this MediaElement.
The Source property is the Uri of the media to be played.
Gets/Sets the Volume property on the MediaElement.
Note: Valid values are from 0 to 1
Gets/Sets the Balance property on the MediaElement.
Gets/Sets the IsMuted property on the MediaElement.
Provides key-value pairs of the metadata contained in the media.
Returns null when media has not been loaded.
Gets the media format. Returns null when media has not been loaded.
Gets the duration of a single frame step.
If there is a video component with a framerate, this property returns the length of a frame.
If there is no video component it simply returns a tenth of a second.
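The frame-step rule just described can be sketched as (an illustrative approximation of the documented behavior, not the library's C# code):

```python
def frame_step_seconds(video_frame_rate=None):
    """Duration of a single frame step in seconds.

    With a video component that reports a frame rate, this is the
    length of one frame; otherwise it falls back to a tenth of a second.
    """
    if video_frame_rate and video_frame_rate > 0:
        return 1.0 / video_frame_rate
    return 0.1

# At 25 fps a frame step is 0.04 s; with no video it is 0.1 s.
```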
Returns whether the given media has audio.
Only valid after the MediaOpened event has fired.
Returns whether the given media has video. Only valid after the
MediaOpened event has fired.
Gets the video codec.
Only valid after the MediaOpened event has fired.
Gets the video bitrate.
Only valid after the MediaOpened event has fired.
Returns the natural width of the media in the video.
Only valid after the MediaOpened event has fired.
Returns the natural height of the media in the video.
Only valid after the MediaOpened event has fired.
Gets the video frame rate.
Only valid after the MediaOpened event has fired.
Gets the duration in seconds of the video frame.
Only valid after the MediaOpened event has fired.
Gets the audio codec.
Only valid after the MediaOpened event has fired.
Gets the audio bitrate.
Only valid after the MediaOpened event has fired.
Gets the audio channels count.
Only valid after the MediaOpened event has fired.
Gets the audio sample rate.
Only valid after the MediaOpened event has fired.
Gets the audio bits per sample.
Only valid after the MediaOpened event has fired.
Gets the Media's natural duration
Only valid after the MediaOpened event has fired.
Returns whether the currently loaded media can be paused.
This is only valid after the MediaOpened event has fired.
Note that this property is computed based on whether the stream is detected to be a live stream.
Returns whether the currently loaded media is live or real-time and does not have a set duration
This is only valid after the MediaOpened event has fired.
Returns whether the currently loaded media is a network stream.
This is only valid after the MediaOpened event has fired.
Gets a value indicating whether the currently loaded media can be seeked.
Gets a value indicating whether the media is playing.
Gets a value indicating whether the media is paused.
Gets a value indicating whether this media element
currently has an open media url.
Gets the current playback state.
Gets a value indicating whether the media has reached its end.
Get a value indicating whether the media is buffering.
Gets a value indicating whether the media seeking is in progress.
Returns the current video SMPTE timecode if available.
If not available, this property returns an empty string.
Gets the name of the video hardware decoder in use.
Enabling hardware acceleration does not guarantee decoding will be performed in hardware.
When hardware decoding of frames is in use this will return the name of the HW accelerator.
Otherwise it will return an empty string.
Gets a value that indicates the percentage of buffering progress made.
Range is from 0 to 1
The packet buffer length.
It is adjusted to 1 second if bitrate information is available.
Otherwise, it's simply 512KB and it is guessed later on.
Gets a value that indicates the percentage of download progress made.
Range is from 0 to 1
Gets the maximum packet buffer length, according to the bitrate (if available).
If it's a realtime stream it will return 30 times the buffer cache length.
Otherwise, it will return 4 times the buffer cache length.
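The buffering heuristics described above (1 second of data when a byte rate is known, a 512 KB default otherwise, and a 30x / 4x maximum) can be sketched as follows; this is an assumed illustration of the documented rules, not the library's actual implementation:

```python
def buffer_lengths(byte_rate=None, is_realtime=False):
    """Compute (packet buffer target, maximum buffer length) in bytes.

    The target is one second of data when a byte rate is available,
    otherwise a 512 KB default. The maximum is 30x the target for
    realtime streams and 4x otherwise, per the documented heuristics.
    """
    target = byte_rate if byte_rate and byte_rate > 0 else 512 * 1024
    maximum = target * (30 if is_realtime else 4)
    return target, maximum

# With no bitrate info: 512 KB target, 2 MB maximum.
```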
Gets a value indicating whether the media is in the process of opening.
Updates the position.
The position.
Updates the MediaState property.
State of the media.
The new position value for this state.
Resets the controller properties.
Resets all the buffering properties to their defaults.
Updates the buffering properties: IsBuffering, BufferingProgress, DownloadProgress.
Guesses the bitrate of the input stream.
Holds media information about the input, its chapters, programs and individual stream components
Initializes a new instance of the class.
The container.
Gets the input URL string used to access and create the media container
Gets the name of the container format.
Gets the metadata for the input. This may include stuff like title, creation date, company name, etc.
Individual stream components, chapters and programs may contain additional metadata.
Gets the duration of the input as reported by the container format.
Individual stream components may have different values
Gets the start timestamp of the input as reported by the container format.
Individual stream components may have different values
If available, returns a non-zero value as reported by the container format.
Gets a list of chapters
Gets a list of programs with their associated streams.
Gets the dictionary of stream information components by stream index.
Provides access to the best streams of each media type found in the container.
This uses some internal FFmpeg heuristics.
Extracts the stream infos from the input.
The ic.
The list of stream infos
Finds the best streams for audio, video, and subtitles.
The ic.
The streams.
The stream infos
Extracts the chapters from the input.
The ic.
The chapters
Extracts the programs from the input and creates associations between programs and streams.
The ic.
The streams.
The program information
Represents media stream information
Gets the stream identifier. This is different from the stream index.
Typically this value is not very useful.
Gets the index of the stream.
Gets the type of the codec.
Gets the name of the codec type. Audio, Video, Subtitle, Data, etc.
Gets the codec identifier.
Gets the name of the codec.
Gets the codec profile. Only valid for H.264 or
video codecs that use profiles. Otherwise empty.
Gets the codec tag. Not very useful except for fixing bugs with
some demuxer scenarios.
Gets a value indicating whether this stream has closed captions.
Typically this is set for video streams.
Gets a value indicating whether this stream contains lossless compressed data.
Gets the pixel format. Only valid for Video streams.
Gets the width of the video frames.
Gets the height of the video frames.
Gets the field order. This is useful to determine
if the video needs deinterlacing
Gets the video color range.
Gets the audio sample rate.
Gets the audio sample format.
Gets the stream time base unit in seconds.
Gets the sample aspect ratio.
Gets the display aspect ratio.
Gets the reported bit rate. 0 if unavailable.
Gets the maximum bit rate for variable bitrate streams. 0 if unavailable.
Gets the number of frames that were read to obtain the stream's information.
Gets the number of reference frames.
Gets the average FPS reported by the stream.
Gets the real (base) framerate of the stream
Gets the fundamental unit of time in 1/seconds used to represent timestamps in the stream, according to the stream data
Gets the fundamental unit of time in 1/seconds used to represent timestamps in the stream, according to the codec
Gets the disposition flags.
Please see ffmpeg.AV_DISPOSITION_* fields.
Gets the start time.
Gets the duration.
Gets the stream's metadata.
Gets the language string from the stream's metadata.
Represents a chapter within a container
Gets the chapter index.
Gets the chapter identifier.
Gets the start time of the chapter.
Gets the end time of the chapter.
Gets the chapter metadata.
Represents a program and its associated streams within a container.
Gets the program number.
Gets the program identifier.
Gets the program metadata.
Gets the associated program streams.
Gets the name of the program. Empty if unavailable.
Represents the contents of a logging message that was sent to the log manager.
Initializes a new instance of the class.
The media element.
Type of the message.
The message.
Gets the instance of the MediaElement that generated this message.
When null, it means FFmpeg generated this message.
Gets the timestamp.
Gets the type of the message.
Gets the contents of the message.
Defines the different log message types received by the log handler
The none message type
The information message type
The debug message type
The trace message type
The error message type
The warning message type
Represents options that are applied before initializing media components and their corresponding
codecs. Once the container has created the media components, changing these options will have no effect.
Gets the codec options.
Codec options are documented here: https://www.ffmpeg.org/ffmpeg-codecs.html#Codec-Options
Port of codec_opts
Gets or sets a value indicating whether [enable low resource].
In theory this should be 0, 1, 2, 3 for 1, 1/2, 1/4 and 1/8 resolutions.
TODO: We are for now just supporting 1/2 resolution (true value)
Port of lowres.
Gets or sets a value indicating whether [enable fast decoding].
Port of fast
Gets or sets a value indicating whether experimental hardware acceleration is enabled.
Defaults to false. This feature is experimental.
Prevent reading from audio stream components.
Port of audio_disable
Prevent reading from video stream components.
Port of video_disable
Prevent reading from subtitle stream components.
Port of subtitle_disable
Subtitles are not yet first-class citizens in FFmpeg and
this is why they are disabled by default.
Allows for a custom video filter string.
Please see: https://ffmpeg.org/ffmpeg-filters.html#Video-Filters
Initially contains the best suitable video stream.
Can be changed to a different stream reference.
Allows for a custom audio filter string.
Please see: https://ffmpeg.org/ffmpeg-filters.html#Audio-Filters
Initially contains the best suitable audio stream.
Can be changed to a different stream reference.
Initially contains the best suitable subtitle stream.
Can be changed to a different stream reference.
Enumerates the different Media Types compatible with AVMEDIATYPE_* constants
defined by FFmpeg
Represents a nonexistent media type (-1)
The video media type (0)
The audio media type (1)
The subtitle media type (3)
Media States compatible with MediaState enumeration
The manual status
The play status
The close status
The pause status
The stop status
Contains options for the format context as documented:
https://ffmpeg.org/ffmpeg-formats.html#Format-Options
TODO: There are still quite a bit of options that have not been implemented.
Initializes a new instance of the class.
Port of avioflags direct
Set probing size in bytes, i.e. the size of the data to analyze to get stream information.
A higher value will enable detecting more information in case it is dispersed into the stream,
but will increase latency. Must be an integer not less than 32. It is 5000000 by default.
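The probing-size constraints above (a 5,000,000 byte default and a 32 byte floor) can be sketched as a small validation helper; this is a hypothetical illustration of the documented rule, not code from the library:

```python
def clamp_probe_size(probe_size=None):
    """Normalize a probesize value per the documented constraints.

    Defaults to 5,000,000 bytes when unset and never goes below the
    documented minimum of 32 bytes.
    """
    if probe_size is None:
        return 5_000_000
    return max(32, int(probe_size))

# clamp_probe_size(16) is raised to the 32-byte minimum.
```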
Set packet size.
Ignore index.
Port of ffflags
Enable fast, but inaccurate seeks for some formats.
Port of ffflags
Generate PTS.
Port of genpts
Do not fill in missing values that can be exactly calculated.
Port of ffflags
Ignore DTS.
Port of ffflags
Discard corrupted frames.
Port of ffflags
Try to interleave output packets by DTS.
Port of ffflags
Do not merge side data.
Port of ffflags
Enable RTP MP4A-LATM payload.
Port of ffflags
Reduce the latency introduced by optional buffering
Port of ffflags
Stop muxing at the end of the shortest stream.
It may be needed to increase max_interleave_delta to avoid flushing the longer streams before EOF.
Port of ffflags
Allow seeking to non-keyframes on demuxer level when supported if set to 1. Default is 0.
Gets or sets the maximum duration to be analyzed before identifying stream information.
In realtime streams this can be reduced to reduce latency (i.e. TimeSpan.Zero)
Set decryption key.
A dictionary containing generic input options for both:
Global Codec Options: https://www.ffmpeg.org/ffmpeg-all.html#Codec-Options
Demuxer-Private options: https://ffmpeg.org/ffmpeg-all.html#Demuxers
Initializes a new instance of the class.
Gets or sets the forced input format. If left null or empty,
the input format will be selected automatically.
Gets or sets the amount of time to wait for an open or read operation to complete.
A collection of well-known demuxer-specific, non-global format options
TODO: Implement some of the more common names maybe?
mpegts
Represents a set of options that are used to initialize a media container before opening the stream.
Initializes a new instance of the class.
Contains options for the format context as documented:
https://ffmpeg.org/ffmpeg-formats.html#Format-Options
A dictionary containing generic input options for both:
Global Codec Options: https://www.ffmpeg.org/ffmpeg-all.html#Codec-Options
Demuxer-Private Options: https://ffmpeg.org/ffmpeg-all.html#Demuxers
Gets the protocol prefix.
Typically async for local files and empty for other types.
A subtitle frame container. Simply contains text lines.
Gets the media type of the data
Gets the lines of text for this subtitle frame with all formatting stripped out.
Gets the original text in SRT or ASS format.
Gets the type of the original text.
Returns None when the subtitle is a bitmap or when no text type is set.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
A pre-allocated, scaled video block. The buffer is in BGRA, 32-bit format
Finalizes an instance of the class.
Gets the media type of the data
Gets a pointer to the first byte of the data buffer.
The format is 32-bit BGRA
Gets the length of the buffer in bytes.
The picture buffer stride.
Pixel Width * 32-bit color (4 bytes) + alignment (typically 0 for modern hardware).
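The stride formula above can be illustrated with a short sketch (hypothetical helper, assuming 4 bytes per pixel for the 32-bit BGRA buffer described here):

```python
def picture_stride(pixel_width, bytes_per_pixel=4, alignment=0):
    """Bytes per row of a 32-bit BGRA picture buffer.

    Stride = pixel width * 4 bytes (32-bit color) + alignment padding,
    which is typically 0 on modern hardware as noted above.
    """
    return pixel_width * bytes_per_pixel + alignment

# A 1920-pixel-wide frame: 1920 * 4 = 7680 bytes per row.
```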
Gets the number of horizontal pixels in the image.
Gets the number of vertical pixels in the image.
Gets the width of the aspect ratio.
Gets the height of the aspect ratio.
Gets the SMPTE time code.
Gets the display picture number (frame number).
If not set by the decoder, this attempts to obtain it by dividing the start time by the
frame duration
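The fallback just described (estimating the frame number from the start time when the decoder does not set it) can be sketched as follows; names are assumptions and this is not the library's C# code:

```python
def display_picture_number(decoder_value, start_time, frame_duration):
    """Display (frame) number, with the documented fallback.

    Uses the decoder-provided value when set (non-zero); otherwise
    estimates it by dividing the start time by the frame duration.
    """
    if decoder_value:
        return decoder_value
    if frame_duration <= 0:
        return 0
    return int(round(start_time / frame_duration))

# At 25 fps (0.04 s frames), a block starting at t = 2.0 s is frame 50.
```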
Gets the coded picture number set by the decoder.
Gets the closed caption packets for this video block.
The picture buffer length of the last allocated buffer
Holds a reference to the last allocated buffer
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.