ffme
Implements the logic to close a media stream.
Initializes a new instance of the class.
The media element.
Executes this command.
Implements the logic to open a media stream.
Initializes a new instance of the class.
The manager.
The source.
Gets the source uri of the media stream.
Performs the actions that this command implements.
Creates a new instance of the renderer of the given type.
Type of the media.
The renderer that was created
mediaType has to be of a valid type
Implements the logic to pause the media stream
Initializes a new instance of the class.
The manager.
Performs the actions that this command implements.
Implements the logic to start or resume media playback
Initializes a new instance of the class.
The media element.
Performs the actions that this command implements.
Represents a command to be executed against an instance of the MediaElement
Set when the command has finished execution.
Do not use this field directly. It is managed internally by the command manager.
Initializes a new instance of the class.
The command manager.
Type of the command.
Gets the associated parent command manager
Gets the type of the command.
Gets a value indicating whether this command is marked as completed.
Marks the command as completed.
Executes the code for the command
Performs the actions that this command implements.
Represents a single point of contact for media command execution.
Initializes a new instance of the class.
The media element.
Gets the number of commands pending execution.
Gets the parent media element.
Opens the specified URI.
The command is processed in a Thread Pool Thread.
The URI.
Starts playing the open media URI.
Pauses the media.
Pauses and rewinds the media
Seeks to the specified position within the media.
The position.
Closes the specified media.
This command gets processed in a threadpool thread.
Sets the playback speed ratio.
The target speed ratio.
Processes the next command in the command queue.
This method is called in every block rendering cycle.
Gets the pending count of the given command type.
The command type.
The number of commands of the given type
Enqueues the command for execution.
The command.
Waits for the command to complete execution.
The command.
Calls the execution of the given command instance
and waits for its completion without blocking the dispatcher
The command.
Enumerates the different available Media Command Types
The open command
The seek command
The play command
The pause command
The stop command
The close command
The set speed ratio command
Implements the logic to seek on the media stream
Initializes a new instance of the class.
The media element.
The target position.
Gets or sets the target position.
The target position.
Performs the actions that this command implements.
A command to change speed ratio asynchronously
Initializes a new instance of the class.
The manager.
The speed ratio.
The target speed ratio
Performs the actions that this command implements.
Implements the logic to pause and rewind the media stream
Initializes a new instance of the class.
The media element.
Performs the actions that this command implements.
Fast, atomic boolean combining Interlocked to write values and volatile to read values
Idea taken from the article "Memory model and .NET operations":
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Gets or sets the latest value written by any of the processors in the machine
Fast, atomic double combining Interlocked to write values and volatile to read values
Idea taken from the article "Memory model and .NET operations":
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Gets or sets the latest value written by any of the processors in the machine
Fast, atomic long combining Interlocked to write values and volatile to read values
Idea taken from the article "Memory model and .NET operations":
http://igoro.com/archive/volatile-keyword-in-c-memory-model-explained/
Initializes a new instance of the class.
Gets or sets the latest value written by any of the processors in the machine
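The pattern behind these atomic wrappers (atomic full-width writes, volatile reads) can be sketched as follows. This is an illustrative Java translation, not the library's C# code; the class name is hypothetical. The double case is the interesting one, since a 64-bit floating-point write is not otherwise guaranteed to be torn-free:

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative sketch: an atomic double built by storing the value's
// bit pattern in a primitive that supports atomic writes and volatile reads.
final class AtomicDouble {
    private final AtomicLong bits = new AtomicLong(Double.doubleToLongBits(0d));

    // Volatile-style read: returns the latest value written by any processor.
    double get() {
        return Double.longBitsToDouble(bits.get());
    }

    // Atomic write of the full 64-bit pattern; no torn writes.
    void set(double value) {
        bits.set(Double.doubleToLongBits(value));
    }
}
```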
Manual additions to API calls not available in FFmpeg.Autogen
Gets the FFmpeg error message based on the error code
The code.
The decoded error message
A reference counter to keep track of unmanaged objects
The synchronization lock
The current reference counter instance
The instances
The types of tracked unmanaged types
The packet
The frame
The filter graph
The SWR context
The codec context
The SWS context
Gets the singleton instance of the reference counter
Gets the number of instances by location.
Adds the specified unmanaged object reference.
The type.
The reference.
The location.
Removes the specified unmanaged object reference
The pointer.
Removes the specified unmanaged object reference.
The unmanaged object reference.
Adds the specified packet.
The packet.
The location.
Adds the specified context.
The context.
The location.
Adds the specified context.
The context.
The location.
Adds the specified codec.
The codec.
The location.
Adds the specified frame.
The frame.
The location.
Adds the specified filtergraph.
The filtergraph.
The location.
A reference entry
Represents a generic Logger
The sender's concrete type
Initializes a new instance of the class.
The sender.
Holds a reference to the sender.
Logs the specified message.
Type of the message.
The message.
A very simple and standard interface for message logging
Logs the specified message of the given type.
Type of the message.
The message.
Represents a very simple dictionary for MediaType keys
The type of the value.
Initializes a new instance of the class.
Gets or sets the item with the specified key.
Returns the default value of the value type when the key does not exist.
The key.
The item
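The indexer behavior just described can be sketched like this; an illustrative Java translation (names are hypothetical, the library itself is C#):

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of a MediaType-keyed dictionary whose getter returns the value
// type's default (null here) instead of throwing when the key is missing.
final class MediaTypeDictionary<V> {
    enum MediaType { NONE, VIDEO, AUDIO, SUBTITLE }

    private final Map<MediaType, V> items = new EnumMap<>(MediaType.class);

    V get(MediaType key) {
        return items.get(key); // null (the default) when the key is absent
    }

    void set(MediaType key, V value) {
        items.put(key, value);
    }
}
```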
FFmpeg Registration Native Methods
Sets the DLL directory in which external dependencies can be located.
the full path.
True if set, false if not set
Fast pointer memory block copy function
The destination.
The source.
The length.
Fills the memory.
The destination.
The length.
The fill.
Provides helpers to run code in different modes on the UI dispatcher.
Gets the UI dispatcher.
Synchronously invokes the given instructions on the main application dispatcher.
The priority.
The action.
Enqueues the given instructions with the given arguments on the main application dispatcher.
This is a way to execute code in a fire-and-forget style
The priority.
The action.
The arguments.
Exits the execution frame.
The frame.
Always a null value
A fixed-size buffer that acts as an infinite length one.
This buffer is backed by unmanaged, very fast memory so ensure you call
the dispose method when you are done using it.
The locking object to perform synchronization.
To detect redundant calls
The unmanaged buffer
Initializes a new instance of the class.
Length of the buffer.
Finalizes an instance of the class.
Gets the capacity of this buffer.
Gets the current, 0-based read index
Gets the maximum rewindable amount of bytes.
Gets the current, 0-based write index.
Gets the object associated with the last write
Gets the available bytes to read.
Gets the number of bytes that can be written.
Gets the percentage of used bytes (readable/available, from 0.0 to 1.0).
Skips the specified number of requested bytes to be read.
The requested bytes.
When the requested bytes exceed the readable count
Rewinds the read position by specified requested amount of bytes.
The requested bytes.
When the requested bytes exceed the rewindable count
Reads the specified number of bytes into the target array.
The requested bytes.
The target.
The target offset.
When the requested bytes exceed the readable count
Writes data to the backing buffer using the specified pointer and length,
and associates a write tag with this operation.
The source.
The length.
The write tag.
if set to true, overwrites the data even if it has not been read.
When unread data would be overwritten; Read must be called first.
Resets all states as if this buffer had just been created.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
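The read/write index bookkeeping described above can be sketched as follows; a simplified Java illustration backed by a managed array (the real buffer uses unmanaged memory and a synchronization lock), with hypothetical names:

```java
// Simplified sketch of the circular buffer's index arithmetic.
final class RingBufferSketch {
    private final byte[] buffer;
    private int readIndex;
    private int writeIndex;
    private int readableCount;

    RingBufferSketch(int length) {
        buffer = new byte[length];
    }

    int readableCount() { return readableCount; }

    // Writes the bytes, throwing when unread data would be overwritten.
    void write(byte[] source) {
        if (source.length > buffer.length - readableCount)
            throw new IllegalStateException("Unread data would be overwritten; read first.");
        for (byte b : source) {
            buffer[writeIndex] = b;
            writeIndex = (writeIndex + 1) % buffer.length; // wrap around
        }
        readableCount += source.length;
    }

    // Reads the requested bytes, throwing when the request exceeds the readable count.
    byte[] read(int requested) {
        if (requested > readableCount)
            throw new IllegalStateException("Requested more bytes than are readable.");
        byte[] target = new byte[requested];
        for (int i = 0; i < requested; i++) {
            target[i] = buffer[readIndex];
            readIndex = (readIndex + 1) % buffer.length; // wrap around
        }
        readableCount -= requested;
        return target;
    }
}
```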
A time measurement artifact.
Initializes a new instance of the class.
The clock starts paused and at the 0 position.
Gets or sets the clock position.
Gets a value indicating whether the clock is running.
Gets or sets the speed ratio at which the clock runs.
Starts or resumes the clock.
Pauses the clock.
Sets the clock position to 0 and stops it.
The speed ratio is not modified.
Defines library-wide constants
Determines if the av_lockmgr_register is called.
If this is set to false, then the number of threads will be set to 1.
Contains audio format properties essential
to audio resampling
The standard output audio spec
Initializes static members of the class.
Prevents a default instance of the class from being created.
Initializes a new instance of the class.
The frame.
Gets the channel count.
Gets the channel layout.
Gets the samples per channel.
Gets the audio sampling rate.
Gets the sample format.
Gets the length of the buffer required to store
the samples in the current format.
Creates a source audio spec based on the info in the given audio frame
The frame.
The audio parameters
Creates a target audio spec using the sample quantities provided
by the given source audio frame
The frame.
The audio parameters
Determines if the audio specs are compatible between them.
They must share format, channel count, layout and sample rate
The first audio spec.
The second audio spec.
True if the specs are compatible, false otherwise.
A single codec option along with a stream specifier.
Initializes a new instance of the class.
The spec.
The key.
The value.
Gets or sets the stream specifier.
Gets or sets the option name
Gets or sets the option value.
Enumerates the different Media Types
Represents a nonexistent media type (-1)
The video media type (0)
The audio media type (1)
The subtitle media type (3)
An AVDictionaryEntry wrapper
Initializes a new instance of the class.
The entry pointer.
Gets the key.
Gets the value.
An AVDictionary management class
To detect redundant Dispose calls
Initializes a new instance of the class.
Initializes a new instance of the class.
The other.
Gets the number of elements in the dictionary
The count.
Gets or sets the value with the specified key.
The value.
The key.
The entry
Converts the AVDictionary to a regular dictionary.
The dictionary to convert from.
The converted dictionary
A wrapper for the av_dict_get method
The dictionary.
The key.
if set to true, matches case.
The Entry
Fills this dictionary with a set of options
The other dictionary (source)
Gets the first entry. Null if no entries.
The entry
Gets the next entry based on the provided prior entry.
The prior entry.
The entry
Determines if the given key exists in the dictionary
The key.
if set to true, matches case.
True if the key exists; otherwise false.
Gets the entry given the key.
The key.
if set to true, matches case.
The entry
Gets the value with specified key.
The key.
The value
Sets the value for the specified key.
The key.
The value.
Sets the value for the specified key.
The key.
The value.
if set to true, does not overwrite existing entries.
Removes the entry with the specified key.
The key.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A managed representation of an FFmpeg stream specifier
Initializes a new instance of the class.
Initializes a new instance of the class.
The stream identifier.
streamId
Initializes a new instance of the class.
Type of the media.
streamType
Initializes a new instance of the class.
Type of the media.
The stream identifier.
streamType
or
streamId
Provides suffixes for the different media types.
Gets the stream identifier.
Gets the stream suffix.
Returns a string that represents this stream specifier.
A string that represents this instance.
Provides a set of utilities to perform logging, text formatting,
conversion and other handy calculations.
Initializes static members of the class.
Determines if we are currently in Design Time
true if this instance is in design time; otherwise, false.
Gets a value indicating whether this instance is in debug mode.
Gets the assembly location.
Converts a byte pointer to a string
The byte pointer.
The string
Converts a byte pointer to a UTF8 encoded string.
The byte pointer.
The string
Converts the given value to a value that is of the given multiple.
The value.
The multiple.
The value
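One plausible implementation of the multiple conversion above (an assumption: this sketch truncates toward zero; the library may round to the nearest multiple instead):

```java
final class MultipleMath {
    // Snaps value down to the nearest multiple of the given factor.
    static long toMultipleOf(long value, long multiple) {
        return (value / multiple) * multiple;
    }
}
```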
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
Gets a timespan given a timestamp and a timebase.
The PTS in seconds.
The time base.
The TimeSpan
Gets a timespan given a timestamp and a timebase.
The PTS.
The time base.
The TimeSpan
Gets a timespan given a timestamp (in AV_TIME_BASE units)
The PTS.
The TimeSpan
Gets a timespan given a timestamp (in AV_TIME_BASE units)
The PTS.
The TimeSpan
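The timestamp conversions above all reduce to multiplying the pts by the stream's time base; in FFmpeg, AV_TIME_BASE is 1,000,000 (microsecond units). A minimal Java sketch of the arithmetic (the library returns TimeSpan values; seconds are used here for illustration):

```java
final class PtsMath {
    // FFmpeg's default time base: microseconds per second.
    static final long AV_TIME_BASE = 1_000_000L;

    // pts expressed in (num/den) time-base units, converted to seconds.
    static double ptsToSeconds(long pts, int num, int den) {
        return pts * (double) num / den;
    }

    // pts expressed in AV_TIME_BASE units, converted to seconds.
    static double avTimeBaseToSeconds(long pts) {
        return pts / (double) AV_TIME_BASE;
    }
}
```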
Converts a fraction to a double
The rational.
The value
Registers FFmpeg library and initializes its components.
It only needs to be called once but calling it more than once
has no effect. Returns the path that FFmpeg was registered from.
The override path.
Returns the path that FFmpeg was registered from.
When the folder is not found
Logs the specified message.
The sender.
Type of the message.
The message.
sender
Logs a block rendering operation as a Trace Message
if the debugger is attached.
The media element.
The block.
The clock position.
Index of the render.
Returns a formatted timestamp string in Seconds
The ts.
The formatted string
Returns a formatted string with elapsed milliseconds between now and
the specified date.
The dt.
The formatted string
Returns a formatted string, dividing by the specified
factor. Useful for debugging longs with byte positions or sizes.
The ts.
The divide by.
The formatted string
Strips the SRT format and returns plain text.
The input.
The formatted string
Strips a line of text from the ASS format.
The input.
The formatted string
Handles the Tick event of the LogOutputter timer.
The source of the event.
The instance containing the event data.
Manages FFmpeg Multithreaded locking
The mutex.
The op.
0 for success, 1 for error
Log message callback from ffmpeg library.
The p0.
The level.
The format.
The vl.
Enumerates the different Closed-Captioning Colors
No color
The white color
The white transparent color
The green color
The green transparent color
The blue color
The blue transparent color
The cyan color
The cyan transparent color
The red color
The red transparent color
The yellow color
The yellow transparent color
The magenta color
The magenta transparent color
The white italics color
The white italics transparent color
The background transparent color
The foreground black color
The foreground black underline color
Enumerates the Closed-Captioning misc commands
No command
The resume command
The backspace command
The alarm off command
The alarm on command
The clear line command
The roll up2 command
The roll up3 command
The roll up4 command
The start caption command
The start non-caption command
The resume non-caption command
The clear screen command
The new line command
The clear buffer command
The end caption command
Defines Closed-Captioning Packet types
The unrecognized packet type
The null pad packet type
The XDS class packet type
The misc command packet type
The text packet type
The mid row packet type
The preamble packet type
The color packet type
The charset packet type
The tabs packet type
Enumerates the different Closed-Captioning Styles
The none style
The white style
The white underline style
The green style
The green underline style
The blue style
The blue underline style
The cyan style
The cyan underline style
The red style
The red underline style
The yellow style
The yellow underline style
The magenta style
The magenta underline style
The white italics style
The white italics underline style
The white indent0 style
The white indent0 underline style
The white indent4 style
The white indent4 underline style
The white indent8 style
The white indent8 underline style
The white indent12 style
The white indent12 underline style
The white indent16 style
The white indent16 underline style
The white indent20 style
The white indent20 underline style
The white indent24 style
The white indent24 underline style
The white indent28 style
The white indent28 underline style
Defines Closed-Captioning XDS Packet Classes
The none XDS Class
The current start XDS Class
The current continue XDS Class
The future start XDS Class
The future continue XDS Class
The channel start XDS Class
The channel continue XDS Class
The misc start XDS Class
The misc continue XDS Class
The public service start XDS Class
The public service continue XDS Class
The reserved start XDS Class
The reserved continue XDS Class
The private start XDS Class
The private continue XDS Class
The end all XDS Class
Represents a set of Closed Captioning Tracks
in a stream of CC packets.
The CC1 Track Packets
The CC2 Track Packets
The CC3 Track Packets
The CC4 Track Packets
Adds the specified packet and automatically places it on the right track.
If the track requires sorting it does so by reordering packets based on their timestamp.
The item.
Represents a 3-byte packet of closed-captioning data in EIA-608 format.
See: http://jackyjung.tistory.com/attachment/499e14e28c347DB.pdf
Holds the data bytes
Initializes a new instance of the class.
The timestamp.
The source.
The offset.
Initializes a new instance of the class.
The timestamp.
The header.
The d0.
The d1.
Gets the first of the two-byte packet data
Gets the second of the two-byte packet data
Gets the timestamp this packet applies to.
Gets the NTSC field (1 or 2).
0 for unknown/null packet
Gets the channel. 0 for any, 1 or 2 for specific channel toggle.
0 just means to use what a prior packet had specified.
Gets the type of the packet.
Gets the number of tabs, if the packet type is of Tabs
Gets the Misc Command, if the packet type is of Misc Command
Gets the Color, if the packet type is of Color
Gets the Style, if the packet type is of Mid Row Style
Gets the XDS Class, if the packet type is of XDS
Gets the Preamble Row Number (1 through 15), if the packet type is of Preamble
Gets the Style, if the packet type is of Preamble
Gets the text, if the packet type is of text.
Returns a string that represents this instance.
A string that represents this instance.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. The return value has these meanings: Value Meaning Less than zero This instance precedes in the sort order. Zero This instance occurs in the same position in the sort order as . Greater than zero This instance follows in the sort order.
Checks that the header byte starts with 11111b (5 ones binary)
The data.
True if the header has the markers; otherwise false.
Determines whether the valid flag of the header byte is set.
The data.
true if the valid flag of the header byte is set; otherwise, false.
Gets the NTSC field type (1 or 2).
Returns 0 for unknown.
The data.
The field type
Determines whether the data is null padding
The d0.
The d1.
true if the data is null padding; otherwise, false.
Drops the parity bit from the data byte.
The input.
The byte without a parity bit.
Converts an ASCII character code to an EIA-608 char (in Unicode)
The input.
The charset char.
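The byte-level checks described above can be sketched as follows (a Java illustration; the mask values follow the layout described here — five marker bits, an odd-parity high bit — so verify them against the EIA-608 specification before relying on them):

```java
final class Eia608Bytes {
    // Clears the high (odd-parity) bit, leaving the 7-bit payload.
    static int dropParityBit(int input) {
        return input & 0x7F;
    }

    // True when the header byte starts with 11111b (its top five bits set).
    static boolean headerHasMarkers(int data) {
        return (data & 0xF8) == 0xF8;
    }
}
```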
The get format callback
Initializes static members of the class.
Prevents a default instance of the class from being created.
A dictionary containing all Accelerators by pixel format
Gets the dxva2 accelerator.
Gets the hardware output pixel format.
Gets the type of the hardware device.
Attaches a hardware device context to the specified video component.
The component.
Throws when unable to initialize the hardware device
Detaches and disposes the hardware device context from the specified video component
The component.
Downloads the frame from the hardware into a software frame if possible.
The input hardware frame gets freed and the return value will point to the new software frame
The codec context.
The input.
if set to true, the frame comes from hardware; otherwise, hardware decoding was not performed.
The frame downloaded from the device into RAM
Failed to transfer data to output frame
Gets the pixel format.
Port of (get_format) method in ffmpeg.c
The codec context.
The pixel formats.
The real pixel format that the codec will be using
Enumerates the seek target requirement levels.
Seek requirement is satisfied when
the main component has frames in the seek range.
This is the fastest option.
Seek requirement is satisfied when
both the audio and video components have frames in the seek range.
This is the recommended option.
Seek requirement is satisfied when
ALL components have frames in the seek range
This is NOT recommended as it forces large amounts of
frames to get decoded in subtitle files.
A scaled, preallocated audio frame container.
The buffer is in 16-bit signed, interleaved sample data
Finalizes an instance of the class.
Gets a pointer to the first byte of the data buffer.
The format is signed 16 bits per sample, channel interleaved
Gets the length of the buffer in bytes.
Gets the sample rate.
Gets the channel count.
Gets the available samples per channel.
Gets the media type of the data
The picture buffer length of the last allocated buffer
Holds a reference to the last allocated buffer
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a wrapper for an unmanaged FFmpeg audio frame
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets the pointer to the unmanaged frame.
Releases unmanaged and - optionally - managed resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a set of preallocated media blocks of the same media type.
A block buffer contains playback and pool blocks. Pool blocks are blocks that
can be reused. Playback blocks are blocks that have been filled.
This class is thread safe.
The blocks that are available to be filled.
The blocks that are available for rendering.
Initializes a new instance of the class.
The capacity.
Type of the media.
Gets the media type of the block buffer.
Gets the start time of the first block.
Gets the end time of the last block.
Gets the range of time between the first block and the end time of the last block.
Gets the average duration of the currently available playback blocks.
Gets a value indicating whether all the durations of the blocks are equal
Gets the number of available playback blocks.
Gets the maximum count of this buffer.
Gets the usage percent from 0.0 to 1.0
Gets a value indicating whether the playback blocks are all allocated.
Gets the media block at the specified index.
The media block.
The index.
The media block
Gets the media block at the specified timestamp.
The media block.
The time position.
The media block
Gets the percentage of the range for the given time position.
The position.
The percent of the range
Retrieves the block following the provided current block
The current block.
The next media block
Adds a block to the playback blocks by converting the given frame.
If there are no more blocks in the pool, the oldest block is returned to the pool
and reused for the new block. The source frame is automatically disposed.
The source.
The container.
The filled block.
Clears all the playback blocks returning them to the
block pool.
Determines whether the given render time is within the range of playback blocks.
The render time.
true if the specified render time is in range; otherwise, false.
Retrieves the index of the playback block corresponding to the specified
render time. This uses very fast binary and linear search combinations.
If there are no playback blocks it returns -1.
If the render time is greater than the range end time, it returns the last playback block index.
If the render time is less than the range start time, it returns the first playback block index.
The render time.
The media block's index
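The lookup described above can be sketched as a binary search over the sorted block start times, with the clamping rules just listed. An illustrative Java sketch (hypothetical names; the real implementation works over media block objects, not parallel arrays):

```java
final class BlockIndexSketch {
    // Returns the index of the block whose [start, end) range covers the
    // render time, clamping out-of-range times to the first or last block.
    static int indexOf(long[] starts, long[] ends, long renderTime) {
        int count = starts.length;
        if (count == 0) return -1;                            // no playback blocks
        if (renderTime <= starts[0]) return 0;                // before range: first block
        if (renderTime >= ends[count - 1]) return count - 1;  // past range: last block
        int lo = 0;
        int hi = count - 1;
        while (lo < hi) {
            int mid = lo + (hi - lo + 1) / 2;
            if (starts[mid] <= renderTime) lo = mid;          // keep latest start <= time
            else hi = mid - 1;
        }
        return lo;
    }
}
```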
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Returns a formatted string with information about this buffer
The formatted string
Block factory method.
The media frame
MediaBlock
Provides audio sample extraction, decoding and scaling functionality.
Holds a reference to the audio resampler
This resampler gets disposed upon disposal of this object.
Used to determine if we have to reset the scaler parameters
Initializes a new instance of the class.
The container.
Index of the stream.
Gets the number of audio channels.
Gets the audio sample rate.
Gets the bits per sample.
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Returns the updated output frame
input
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
The media frame
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Destroys the filtergraph releasing unmanaged resources.
Computes the frame filter arguments that are appropriate for the audio filtering chain.
The frame.
The base filter arguments
If necessary, disposes the existing filtergraph and creates a new one based on the frame arguments.
The frame.
avfilter_graph_create_filter
or
avfilter_graph_create_filter
or
avfilter_link
or
avfilter_graph_parse
or
avfilter_graph_config
Represents a wrapper for an unmanaged frame.
Derived classes implement the specifics of each media type.
Initializes a new instance of the class.
The pointer.
The component.
Gets the type of the media.
The type of the media.
Gets the start time of the frame.
Gets the end time of the frame
Gets the index of the stream from which this frame was decoded.
Gets the amount of time this data has to be presented
Gets or sets a value indicating whether this frame obtained its start time
from a valid frame pts value
When the unmanaged frame is released (freed from unmanaged memory)
this property will return true.
Gets the time base of the stream that generated this frame.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. The return value has these meanings: Value Meaning Less than zero This instance precedes in the sort order. Zero This instance occurs in the same position in the sort order as . Greater than zero This instance follows in the sort order.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
A base class for blocks of the different MediaTypes.
Blocks are the result of decoding and scaling a frame.
Blocks have preallocated buffers which makes them memory and CPU efficient.
Reuse blocks as much as possible. Once you create a block from a frame,
you don't need the frame anymore so make sure you dispose the frame.
Gets the media type of the data
Gets or sets a value indicating whether the start time was guessed from siblings
or the source frame PTS comes from a NO PTS value
Gets the time at which this data should be presented (PTS)
Gets the amount of time this data has to be presented
Gets the end time.
Gets or sets the index of the stream.
Gets the middle timestamp between the start and end time.
Returns Zero if the duration is Zero or negative.
Determines whether this media block holds the specified position.
Returns false if it does not have a valid duration.
The position.
true if the block contains the specified position; otherwise, false.
Compares the current instance with another object of the same type and returns an integer that indicates whether the current instance precedes, follows, or occurs in the same position in the sort order as the other object.
An object to compare with this instance.
A value that indicates the relative order of the objects being compared. The return value has these meanings: Value Meaning Less than zero This instance precedes in the sort order. Zero This instance occurs in the same position in the sort order as . Greater than zero This instance follows in the sort order.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Represents a media component of a given media type within a
media container. Derived classes must implement frame handling
logic.
Holds a reference to the Codec Context.
Holds a reference to the associated input context stream
Contains the packets pending to be sent to the decoder
The packets that have been sent to the decoder. We keep track of them in order to dispose them
once a frame has been decoded.
Detects redundant, unmanaged calls to the Dispose method.
The total bytes read
Initializes a new instance of the class.
The container.
Index of the stream.
container
The container exception.
Finalizes an instance of the class.
Gets the media container associated with this component.
Gets the type of the media.
Gets the index of the associated stream.
Returns the component's stream start timestamp as reported
by the start time of the stream.
Gets the duration of this stream component.
If there is no such information it will return TimeSpan.MinValue
Gets the current length in bytes of the
packet buffer. Limit your Reads to something reasonable before
this becomes too large.
Gets the number of packets in the queue.
Decode packets until this number becomes 0.
Gets the total amount of bytes read by this component.
Gets the ID of the codec for this component.
Gets the name of the codec for this component.
Gets the bitrate of this component as reported by the codec context.
Returns 0 for unknown.
Gets the stream information.
Clears the pending and sent Packet Queues releasing all memory held by those packets.
Additionally it flushes the codec buffered packets.
Sends a special kind of packet (an empty packet)
that tells the decoder to enter draining mode.
Pushes a packet into the decoding Packet Queue
and processes the packet in order to try to decode
1 or more frames. The packet has to be within the range of
the start time and end time of
The packet.
Decodes the next packet in the packet queue in this media component.
Returns the decoded frames.
The received Media Frames
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Return the updated output frame
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Determines whether the specified packet is a Null Packet (data = null, size = 0)
These null packets are used to read multiple frames from a single packet.
The packet.
True if the specified packet is an empty packet; otherwise, false.
Creates a frame source object given the raw FFmpeg subtitle reference.
The raw FFmpeg subtitle pointer.
The media frame
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
The media frame
Releases the existing codec context and clears and disposes the packet queues.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Receives 0 or more frames from the next available packet in the Queue.
This sends the first available packet in the queue to the decoder
and passes the decoded frames (if any) to their corresponding
ProcessFrame method.
The list of frames
Represents a set of Audio, Video and Subtitle components.
This class is useful in order to group all components into
a single set. Sending packets is automatically handled by
this class. This class is thread safe.
The internal Components
The synchronize lock
Provides a cached array to the components backing the All property.
To detect redundant Dispose calls
Initializes a new instance of the class.
Gets the available component media types.
Gets all the components in a read-only collection.
Gets the main media component of the stream to which time is synchronized.
By order of priority, first Audio, then Video
Gets the video component.
Returns null when there is no such stream component.
Gets the audio component.
Returns null when there is no such stream component.
Gets the subtitles component.
Returns null when there is no such stream component.
Gets the current length in bytes of the packet buffer.
These packets are the ones that have not yet been decoded.
Gets the number of packets that have not been
fed to the decoders.
Gets the total bytes read by all components.
Gets a value indicating whether this instance has a video component.
Gets a value indicating whether this instance has an audio component.
Gets a value indicating whether this instance has a subtitles component.
Gets or sets the media component with the specified media type.
Setting a new component on an existing media type component will throw.
Getting a non-existing media component for the given media type will return null.
Type of the media.
The media component
When the media type is invalid
MediaComponent
Removes the component of specified media type (if registered).
It calls the dispose method of the media component too.
Type of the media.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Sends the specified packet to the correct component by reading the stream index
of the packet that is being sent. No packet is sent if the provided packet is set to null.
Returns the media type of the component that accepted the packet.
The packet.
The media type
Sends an empty packet to all media components.
When an EOF/EOS situation is encountered, this forces
the decoders to enter draining mode until all frames are decoded.
Clears the packet queues for all components.
Additionally it flushes the codec buffered packets.
This is useful after a seek operation is performed or a stream
index is changed.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A subtitle frame container. Simply contains text lines.
Gets the media type of the data
Gets the lines of text for this subtitle frame with all formatting stripped out.
Gets the original text in SRT or ASS format.
Gets the type of the original text.
Returns None when the subtitle is bitmap-based.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Represents a wrapper for an unmanaged Subtitle frame.
TODO: Only text (ASS and SRT) subtitles are supported currently.
There is no support for bitmap subtitles.
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets lines of text that the subtitle frame contains.
Gets the type of the text.
The type of the text.
Gets the pointer to the unmanaged subtitle struct
Releases unmanaged and - optionally - managed resources.
Allocates an AVSubtitle struct in unmanaged memory.
The subtitle struct pointer
Deallocates the subtitle struct previously allocated in unmanaged memory.
The frame.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A pre-allocated, scaled video block. The buffer is in BGR, 24-bit format
Finalizes an instance of the class.
Gets the media type of the data
Gets a pointer to the first byte of the data buffer.
The format is 24bit BGR
Gets the length of the buffer in bytes.
The picture buffer stride.
Pixel Width * 24-bit color (3 bytes) + alignment (typically 0 for modern hardware).
Gets the number of horizontal pixels in the image.
Gets the number of vertical pixels in the image.
Gets or sets the width of the aspect ratio.
Gets or sets the height of the aspect ratio.
Gets the SMPTE time code.
Gets the display picture number (frame number).
If not set by the decoder, this attempts to obtain it by dividing the start time by the
frame duration
Gets the coded picture number set by the decoder.
The picture buffer length of the last allocated buffer
Holds a reference to the last allocated buffer
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Represents a wrapper for an unmanaged ffmpeg video frame.
Initializes a new instance of the class.
The frame.
The component.
Finalizes an instance of the class.
Gets the type of the media.
Gets the closed caption data collected from the frame in CEA-708/EIA-608 format.
Gets the display picture number (frame number).
If not set by the decoder, this attempts to obtain it by dividing the start time by the
frame duration
Gets the coded picture number set by the decoder.
Gets the SMPTE time code.
Gets the pointer to the unmanaged frame.
Releases unmanaged and - optionally - managed resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A container capable of opening an input url,
reading packets from it, decoding frames, seeking, and pausing and resuming network streams
Code heavily based on https://raw.githubusercontent.com/FFmpeg/FFmpeg/release/3.2/ffplay.c
The method pipeline should be:
1. Set Options (or don't, for automatic options) and Initialize,
2. Perform continuous Reads,
3. Perform continuous Decodes and Converts/Materialize
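The pipeline above can be sketched as follows. This is an illustrative sketch only: the member names are taken from the documentation in this file (Initialize, Open, Read, Decode, Convert, IsAtEndOfStream), but the exact signatures may differ from the shipped API.

```csharp
// Illustrative sketch, not the actual ffme API surface.
using (var container = new MediaContainer(mediaUrl, logger))
{
    container.Initialize(); // 1. Set options and initialize the input context
    container.Open();       // Open the individual stream components

    while (!container.IsAtEndOfStream)
    {
        container.Read();   // 2. Read the next packet into its media component

        // 3. Decode packets into frames, then convert/materialize each frame.
        foreach (var frame in container.Decode())
        {
            MediaBlock block = null;
            // Passing release: true frees the raw FFmpeg frame after conversion.
            container.Convert(frame, ref block, null, true);
        }
    }
}
```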
The logger
Holds a reference to an input context.
The read synchronize root
The decode synchronize root
The convert synchronize root
Holds the set of components.
To detect redundant Dispose calls
Determines if the stream always seeks by bytes
Holds the value for the internal property with the same name.
Picture attachments are required when video streams support them
and these attached packets must be read before reading the first frame
of the stream and after seeking.
The stream read interrupt callback.
Used to detect read timeouts.
The stream read interrupt start time.
When a read operation is started, this is set to the ticks of UTC now.
Initializes a new instance of the class.
The media URL.
The logger.
The protocol prefix. See https://ffmpeg.org/ffmpeg-protocols.html
Leave null if setting it is not intended.
mediaUrl
Finalizes an instance of the class.
Gets the media URL. This is the input url, file or device that is read
by this container.
Gets the protocol prefix.
Typically async for local files and empty for other types.
The media initialization options.
Options are applied when calling the Initialize method.
After initialization, changing the options has no effect.
Provides stream, chapter and program info held by this container.
Gets the name of the media format.
Gets the media bitrate (bits per second). Returns 0 if not available.
Holds the metadata of the media file when the stream is initialized.
Gets a value indicating whether an Input Context has been initialized.
Gets a value indicating whether this instance is open.
Gets the duration of the media.
If this information is not available (i.e. realtime media) it will
be set to TimeSpan.MinValue
Will be set to true whenever an End Of File situation is reached.
Gets the byte position at which the stream is being read.
Please note that this property gets updated after every Read.
Gets a value indicating whether the underlying media is seekable.
Gets a value indicating whether this container represents realtime media.
If the format name is rtp, rtsp, or sdp or if the url starts with udp: or rtp:
then this property will be set to true.
Provides direct access to the individual Media components of the input stream.
Gets the media start time by which all component streams are offset.
Typically 0 but it could be something other than 0.
Gets the seek start timestamp.
Gets the time the last packet was read from the input
For RTSP and other realtime streams reads can be suspended.
For RTSP and other realtime streams reads can be suspended.
This property will return true if reads have been suspended.
Gets a value indicating whether a packet read delay will be enforced.
RTSP formats or MMSH URLs will have this property set to true.
Reading packets will block for at most 10 milliseconds depending on the last read time.
This is a hack according to the source code in ffplay.c
Picture attachments are required when video streams support them
and these attached packets must be read before reading the first frame
of the stream and after seeking. This property is not part of the public API
and is meant more for internal purposes
Opens the individual stream components on the existing input context in order to start reading packets.
Any Media Options must be set before this method is called.
Seeks to the specified position in the stream. This method attempts to do so as
precisely as possible, returning decoded frames of all available media type components
just before or right on the requested position. The position must be given in 0-based time,
so it converts component stream start time offset to absolute, 0-based time.
Pass TimeSpan.Zero to seek to the beginning of the stream.
The position.
The list of media frames
Reads the next available packet, sending the packet to the corresponding
internal media component. It also sets IsAtEndOfStream property.
Returns the media type if the packet was accepted by any of the media components.
Returns None if the packet was not accepted by any of the media components
or if reading failed (i.e. End of stream already or read error).
Packets are queued internally. To dequeue them you need to call the receive frames
method of each component until the packet buffer count becomes 0.
The media type of the packet that was read
No input context initialized
When a read error occurs
Decodes the next available packet in the packet queue for each of the components.
Returns the list of decoded frames. You can call this method until the Components.PacketBufferCount
becomes 0. The list of 0 or more decoded frames is returned in ascending StartTime order.
A Packet may contain 0 or more frames. Once the frame source objects are returned, you
are responsible for calling the Dispose method on them to free the underlying FFmpeg frame.
Note that even after releasing them you can still use the managed properties.
If you intend on Converting the frames to usable media frames (with Convert) you must not
release the frame. Specify the release input argument as true and the frame will be automatically
freed from memory.
The list of media frames
Performs audio, video and subtitle conversions on the decoded input frame so data
can be used as a Frame. Please note that the output is passed by reference.
This works as follows: if the output reference is null it will be automatically instantiated
and returned by this function. This enables you to either instantiate or reuse a previously allocated Frame.
This is important because buffer allocations are expensive operations and this allows you
to perform the allocation once and continue reusing the same buffer.
The raw frame source. Has to be compatible with the target (e.g. use VideoFrameSource to convert to VideoFrame).
The target frame. Has to be compatible with the source.
The siblings that may help guess additional output parameters.
if set to true releases the raw frame source from unmanaged memory.
The media block
No input context initialized
MediaType
input
input
or
input
Closes the input context immediately releasing all resources.
This method is equivalent to calling the dispose method.
Releases unmanaged and - optionally - managed resources.
Initializes the input context to start read operations.
This does NOT create the stream components and therefore, there needs to be a call
to the Open method.
The input context has already been initialized.
When an error initializing the stream occurs.
Opens the individual stream components to start reading packets.
Creates the stream components by first finding the best available streams.
Then it initializes a component of the correct type for each stream.
The exception information
The interrupt callback to handle stream reading timeouts
A pointer to the format input context
0 for OK, 1 for error (timeout)
Reads the next packet in the underlying stream and enqueues in the corresponding media component.
Returns None if no packet was read.
The type of media packet that was read
Initialize
Raised when an error reading from the stream occurs.
Suspends / pauses network streams
This should only be called upon Dispose
Resumes the reads of network streams
Drops the seek frames that are no longer needed.
Target time should be provided in absolute, 0-based time
The frames.
The target time.
The number of dropped frames
Seeks to the position at the start of the stream.
Seeks to the exact or prior frame of the main stream.
Supports byte seeking.
The target time.
The list of media frames
Reads and decodes packets until the required media components have frames on or right before the target time.
The list of frames that is currently being processed. Frames will be added here.
The target time in absolute 0-based time.
The requirement.
The number of decoded frames
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
A data structure containing a queue of packets to process.
This class is thread safe and disposable.
Enqueued, unmanaged packets are disposed automatically by this queue.
Dequeued packets are the responsibility of the calling code.
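The ownership rules above can be illustrated with a minimal managed sketch of the locking pattern such a queue might use. This is not the actual implementation: the real queue stores unmanaged AVPacket pointers and disposes enqueued packets itself.

```csharp
// Simplified, managed sketch of a thread-safe packet queue.
public sealed class SimplePacketQueue<T>
{
    private readonly List<T> Packets = new List<T>();
    private readonly object SyncLock = new object();

    public int Count { get { lock (SyncLock) return Packets.Count; } }

    // Enqueues a packet; the queue now owns it.
    public void Push(T packet) { lock (SyncLock) Packets.Add(packet); }

    // Returns the next packet without removing it, or default if empty.
    public T Peek()
    {
        lock (SyncLock)
            return Packets.Count > 0 ? Packets[0] : default(T);
    }

    // Removes and returns the next packet; the caller now owns it.
    public T Dequeue()
    {
        lock (SyncLock)
        {
            if (Packets.Count == 0) return default(T);
            var packet = Packets[0];
            Packets.RemoveAt(0);
            return packet;
        }
    }
}
```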
Gets the packet count.
Gets the sum of all the packet sizes contained
by this queue.
Gets the total duration in stream TimeBase units.
Gets or sets the packet at the specified index.
The packet.
The index.
The packet reference
Peeks the next available packet in the queue without removing it.
If no packets are available, null is returned.
The packet
Pushes the specified packet into the queue.
In other words, enqueues the packet.
The packet.
Dequeues a packet from this queue.
The dequeued packet
Clears and frees all the unmanaged packets from this queue.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Performs subtitle stream extraction, decoding and text conversion.
Initializes a new instance of the class.
The container.
Index of the stream.
Converts decoded, raw frame data in the frame source into a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The sibling blocks that may help guess some additional parameters for the input frame.
Return the updated output frame
input cannot be null
Creates a frame source object given the raw FFmpeg subtitle reference.
The raw FFmpeg subtitle pointer.
The managed frame
Performs video picture decoding, scaling and extraction logic.
The output pixel format of the scaler: 24-bit BGR
Holds a reference to the video scaler
Initializes a new instance of the class.
The container.
Index of the stream.
Gets the video scaler flags used to perform colorspace conversion (if needed).
Gets the base frame rate as reported by the stream component.
All discrete timestamps can be represented in this framerate.
Gets the current frame rate as guessed by the last processed frame.
Variable framerate might report different values at different times.
Gets the width of the picture frame.
Gets the height of the picture frame.
Converts decoded, raw frame data in the frame source into a a usable frame.
The process includes performing picture, samples or text conversions
so that the decoded source frame data is easily usable in multimedia applications
The source frame to use as an input.
The target frame that will be updated with the source frame. If null is passed the frame will be instantiated.
The siblings to help guess additional frame parameters.
Return the updated output frame
input
Creates a frame source object given the raw FFmpeg frame reference.
The raw FFmpeg frame pointer.
Creates a managed frame from an unmanaged one.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Gets the pixel format replacing deprecated pixel formats.
AV_PIX_FMT_YUVJ
The frame.
A normalized pixel format
Computes the frame filter arguments that are appropriate for the video filtering chain.
The frame.
The base filter arguments
If necessary, disposes the existing filtergraph and creates a new one based on the frame arguments.
The frame.
avfilter_graph_create_filter
or
avfilter_graph_create_filter
or
avfilter_link
or
avfilter_graph_parse
or
avfilter_graph_config
Destroys the filtergraph releasing unmanaged resources.
Represents a control that contains audio and/or video.
In contrast with System.Windows.Controls.MediaElement, this version uses
the FFmpeg library to perform reading and decoding of media streams.
Occurs right before the video is presented on the screen.
You can update the pixels on the bitmap before it is rendered on the screen.
Or you could take a screenshot.
Ensure you handle this very quickly as it runs on the UI thread.
Occurs right before the audio is added to the audio buffer.
You can update the bytes before they are enqueued.
Ensure you handle this quickly before you get choppy audio.
Occurs right before the subtitles are rendered.
You can update the text.
Ensure you handle this quickly before you get choppy subtitles.
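Subscribing to these events might look like the following sketch. The event names and argument members are assumptions based on the descriptions in this file and may not match the shipped API exactly.

```csharp
var media = new MediaElement();

media.RenderingVideo += (s, e) =>
{
    // Runs on the UI thread right before the frame is presented;
    // keep this handler fast (e.g. capture a screenshot of the bitmap).
};

media.RenderingAudio += (s, e) =>
{
    // Inspect or modify the PCM sample bytes before they are enqueued.
};

media.RenderingSubtitles += (s, e) =>
{
    // Update the subtitle text, or cancel rendering of this block.
};
```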
Raises the rendering video event.
The bitmap.
The stream.
The SMPTE timecode.
The picture number.
The start time.
The duration.
The clock.
Raises the rendering audio event.
The audio block.
The clock.
Raises the rendering subtitles event.
The block.
The clock.
True if the rendering should be prevented
This partial class implements:
1. Packet reading from the Container
2. Frame Decoding from packet buffer and Block buffering
3. Block Rendering from block buffer
Gets the packet reading cycle control event.
Gets the frame decoding cycle control event.
Gets the block rendering cycle control event.
Gets the seeking done control event.
Gets or sets a value indicating whether the workers have been requested
to exit.
Gets or sets a value indicating whether the decoder has moved its byte position
to something other than the normal continuous reads in the last read cycle.
Holds the blocks
Holds the block renderers
Holds the last rendered StartTime for each of the media block types
Gets a value indicating whether more packets can be read from the stream.
This does not check if the packet queue is full.
Gets a value indicating whether more frames can be decoded from the packet queue.
That is, if we have packets in the packet buffer or if we are not at the end of the stream.
Runs the read task which keeps a packet buffer as full as possible.
It reports on DownloadProgress by enqueueing an update to the property
in order to avoid any kind of disruption to this thread caused by the UI thread.
Continually decodes the available packet buffer to have as
many frames as possible in each frame queue and
up to the MaxFrames on each component
Continuously converts frames and places them on the corresponding
block buffer. This task is responsible for keeping track of the clock
and calling the render methods appropriate for the current clock position.
Sets the clock to a discrete video position if possible
The position.
Gets a value indicating whether more frames can be converted into blocks of the given type.
The media type.
True if this instance can read more frames of the specified media type; otherwise, false.
Sends the given block to its corresponding media renderer.
The block.
The clock position.
The number of blocks sent to the renderer
Adds the blocks of the given media type.
The media type.
The number of blocks that were added
The command queue to be executed in the order they were sent.
Represents a real-time time measuring device.
Rendering media should occur as requested by the clock.
The underlying media container that provides access to
individual media component streams
Begins or resumes playback of the currently loaded media.
Pauses playback of the currently loaded media.
Pauses and rewinds the currently loaded media.
Closes the currently loaded media.
The logger
This is the image that will display the video from a WriteableBitmap
To detect redundant calls
The ffmpeg directory
IUriContext BaseUri backing
The position update timer
When position is being set from within this control, this field will
be set to true. This is useful to detect if the user is setting the position
or if the Position property is being driven from within
Flag set when the disposing process has started but not yet finished
Initializes static members of the class.
Initializes a new instance of the class.
Occurs when a logging message from the FFmpeg library has been received.
This is shared across all instances of Media Elements
Multicast event for property change notifications.
Occurs when a logging message has been logged.
This does not include FFmpeg messages.
Gets or sets the FFmpeg path from which to load the FFmpeg binaries.
You must set this path before setting the Source property for the first time on any instance of this control.
Setting this property when FFmpeg binaries have been registered will throw an exception.
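A typical startup sequence might look like this sketch; the FFmpegDirectory property name is an assumption based on the description above.

```csharp
// Register the FFmpeg binaries path before the first Source is set
// on any MediaElement instance (assumed property name).
MediaElement.FFmpegDirectory = @"C:\ffmpeg\bin";

var media = new MediaElement();
media.Source = new Uri(@"C:\media\sample.mp4");
// Changing the path after the binaries are registered throws.
```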
Gets or sets the horizontal alignment characteristics applied to this element when it is
composed within a parent element, such as a panel or items control.
Gets or sets the base URI of the current application context.
When position is being set from within this control, this field will
be set to true. This is useful to detect if the user is setting the position
or if the Position property is being driven from within
Gets the grid control holding the rest of the controls.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Raises the FFmpegMessageLogged event
The instance containing the event data.
Updates the position property signaling the update is
coming internally. This is to distinguish between user/binding
written value to the Position Property and value set by this control's
internal clock.
The current position.
Raises the MessageLogged event
The instance containing the event data.
Checks if a property already matches a desired value. Sets the property and
notifies listeners only when necessary.
Type of the property.
Reference to a property with both getter and setter.
Desired value for the property.
Name of the property used to notify listeners. This
value is optional and can be provided automatically when invoked from compilers that
support CallerMemberName.
True if the value was changed, false if the existing value matched the
desired value.
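The helper described above follows the common INotifyPropertyChanged pattern; a generic sketch of how it is typically implemented (the OnPropertyChanged name follows the surrounding documentation):

```csharp
// Common INotifyPropertyChanged helper pattern.
private bool SetProperty<T>(ref T storage, T value,
    [CallerMemberName] string propertyName = null)
{
    // Only assign and notify when the value actually changes.
    if (EqualityComparer<T>.Default.Equals(storage, value))
        return false;

    storage = value;
    OnPropertyChanged(propertyName); // raises PropertyChanged
    return true;
}
```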
Notifies listeners that a property value has changed.
Name of the property used to notify listeners. This
value is optional and can be provided automatically when invoked from compilers
that support CallerMemberName.
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
DependencyProperty for FFmpegMediaElement Source property.
DependencyProperty for Stretch property.
DependencyProperty for StretchDirection property.
The DependencyProperty for the MediaElement.Balance property.
The DependencyProperty for the MediaElement.IsMuted property.
The DependencyProperty for the MediaElement.SpeedRatio property.
The DependencyProperty for the MediaElement.Volume property.
The DependencyProperty for the MediaElement.ScrubbingEnabled property.
The DependencyProperty for the MediaElement.UnloadedBehavior property.
TODO: Currently this property has no effect. Needs implementation.
The DependencyProperty for the MediaElement.LoadedBehavior property.
The DependencyProperty for the MediaElement.Position property.
Gets/Sets the Source on this MediaElement.
The Source property is the Uri of the media to be played.
Gets/Sets the Stretch on this MediaElement.
The Stretch property determines how large the MediaElement will be drawn.
Gets/Sets the stretch direction of the Viewbox, which determines the restrictions on
scaling that are applied to the content inside the Viewbox. For instance, this property
can be used to prevent the content from being smaller than its native size or larger than
its native size.
Specifies the behavior that the media element should have when it
is loaded. The default behavior is that it is under manual control
(i.e. the caller should call methods such as Play in order to play
the media). If a source is set, then the default behavior changes to
to be playing the media. If a source is set and a loaded behavior is
also set, then the loaded behavior takes control.
Gets/Sets the SpeedRatio property on the MediaElement.
Specifies how the underlying media should behave when
it has ended. The default behavior is to Close the media.
Gets/Sets the Volume property on the MediaElement.
Note: Valid values are from 0 to 1
Gets/Sets the Balance property on the MediaElement.
Gets/Sets the IsMuted property on the MediaElement.
Gets or sets a value that indicates whether the MediaElement will update frames
for seek operations while paused. This is a dependency property.
Gets/Sets the Position property on the MediaElement.
Provides key-value pairs of the metadata contained in the media.
Returns null when media has not been loaded.
Gets the media format. Returns null when media has not been loaded.
Gets the duration of a single frame step.
If there is a video component with a framerate, this property returns the length of a frame.
If there is no video component it simply returns a tenth of a second.
Returns whether the given media has audio.
Only valid after the MediaOpened event has fired.
Returns whether the given media has video. Only valid after the
MediaOpened event has fired.
Gets the video codec.
Only valid after the MediaOpened event has fired.
Gets the video bitrate.
Only valid after the MediaOpened event has fired.
Returns the natural width of the media in the video.
Only valid after the MediaOpened event has fired.
Returns the natural height of the media in the video.
Only valid after the MediaOpened event has fired.
Gets the video frame rate.
Only valid after the MediaOpened event has fired.
Gets the duration in seconds of the video frame.
Only valid after the MediaOpened event has fired.
Gets the name of the video hardware decoder in use.
Enabling hardware acceleration does not guarantee decoding will be performed in hardware.
When hardware decoding of frames is in use this will return the name of the HW accelerator.
Otherwise it will return an empty string.
Gets the audio codec.
Only valid after the MediaOpened event has fired.
Gets the audio bitrate.
Only valid after the MediaOpened event has fired.
Gets the audio channels count.
Only valid after the MediaOpened event has fired.
Gets the audio sample rate.
Only valid after the MediaOpened event has fired.
Gets the audio bits per sample.
Only valid after the MediaOpened event has fired.
Gets the Media's natural duration
Only valid after the MediaOpened event has fired.
Returns whether the currently loaded media can be paused.
This is only valid after the MediaOpened event has fired.
Note that this property is computed based on whether the stream is detected to be a live stream.
Returns whether the currently loaded media is live or realtime
This is only valid after the MediaOpened event has fired.
Gets a value indicating whether the currently loaded media can be seeked.
Gets a value indicating whether the media is playing.
Gets a value indicating whether the media has reached its end.
Get a value indicating whether the media is buffering.
Gets a value indicating whether the media seeking is in progress.
Returns the current video SMPTE timecode if available.
If not available, this property returns an empty string.
Gets a value that indicates the percentage of buffering progress made.
Range is from 0 to 1
The wait packet buffer length.
It is adjusted to 1 second if bitrate information is available.
Otherwise, it's simply 512KB
Gets a value that indicates the percentage of download progress made.
Range is from 0 to 1
Gets the maximum packet buffer length, according to the bitrate (if available).
If it's a realtime stream it will return 30 times the buffer cache length.
Otherwise, it will return 4 times the buffer cache length.
Gets a value indicating whether the media is in the process of opening.
Gets a value indicating whether this media element
currently has an open media url.
Gets the current playback state.
Updates the Metadata property.
Updates the media properties notifying that there are new values to be read from all of them.
Call this method only when necessary because it creates a lot of events.
Resets the dependency properties.
BufferingStarted is a routed event
BufferingEnded is a routed event
SeekingStarted is a routed event
SeekingEnded is a routed event
MediaFailedEvent is a routed event.
MediaOpened is a routed event.
MediaOpeningEvent is a routed event.
MediaEnded is a routed event
Occurs when buffering of packets has started
Occurs when buffering of packets has ended
Occurs when seeking of packets has started
Occurs when seeking of packets has ended
Raised when the media fails to load or a fatal error has occurred which prevents playback.
Raised when the media is opened
Raised before the input stream of the media is opened.
Use this method to modify the input options.
Raised when the corresponding media ends.
Raises the media failed event.
The ex.
Raises the media opened event.
Raises the media opening event.
Creates a new instance of exception routed event arguments.
This method exists because the constructor has not been made public for that class.
The routed event.
The sender.
The error exception.
The event arguments
Logs the start of an event
The event.
Logs the end of an event.
The event.
Raises the buffering started event.
Raises the buffering ended event.
Raises the Seeking started event.
Raises the Seeking ended event.
Raises the media ended event.
A base class to represent media block
rendering event arguments.
Initializes a new instance of the class.
The stream.
The position.
The duration.
The clock.
Provides Stream Information coming from the media container.
Gets the clock position at which the media
was called for rendering
Gets the starting time at which this media
has to be presented.
Gets how long this media has to be presented.
Provides the audio samples rendering payload as event arguments.
Initializes a new instance of the class.
The buffer.
The length.
The stream.
The start time.
The duration.
The clock.
Gets a pointer to the samples buffer.
Samples are provided in PCM 16-bit signed, interleaved stereo.
Gets the length in bytes of the samples buffer.
Gets the number of samples in 1 second.
Gets the number of channels.
Gets the number of bits per sample.
Gets the number of samples in the buffer for all channels.
Gets the number of samples in the buffer per channel.
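Since the samples are PCM 16-bit signed interleaved stereo, the sample counts above follow directly from the buffer length in bytes. A small Python sketch of that arithmetic (the constants match the documented format; the function name is illustrative):

```python
BYTES_PER_SAMPLE = 2   # 16-bit signed PCM
CHANNELS = 2           # interleaved stereo


def sample_counts(buffer_length_bytes):
    """Returns (total samples across all channels, samples per channel)."""
    total = buffer_length_bytes // BYTES_PER_SAMPLE
    per_channel = total // CHANNELS
    return total, per_channel
```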
Provides the subtitles rendering payload as event arguments.
Initializes a new instance of the class.
The text.
The original text.
The format.
The stream.
The start time.
The duration.
The clock.
Gets the text stripped out of ASS or SRT formatting.
This is what the default subtitle renderer will display
on the screen.
Gets the text as originally decoded including
all markup and formatting.
Gets the type of subtitle format the original
subtitle text is in.
When set to true, clears the current subtitle and
prevents the subtitle block from being rendered.
The video rendering event arguments
Initializes a new instance of the class.
The bitmap.
The stream.
The SMPTE timecode.
The picture number.
The start time.
The duration.
The clock.
Gets the writable bitmap filled with the video frame pixels.
Feel free to capture or change this image.
Gets the display picture number (frame number).
If not set by the decoder, this attempts to obtain it by dividing the start time by the
frame duration
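The fallback described above (deriving the picture number from the start time when the decoder does not report one) can be sketched as below. This is an illustrative Python version under assumed semantics, not the library's exact C# code:

```python
def display_picture_number(start_time_s, frame_duration_s, decoder_value=0):
    """Prefer the decoder-reported picture number; otherwise estimate it
    by dividing the block's start time by the frame duration."""
    if decoder_value > 0:
        return decoder_value
    if frame_duration_s <= 0:
        return 0  # cannot estimate without a known frame duration
    return int(round(start_time_s / frame_duration_s))
```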
Gets the SMPTE time code.
Holds media information about the input, its chapters, programs and individual stream components
Initializes a new instance of the class.
The container.
Gets the input URL string used to access and create the media container
Gets the name of the container format.
Gets the metadata for the input. This may include stuff like title, creation date, company name, etc.
Individual stream components may contain additional metadata.
The metadata
Gets the duration of the input as reported by the container format.
Individual stream components may have different values
Gets the start timestamp of the input as reported by the container format.
Individual stream components may have different values
If available, returns a non-zero value as reported by the container format.
Gets a list of chapters
Gets a list of programs with their associated streams.
Gets the dictionary of stream information components by stream index.
Provides access to the best streams of each media type found in the container.
This uses some internal FFmpeg heuristics.
Extracts the stream infos from the input.
The input format context.
The list of stream infos
Finds the best streams for audio, video, and subtitles.
The input format context.
The streams.
The stream infos
Extracts the chapters from the input.
The input format context.
The chapters
Extracts the programs from the input and creates associations between programs and streams.
The input format context.
The streams.
The program information
Represents media stream information
Gets the stream identifier. This is different from the stream index.
Typically this value is not very useful.
Gets the index of the stream.
Gets the type of the codec.
Gets the name of the codec type. Audio, Video, Subtitle, Data, etc.
Gets the codec identifier.
Gets the name of the codec.
Gets the codec profile. Only valid for H.264 or
video codecs that use profiles. Otherwise empty.
Gets the codec tag. Not very useful except for fixing bugs with
some demuxer scenarios.
Gets a value indicating whether this stream has closed captions.
Typically this is set for video streams.
Gets a value indicating whether this stream contains lossless compressed data.
Gets the pixel format. Only valid for Video streams.
Gets the width of the video frames.
Gets the height of the video frames.
Gets the field order. This is useful to determine
if the video needs deinterlacing
Gets the video color range.
Gets the audio sample rate.
Gets the audio sample format.
Gets the stream time base unit in seconds.
Gets the sample aspect ratio.
Gets the display aspect ratio.
Gets the reported bit rate. 0 if unavailable.
Gets the maximum bit rate for variable bitrate streams. 0 if unavailable.
Gets the number of frames that were read to obtain the stream's information.
Gets the number of reference frames.
Gets the average FPS reported by the stream.
Gets the real (base) framerate of the stream
Gets the fundamental unit of time in 1/seconds used to represent timestamps in the stream, according to the stream data
Gets the fundamental unit of time in 1/seconds used to represent timestamps in the stream, according to the codec
Gets the disposition flags.
Please see ffmpeg.AV_DISPOSITION_* fields.
Gets the start time.
Gets the duration.
Gets the stream's metadata.
Gets the language string from the stream's metadata.
Represents a chapter within a container
Gets the chapter index.
Gets the chapter identifier.
Gets the start time of the chapter.
Gets the end time of the chapter.
Gets the chapter metadata.
Represents a program and its associated streams within a container.
Gets the program number.
Gets the program identifier.
Gets the program metadata.
Gets the associated program streams.
Gets the name of the program. Empty if unavailable.
Represents the contents of a logging message that was sent to the log manager.
Initializes a new instance of the class.
The media element.
Type of the message.
The message.
Gets the instance of the MediaElement that generated this message.
When null, it means FFmpeg generated this message.
Gets the timestamp.
Gets the type of the message.
Gets the contents of the message.
Generic interface for all WaveProviders.
Gets the WaveFormat of this WaveProvider.
Fill the specified buffer with wave data.
The buffer to fill with wave data.
Offset into buffer
The number of bytes to read
the number of bytes written to the buffer.
Windows multimedia error codes from mmsystem.h.
no error, MMSYSERR_NOERROR
unspecified error, MMSYSERR_ERROR
device ID out of range, MMSYSERR_BADDEVICEID
driver failed enable, MMSYSERR_NOTENABLED
device already allocated, MMSYSERR_ALLOCATED
device handle is invalid, MMSYSERR_INVALHANDLE
no device driver present, MMSYSERR_NODRIVER
memory allocation error, MMSYSERR_NOMEM
function isn't supported, MMSYSERR_NOTSUPPORTED
error value out of range, MMSYSERR_BADERRNUM
invalid flag passed, MMSYSERR_INVALFLAG
invalid parameter passed, MMSYSERR_INVALPARAM
handle being used simultaneously on another thread (e.g. callback), MMSYSERR_HANDLEBUSY
specified alias not found, MMSYSERR_INVALIDALIAS
bad registry database, MMSYSERR_BADDB
registry key not found, MMSYSERR_KEYNOTFOUND
registry read error, MMSYSERR_READERROR
registry write error, MMSYSERR_WRITEERROR
registry delete error, MMSYSERR_DELETEERROR
registry value not found, MMSYSERR_VALNOTFOUND
driver does not call DriverCallback, MMSYSERR_NODRIVERCB
more data to be returned, MMSYSERR_MOREDATA
unsupported wave format, WAVERR_BADFORMAT
still something playing, WAVERR_STILLPLAYING
header not prepared, WAVERR_UNPREPARED
device is synchronous, WAVERR_SYNC
Conversion not possible (ACMERR_NOTPOSSIBLE)
Busy (ACMERR_BUSY)
Header Unprepared (ACMERR_UNPREPARED)
Cancelled (ACMERR_CANCELED)
invalid line (MIXERR_INVALLINE)
invalid control (MIXERR_INVALCONTROL)
invalid value (MIXERR_INVALVALUE)
http://msdn.microsoft.com/en-us/library/dd757347(v=VS.85).aspx
Enumerates the various wave output playback states
Stopped
Playing
Paused
Supported wave formats for WaveOutCapabilities
11.025 kHz, Mono, 8-bit
11.025 kHz, Stereo, 8-bit
11.025 kHz, Mono, 16-bit
11.025 kHz, Stereo, 16-bit
22.05 kHz, Mono, 8-bit
22.05 kHz, Stereo, 8-bit
22.05 kHz, Mono, 16-bit
22.05 kHz, Stereo, 16-bit
44.1 kHz, Mono, 8-bit
44.1 kHz, Stereo, 8-bit
44.1 kHz, Mono, 16-bit
44.1 kHz, Stereo, 16-bit
44.1 kHz, Mono, 8-bit
44.1 kHz, Stereo, 8-bit
44.1 kHz, Mono, 16-bit
44.1 kHz, Stereo, 16-bit
48 kHz, Mono, 8-bit
48 kHz, Stereo, 8-bit
48 kHz, Mono, 16-bit
48 kHz, Stereo, 16-bit
96 kHz, Mono, 8-bit
96 kHz, Stereo, 8-bit
96 kHz, Mono, 16-bit
96 kHz, Stereo, 16-bit
Represents a Wave file format
The format tag -- always 0x0001 PCM
number of channels
sample rate
for buffer estimation
block size of data
number of bits per sample of mono data
number of following bytes
Initializes a new instance of the class.
PCM 48 kHz stereo 16-bit signed, interleaved, 2-channel format
Initializes a new instance of the class.
Sample Rate
Number of channels
Initializes a new instance of the class.
The rate.
The bits.
The channels.
Thrown when the channels argument is less than 1.
Returns the number of channels (1=mono,2=stereo etc)
Returns the sample rate (samples per second)
Returns the average number of bytes used per second
Returns the block alignment
Returns the number of bits per sample (usually 16 or 32, sometimes 24 or 8)
Can be 0 for some codecs
Returns the number of extra bytes used by this waveformat. Often 0,
except for compressed formats which store extra data after the WAVEFORMATEX header
Gets the size of a wave buffer equivalent to the latency in milliseconds.
The milliseconds.
The size
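The wave-format arithmetic documented above (block alignment, average bytes per second, and latency-to-buffer-size conversion) can be sketched in Python. The names and the 48 kHz/16-bit/stereo defaults are assumptions for illustration, not the library's API:

```python
def block_align(channels, bits_per_sample):
    """Bytes per frame: one sample per channel."""
    return channels * (bits_per_sample // 8)


def average_bytes_per_second(sample_rate, channels, bits_per_sample):
    return sample_rate * block_align(channels, bits_per_sample)


def buffer_size_for_latency(latency_ms, sample_rate=48000, channels=2, bits=16):
    """Size in bytes of a buffer holding latency_ms of audio,
    rounded down to a whole frame (block-align) boundary."""
    raw = average_bytes_per_second(sample_rate, channels, bits) * latency_ms // 1000
    align = block_align(channels, bits)
    return raw - (raw % align)
```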
Reports this WaveFormat as a string
String describing the wave format
Compares with another WaveFormat object
Object to compare to
True if the objects are the same
Provides a Hashcode for this WaveFormat
A hashcode
WaveHeader interop structure (WAVEHDR)
http://msdn.microsoft.com/en-us/library/dd743837%28VS.85%29.aspx
pointer to locked data buffer (lpData)
length of data buffer (dwBufferLength)
used for input only (dwBytesRecorded)
for client's use (dwUser)
assorted flags (dwFlags)
loop control counter (dwLoops)
PWaveHdr, reserved for driver (lpNext)
reserved for driver
Wave Header Flags enumeration
WHDR_BEGINLOOP
This buffer is the first buffer in a loop. This flag is used only with output buffers.
WHDR_DONE
Set by the device driver to indicate that it is finished with the buffer and is returning it to the application.
WHDR_ENDLOOP
This buffer is the last buffer in a loop. This flag is used only with output buffers.
WHDR_INQUEUE
Set by Windows to indicate that the buffer is queued for playback.
WHDR_PREPARED
Set by Windows to indicate that the buffer has been prepared with the waveInPrepareHeader or waveOutPrepareHeader function.
MME Wave function interop
CALLBACK_NULL
No callback
CALLBACK_FUNCTION
dwCallback is a FARPROC
CALLBACK_EVENT
dwCallback is an EVENT handle
CALLBACK_WINDOW
dwCallback is a HWND
CALLBACK_THREAD
callback is a thread ID
WIM_OPEN
WIM_CLOSE
WIM_DATA
WOM_CLOSE
WOM_DONE
WOM_OPEN
A wrapper class for MmException.
Initializes a new instance of the class.
The result returned by the Windows API call
The name of the Windows API that failed
Returns the Windows API result
Helper function to automatically raise an exception on failure
The result of the API call
The API function name
Creates an error message based on an error result.
The result.
The function.
A descriptive error message
A buffer of Wave samples for streaming to a Wave Output device
Initializes a new instance of the class.
WaveOut device to write to
Buffer size in bytes
Stream to provide more data
Lock to protect WaveOut API's from being called on >1 thread
Finalizes an instance of the class.
Whether the header's in queue flag is set
The buffer size in bytes
Releases resources held by this WaveBuffer
this is called by the Wave callback and should be used to refill the buffer.
This calls the .Read method on the stream
true when bytes were written. False if no bytes were written.
Releases resources held by this WaveBuffer
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Writes to wave out.
waveOutWrite
WaveOutCapabilities structure (based on WAVEOUTCAPS2 from mmsystem.h)
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/multimed/htm/_win32_waveoutcaps_str.asp
wMid
wPid
vDriverVersion
Product Name (szPname)
Supported formats (bit flags) dwFormats
Supported channels (1 for mono 2 for stereo) (wChannels)
Seems to be set to -1 on a lot of devices
wReserved1
Optional functionality supported by the device
Number of channels supported
Whether playback rate control is supported
Whether volume control is supported
Gets a value indicating whether this device supports independent channel volume control.
Gets a value indicating whether this device supports pitch control.
Gets a value indicating whether the device returns sample-accurate position information.
Gets a value indicating whether the driver is synchronous and will block while playing a buffer.
The product name
The device name Guid (if provided)
The product name Guid (if provided)
The manufacturer guid (if provided)
Checks to see if a given SupportedWaveFormat is supported
The SupportedWaveFormat
true if supported
Flags indicating what features this WaveOut device supports
supports pitch control (WAVECAPS_PITCH)
supports playback rate control (WAVECAPS_PLAYBACKRATE)
supports volume control (WAVECAPS_VOLUME)
supports separate left-right volume control (WAVECAPS_LRVOLUME)
(WAVECAPS_SYNC)
(WAVECAPS_SAMPLEACCURATE)
A wave player that opens an audio device and continuously feeds it
with audio samples using a wave provider.
Initializes a new instance of the class.
The renderer.
Finalizes an instance of the class.
Gets or sets the desired latency in milliseconds
Should be set before a call to Init
Gets or sets the number of buffers used
Should be set before a call to Init
Gets or sets the device number
Should be set before a call to Init
This must be between -1 and DeviceCount - 1.
-1 means stick to the default device even if the default device is changed
Gets an instance indicating the format the hardware is using.
Playback State
Gets the capabilities.
Initializes the specified wave provider.
The wave provider.
Can't re-initialize during playback
Start playing the audio from the WaveStream
Pause the audio
Stop and reset the WaveOut device
Gets the current position in bytes from the wave output device.
(n.b. this is not the same thing as the position within your reader
stream - it calls directly into waveOutGetPosition)
Position in bytes
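Because `waveOutGetPosition` reports a byte count rather than a time, converting it to elapsed playback time means dividing by the format's average byte rate. A hedged Python sketch of that conversion (function name illustrative):

```python
def position_time_seconds(position_bytes, average_bytes_per_second):
    # Bytes played so far divided by the format's byte rate gives
    # the elapsed playback time in seconds.
    return position_bytes / average_bytes_per_second
```

For 48 kHz 16-bit stereo, the byte rate is 192,000 bytes/s, so a reported position of 192,000 bytes corresponds to one second of playback.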
Closes this WaveOut device
Closes the WaveOut device and disposes of buffers
True if called from Dispose
Resume playing after a pause from the same position
Starts the playback thread.
Performs the continuous playback.
Closes the wave device.
Disposes the buffers.
Provides Audio Output capabilities by writing samples to the default audio output device.
Initializes a new instance of the class.
The media element.
Gets the output format of the audio
Gets the parent media element.
Gets or sets the volume.
The volume.
Gets or sets the balance (-1.0 to 1.0).
Gets or sets a value indicating whether the wave output is muted.
Gets the realtime latency of the audio relative to the internal wall clock.
A negative value means audio is ahead of the wall clock.
A positive value means audio is behind the wall clock.
Gets the current audio position.
Gets the desired latency of the audio device.
Value is always positive and typically 200ms. This means audio gets rendered up to this late behind the wall clock.
Gets the speed ratio.
Renders the specified media block.
The media block.
The clock position.
Called on every block rendering clock cycle just in case some update operation needs to be performed.
This needs to return immediately so the calling thread is not disturbed.
The clock position.
Executed when the Play method is called on the parent MediaElement
Executed when the Pause method is called on the parent MediaElement
Executed when the Stop method is called on the parent MediaElement
Executed when the Close method is called on the parent MediaElement
Executed after a Seek operation is performed on the parent MediaElement
Waits for the renderer to be ready to render.
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Called whenever the audio driver requests samples.
Do not call this method directly.
The render buffer.
The render buffer offset.
The requested bytes.
The number of bytes that were read.
Called when the application exits.
The sender.
The instance containing the event data.
Initializes the audio renderer.
Call the Play Method to start reading samples
Destroys the audio renderer, making it unusable.
Synchronizes audio rendering to the wall clock.
Returns true if additional samples need to be read.
Returns false if silence has been written and no further reading is required.
The target buffer.
The target buffer offset.
The requested bytes.
True to continue processing. False to write silence.
Reads from the Audio Buffer and stretches the samples to the required requested bytes.
This will make audio samples sound stretched (low pitch).
The result is put to the first requestedBytes count of the ReadBuffer.
The requested bytes.
Reads from the Audio Buffer and shrinks (averages) the samples to the required requested bytes.
This will make audio samples sound shrunken (high pitch).
The result is put to the first requestedBytes count of the ReadBuffer.
The requested number of bytes.
if set to true average samples per block. Otherwise, take the first sample per block only
Applies volume and balance to the audio samples stored in ReadBuffer and writes them
to the specified target buffer.
The target buffer.
The target buffer offset.
The requested number of bytes.
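The volume-and-balance step above can be illustrated on interleaved 16-bit stereo samples. This Python sketch assumes one common balance convention (a positive balance attenuates the left channel, a negative one the right); the library's exact gain curve may differ:

```python
def apply_volume_balance(samples, volume=1.0, balance=0.0):
    """samples: interleaved stereo 16-bit ints [L, R, L, R, ...].
    volume in [0, 1]; balance in [-1, 1] (assumed convention)."""
    left_gain = volume * (1.0 if balance <= 0 else 1.0 - balance)
    right_gain = volume * (1.0 if balance >= 0 else 1.0 + balance)
    out = []
    for i in range(0, len(samples), 2):
        # Scale each channel and clamp to the signed 16-bit range.
        left = int(max(-32768, min(32767, samples[i] * left_gain)))
        right = int(max(-32768, min(32767, samples[i + 1] * right_gain)))
        out.extend([left, right])
    return out
```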
Releases unmanaged and - optionally - managed resources.
true to release both managed and unmanaged resources; false to release only unmanaged resources.
Provides a unified API for media rendering classes
Gets the parent media element.
Waits for the renderer to be ready to render.
Executed when the Play method is called on the parent MediaElement
Executed when the Pause method is called on the parent MediaElement
Executed when the Stop method is called on the parent MediaElement
Executed when the Close method is called on the parent MediaElement
Executed after a Seek operation is performed on the parent MediaElement
Called when a media block is due rendering.
This needs to return immediately so the calling thread is not disturbed.
The media block.
The clock position.
Called on every block rendering clock cycle just in case some update operation needs to be performed.
This needs to return immediately so the calling thread is not disturbed.
The clock position.
Subtitle Renderer - Does nothing at this point.
The synchronize lock
Holds the text to be rendered when the Update method is called.
Holds the text that was last rendered when Update was called.
Initializes a new instance of the class.
The media element.
Gets the parent media element.
Executed when the Close method is called on the parent MediaElement
Executed when the Pause method is called on the parent MediaElement
Executed when the Play method is called on the parent MediaElement
Executed when the Stop method is called on the parent MediaElement
Executed after a Seek operation is performed on the parent MediaElement
Waits for the renderer to be ready to render.
Renders the specified media block.
The media block.
The clock position.
Called when a media block must stop being rendered.
This needs to return immediately so the calling thread is not disturbed.
The clock position.
Gets or creates the text blocks that make up the subtitle text and outline.
The text blocks including the fill and outline (5 total)
Sets the text to be rendered on the text blocks.
Returns immediately because it enqueues the action on the UI thread.
The text.
Provides Video Image Rendering via a WPF WriteableBitmap
The bitmap that is presented to the user.
Set when a bitmap is being written to the target bitmap
Initializes a new instance of the class.
The media element.
Gets the parent media element.
Executed when the Play method is called on the parent MediaElement
Executed when the Pause method is called on the parent MediaElement
Executed when the Stop method is called on the parent MediaElement
Executed when the Close method is called on the parent MediaElement
Executed after a Seek operation is performed on the parent MediaElement
Waits for the renderer to be ready to render.
Renders the specified media block.
This needs to return immediately so the calling thread is not disturbed.
The media block.
The clock position.
Called on every block rendering clock cycle just in case some update operation needs to be performed.
This needs to return immediately so the calling thread is not disturbed.
The clock position.
Initializes the target bitmap. Pass a null block to initialize with the default video properties.
The block.
Applies the scale transform according to the block's aspect ratio.
The b.
Defines the different log message types received by the log handler
The none message type
The information message type
The debug message type
The trace message type
The error message type
The warning message type
A Media Container Exception
Initializes a new instance of the class.
The message that describes the error.
Represents a set of options that are used to initialize a media container.
Initializes a new instance of the class.
Gets or sets the forced input format. If left null or empty,
the input format will be selected automatically.
Gets or sets a value indicating whether low resource mode is enabled.
In theory this should be 0, 1, 2, or 3 for 1, 1/2, 1/4 and 1/8 resolutions.
TODO: For now we only support 1/2 (the value true).
Port of lowres.
Gets or sets a value indicating whether fast decoding is enabled.
Port of fast
A dictionary of Format options.
Supported format options are specified in https://www.ffmpeg.org/ffmpeg-formats.html#Format-Options
Gets the codec options.
Codec options are documented here: https://www.ffmpeg.org/ffmpeg-codecs.html#Codec-Options
Port of codec_opts
Gets or sets a value indicating whether experimental hardware acceleration is enabled.
Defaults to false. This feature is experimental.
Gets or sets a value indicating whether PTS are generated automatically and not read
from the packets themselves. Defaults to false.
Port of genpts
Gets or sets the maximum duration to be analyzed before identifying stream information.
In realtime streams this can be reduced to reduce latency (i.e. TimeSpan.Zero)
Gets or sets the amount of bytes to probe before getting the stream info
In realtime streams probesize can be reduced to reduce latency.
Minimum value is 32.
Gets or sets the amount of time to wait for an open or read operation to complete.
Prevent reading from audio stream components.
Port of audio_disable
Prevent reading from video stream components.
Port of video_disable
Prevent reading from subtitle stream components.
Port of subtitle_disable
Subtitles are not yet first-class citizens in FFmpeg and
this is why they are disabled by default.
Allows for a custom video filter string.
Please see: https://ffmpeg.org/ffmpeg-filters.html#Video-Filters
Initially contains the best suitable video stream.
Can be changed to a different stream reference.
Allows for a custom audio filter string.
Please see: https://ffmpeg.org/ffmpeg-filters.html#Audio-Filters
Initially contains the best suitable audio stream.
Can be changed to a different stream reference.
Initially contains the best suitable subtitle stream.
Can be changed to a different stream reference.
Represents a set of codec options associated with a stream specifier.
Holds the internal list of option items
Initializes a new instance of the class.
Adds an option
The key.
The value.
Type of the stream.
Adds an option
The key.
The value.
Index of the stream.
Adds an option
The key.
The value.
Type of the stream.
Index of the stream.
Retrieves a dictionary with the options for the specified codec.
Port of filter_codec_opts
The codec identifier.
The format.
The stream.
The codec.
The filtered options
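FFmpeg stream specifiers (as documented at ffmpeg.org) are what make the per-stream option filtering above work: an empty specifier matches every stream, a type letter (`v`, `a`, `s`, `d`, `t`) matches all streams of that type, and `type:index` narrows it to one stream. A simplified Python sketch of that matching rule (not FFmpeg's full specifier grammar):

```python
def option_applies(option_spec, stream_type, stream_index):
    """option_spec examples: '' (all streams), 'v' (all video),
    'a:1' (the second audio stream). A simplification of FFmpeg's
    stream specifier syntax."""
    if not option_spec:
        return True
    parts = option_spec.split(":")
    if parts[0] and parts[0] != stream_type:
        return False
    if len(parts) > 1 and int(parts[1]) != stream_index:
        return False
    return True
```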
Retrieves an array of dictionaries, one for each stream index
https://ffmpeg.org/ffplay.html#toc-Options
Port of setup_find_stream_info_opts.
The format.
The options per stream
Converts a character to a media type.
The c.
The media type
Represents the event arguments of the MediaOpening routed event.
Initializes a new instance of the class.
The routed event.
The source.
The options.
The input information.
Set or change the options before the media is opened.
Provides internal details of the media, including its component streams.
Typically, options are set based on what this information contains.
A strongly-typed resource class, for looking up localized strings, etc.
Returns the cached ResourceManager instance used by this class.
Overrides the current thread's CurrentUICulture property for all
resource lookups using this strongly typed resource class.
Looks up a localized resource of type System.Drawing.Bitmap.