---
short-description: GStreamer Elements, Pipeline and the Bus
...

{{ ALERT_JS.md }}

# Basic tutorial 2: GStreamer concepts

## Goal

The previous tutorial showed how to build a pipeline automatically. Now
we are going to build a pipeline manually by instantiating each element
and linking them all together. In the process, we will learn:

- What is a GStreamer element and how to create one.

- How to connect elements to each other.

- How to customize an element's behavior.

- How to watch the bus for error conditions and extract information
  from GStreamer messages.

## Manual Hello World

{{ C+JS_FALLBACK.md }}
Copy this code into a text file named `basic-tutorial-2.c` (or find it
in your GStreamer installation).

**basic-tutorial-2.c**

{{ tutorials/basic-tutorial-2.c }}

> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux], [Mac OS X] or [Windows], or use this specific command on Linux:
>
> `` gcc basic-tutorial-2.c -o basic-tutorial-2 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux][1], [Mac OS X][2] or [Windows][3].
>
> This tutorial opens a window and displays a test pattern, without audio.
>
> Required libraries: `gstreamer-1.0`
{{ END_LANG.md }}

{{ PY.md }}
Copy this code into a text file named `basic-tutorial-2.py` (or find it
in your GStreamer installation).

**basic-tutorial-2.py**

{{ tutorials/python/basic-tutorial-2.py }}

Then, you can run the file with `python3 basic-tutorial-2.py`
{{ END_LANG.md }}

## Walkthrough

The *elements* are GStreamer's basic construction blocks. They process
the data as it flows *downstream* from the source elements (data producers)
to the sink elements (data consumers), passing through filter elements.



**Figure 1**. Example pipeline

### Element creation

We will skip GStreamer initialization, since it is the same as the
previous tutorial:

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[13:16] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[18:21] }}
{{ END_LANG.md }}

As seen in this code, new elements can be created
with [gst_element_factory_make]\(). The first parameter is the type of
element to create ([Basic tutorial 14: Handy elements] shows a
few common types, and [Basic tutorial 10: GStreamer tools] shows how to
obtain the list of all available types). The second parameter is the
name we want to give to this particular instance. Naming your elements
is useful to retrieve them later if you didn't keep a pointer (and for
more meaningful debug output). If you pass [NULL] for the name, however,
GStreamer will provide a unique name for you.

For this tutorial we create two elements: a [videotestsrc] and
an [autovideosink]. There are no filter elements. Hence, the pipeline would
look like the following:



**Figure 2**. Pipeline built in this tutorial

[videotestsrc] is a source element (it produces data), which creates a
test video pattern. This element is useful for debugging purposes (and
tutorials) and is not usually found in real applications.

[autovideosink] is a sink element (it consumes data), which displays the
images it receives on a window. There exist several video sinks,
depending on the operating system, with a varying range of capabilities.
[autovideosink] automatically selects and instantiates the best one, so
you do not have to worry about the details, and your code is more
platform-independent.

### Pipeline creation

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[17:19] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[22:24] }}
{{ END_LANG.md }}

All elements in GStreamer must typically be contained inside a pipeline
before they can be used, because it takes care of some clocking and
messaging functions. We create the pipeline with [gst_pipeline_new]\().

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[25:32] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[30:35] }}
{{ END_LANG.md }}

A pipeline is a particular type of [bin], which is the element used to
contain other elements. Therefore all methods which apply to bins also
apply to pipelines.
{{ C+JS_FALLBACK.md }}
In our case, we call [gst_bin_add_many]\() to add the
elements to the pipeline (mind the cast). This function accepts a list
of elements to be added, ending with [NULL]. Individual elements can be
added with [gst_bin_add]\().
{{ END_LANG.md }}
{{ PY.md }}
In our case, we call [gst_bin_add]\() to add elements to the pipeline.
The function accepts any number of Gst.Element objects as its arguments.
{{ END_LANG.md }}
These elements, however, are not linked with each other yet. For this,
we need to use [gst_element_link]\(). Its first parameter is the source,
and the second one the destination. The order counts, because links must
be established following the data flow (that is, from source elements to
sink elements). Keep in mind that only elements residing in the same bin
can be linked together, so remember to add them to the pipeline before
trying to link them!

### Properties

GStreamer elements are all a particular kind of [GObject], which is the
entity offering **property** facilities.

Most GStreamer elements have customizable properties: named attributes
that can be modified to change the element's behavior (writable
properties) or inquired to find out about the element's internal state
(readable properties).

{{ C+JS_FALLBACK.md }}
Properties are read from with [g_object_get]\() and written to
with [g_object_set]\().

[g_object_set]\() accepts a [NULL]-terminated list of property-name,
property-value pairs, so multiple properties can be changed in one go.

This is why the property handling methods have the `g_` prefix.
{{ END_LANG.md }}

{{ PY.md }}
To understand how to get and set [properties](https://pygobject.readthedocs.io/en/latest/guide/api/properties.html),
let us assume we have a Gst.Element `source` with a property `pattern`.

The current value of a property can be fetched by either:

1. Accessing the property as an attribute of the `props` attribute of an
   element. Ex: `print(source.props.pattern)` to print it on the screen.
2. Using the `get_property` method of the element.
   Ex: `pattern = source.get_property("pattern")`

And properties can be set by one of three methods:

1. Setting the property as an attribute of the `props` attribute.
   Ex: `source.props.pattern = 1` or equivalently `source.props.pattern = "snow"`
2. Using the `set_property` method of the element.
   Ex: `source.set_property("pattern", 1)` or equivalently `source.set_property("pattern", "snow")`
3. Using the `Gst.util_set_object_arg()` function. This mode also allows you to
   pass Gst caps and other structures. Ex: `Gst.util_set_object_arg(source, "pattern", "snow")`,
   or equivalently, `Gst.util_set_object_arg(source, "pattern", 1)`

Note: In all three methods of setting a property, if a string is passed as
the value to set, it has to be the serialized version of a flag or value
(using [gst_value_serialize]\()).
{{ END_LANG.md }}

Coming back to what's in the example above,
{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[33:35] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[36:40] }}
{{ END_LANG.md }}

The code above changes the “pattern” property of [videotestsrc],
which controls the type of test video the element outputs. Try different
values!

The names and possible values of all the properties an element exposes
can be found using the gst-inspect-1.0 tool described in [Basic tutorial 10:
GStreamer tools] or alternatively in the docs for that element
([here](GstVideoTestSrcPattern) in the case of videotestsrc).

### Error checking

At this point, we have the whole pipeline built and set up, and the rest
of the tutorial is very similar to the previous one, but we are going to
add more error checking:

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[36:43] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[41:46] }}
{{ END_LANG.md }}

We call [gst_element_set_state]\(), but this time we check its return
value for errors. Changing states is a delicate process and a few more
details are given in [Basic tutorial 3: Dynamic pipelines].

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-2.c[44:75] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-2.py[47:62] }}
{{ END_LANG.md }}

[gst_bus_timed_pop_filtered]\() waits for execution to end and returns
with a [GstMessage] which we previously ignored. We
asked [gst_bus_timed_pop_filtered]\() to return when GStreamer
encountered either an error condition or an [EOS], so we need to check
which one happened, and print a message on screen (your application will
probably want to undertake more complex actions).

[GstMessage] is a very versatile structure which can deliver virtually
any kind of information. Fortunately, GStreamer provides a series of
parsing functions for each kind of message.

In this case, once we know the message contains an error (by using the
[GST_MESSAGE_TYPE]\() macro), we can use
[gst_message_parse_error]\() which returns a GLib [GError] error
structure and a string useful for debugging. Examine the code to see how
these are used and freed afterward.

### The GStreamer bus

At this point it is worth introducing the GStreamer bus a bit more
formally. It is the object responsible for delivering to the application
the [GstMessage]s generated by the elements, in order and to the
application thread. This last point is important, because the actual
streaming of media is done in a different thread from the application's.

Messages can be extracted from the bus synchronously with
[gst_bus_timed_pop_filtered]\() and its siblings, or asynchronously,
using signals (shown in the next tutorial). Your application should
always keep an eye on the bus to be notified of errors and other
playback-related issues.

The rest of the code is the cleanup sequence, which is the same as
in [Basic tutorial 1: Hello world!].

## Exercise

If you feel like practicing, try this exercise: Add a video filter
element in between the source and the sink of this pipeline. Use
[vertigotv] for a nice effect. You will need to create it, add it to the
pipeline, and link it with the other elements.

Depending on your platform and available plugins, you might get a
“negotiation” error, because the sink does not understand what the
filter is producing (more about negotiation in [Basic tutorial 6: Media
formats and Pad Capabilities]).
In this case, try to add an element called [videoconvert] after the
filter (that is, build a pipeline of 4 elements. More on
[videoconvert] in [Basic tutorial 14: Handy elements]).

## Conclusion

This tutorial showed:

- How to create elements with [gst_element_factory_make]\()

- How to create an empty pipeline with [gst_pipeline_new]\()

- How to add elements to the pipeline with [gst_bin_add_many]\()

- How to link the elements with each other with [gst_element_link]\()

This concludes the first of the two tutorials devoted to basic GStreamer
concepts. The second one comes next.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

[Linux]: installing/on-linux.md#InstallingonLinux-Build
[Mac OS X]: installing/on-mac-osx.md#InstallingonMacOSX-Build
[Windows]: installing/on-windows.md#InstallingonWindows-Build
[1]: installing/on-linux.md#InstallingonLinux-Run
[2]: installing/on-mac-osx.md#InstallingonMacOSX-Run
[3]: installing/on-windows.md#InstallingonWindows-Run
[Basic tutorial 14: Handy elements]: tutorials/basic/handy-elements.md
[Basic tutorial 10: GStreamer tools]: tutorials/basic/gstreamer-tools.md
[Basic tutorial 3: Dynamic pipelines]: tutorials/basic/dynamic-pipelines.md
[Basic tutorial 1: Hello world!]: tutorials/basic/hello-world.md
[Basic tutorial 6: Media formats and Pad Capabilities]: tutorials/basic/media-formats-and-pad-capabilities.md
[gst_element_factory_make]: gst_element_factory_make
[videotestsrc]: videotestsrc
[autovideosink]: autovideosink
[bin]: GstBin
[NULL]: NULL
[gst_bin_add_many]: gst_bin_add_many
[gst_bin_add]: gst_bin_add
[gst_element_link]: gst_element_link
[GObject]: GObject
[gst_value_serialize]: gst_value_serialize
[g_object_get]: g_object_get
[g_object_set]: g_object_set
[gst_element_set_state]: gst_element_set_state
[gst_bus_timed_pop_filtered]: gst_bus_timed_pop_filtered
[GstMessage]: GstMessage
[EOS]: GST_MESSAGE_EOS
[GST_MESSAGE_TYPE]: GST_MESSAGE_TYPE
[gst_message_parse_error]: gst_message_parse_error
[GError]: GError
[vertigotv]: vertigotv
[videoconvert]: videoconvert
[gst_pipeline_new]: gst_pipeline_new
# Basic tutorial 11: Debugging tools

## Goal

Sometimes things won’t go as expected and the error messages retrieved
from the bus (if any) just don’t provide enough information. Luckily,
GStreamer ships with massive amounts of debug information, which usually
hints at what the problem might be. This tutorial shows:

- How to get more debug information from GStreamer.

- How to print your own debug information into the GStreamer log.

- How to get pipeline graphs.

## Printing debug information

### The debug log

GStreamer and its plugins are full of debug traces, that is, places in
the code where a particularly interesting piece of information is
printed to the console, along with time stamping, process, category,
source code file, function and element information.

The debug output is controlled with the `GST_DEBUG` environment
variable. Here’s an example with `GST_DEBUG=2`:

```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```

As you can see, this is quite a bit of information. In fact, the
GStreamer debug log is so verbose, that when fully enabled it can render
applications unresponsive (due to the console scrolling) or fill up
megabytes of text files (when redirected to a file). For this reason,
the logs are categorized, and you seldom need to enable all categories
at once.

The first category is the Debug Level, which is a number specifying the
amount of desired output:

| # | Name    | Description |
|---|---------|-------------|
| 0 | none    | No debug information is output. |
| 1 | ERROR   | Logs all fatal errors. These are errors that do not allow the core or elements to perform the requested action. The application can still recover if programmed to handle the conditions that triggered the error. |
| 2 | WARNING | Logs all warnings. Typically these are non-fatal, but user-visible, problems that are expected to happen. |
| 3 | FIXME   | Logs all "fixme" messages. These typically indicate that a codepath that is known to be incomplete has been triggered. It may work in most cases, but may cause problems in specific instances. |
| 4 | INFO    | Logs all informational messages. These are typically used for events in the system that only happen once, or are important and rare enough to be logged at this level. |
| 5 | DEBUG   | Logs all debug messages. These are general debug messages for events that happen only a limited number of times during an object's lifetime; these include setup, teardown, change of parameters, etc. |
| 6 | LOG     | Logs all log messages. These are messages for events that happen repeatedly during an object's lifetime; these include streaming and steady-state conditions. This is used for log messages that happen on every buffer in an element, for example. |
| 7 | TRACE   | Logs all trace messages. These are messages that happen very, very often, for example each time the reference count of a GstMiniObject, such as a GstBuffer or GstEvent, is modified. |
| 9 | MEMDUMP | Logs all memory dump messages. This is the heaviest logging and may include dumping the content of blocks of memory. |

To enable debug output, set the `GST_DEBUG` environment variable to the
desired debug level. All levels below that will also be shown (i.e., if
you set `GST_DEBUG=2`, you will get both `ERROR` and
`WARNING` messages).

Furthermore, each plugin or part of GStreamer defines its own
category, so you can specify a debug level for each individual category.
For example, `GST_DEBUG=2,audiotestsrc:6` will use Debug Level 6 for
the `audiotestsrc` element, and 2 for all the others.

The `GST_DEBUG` environment variable, then, is a comma-separated list of
*category*:*level* pairs, with an optional *level* at the beginning,
representing the default debug level for all categories.

The `'*'` wildcard is also available. For example
`GST_DEBUG=2,audio*:6` will use Debug Level 6 for all categories
starting with the word `audio`. `GST_DEBUG=*:2` is equivalent to
`GST_DEBUG=2`.

Use `gst-launch-1.0 --gst-debug-help` to obtain the list of all
registered categories. Bear in mind that each plugin registers its own
categories, so, when installing or removing plugins, this list can
change.

Use `GST_DEBUG` when the error information posted on the GStreamer bus
does not help you nail down a problem. It is common practice to redirect
the output log to a file, and then examine it later, searching for
specific messages.

GStreamer allows for custom debugging information handlers, but when
using the default one, the content of each line in the debug output
looks like:

```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```

And this is how the information is formatted:

| Example | Explained |
|---------|-----------|
| `0:00:00.868050000` | Time stamp in HH:MM:SS.sssssssss format since the start of the program. |
| `1592` | Process ID from which the message was issued. Useful when your problem involves multiple processes. |
| `09F62420` | Thread ID from which the message was issued. Useful when your problem involves multiple threads. |
| `WARN` | Debug level of the message. |
| `filesrc` | Debug Category of the message. |
| `gstfilesrc.c:1044` | Source file and line in the GStreamer source code where this message was issued. |
| `gst_file_src_start` | Function that issued the message. |
| `<filesrc0>` | Name of the object that issued the message. It can be an element, a pad, or something else. Useful when you have multiple elements of the same kind and need to distinguish among them. Naming your elements with the name property makes this debug output more readable, but GStreamer assigns each new element a unique name by default. |
| `error: No such file ...` | The actual message. |

### Adding your own debug information

In the parts of your code that interact with GStreamer, it is
worth using GStreamer’s debugging facilities. That way, you
have all debug output in the same file and the temporal relationship
between different messages is preserved.

To do so, use the `GST_ERROR()`, `GST_WARNING()`, `GST_INFO()`,
`GST_LOG()` and `GST_DEBUG()` macros. They accept the same parameters as
`printf`, and they use the `default` category (`default` will be shown
as the Debug category in the output log).

To change the category to something more meaningful, add these two lines
at the top of your code:

``` c
GST_DEBUG_CATEGORY_STATIC (my_category);
#define GST_CAT_DEFAULT my_category
```

And then this one after you have initialized GStreamer with
`gst_init()`:

``` c
GST_DEBUG_CATEGORY_INIT (my_category, "my category", 0, "This is my very own");
```

This registers a new category (that is, only for the duration of your
application: it is not stored in any file), and sets it as the default
category for your code. See the documentation
for `GST_DEBUG_CATEGORY_INIT()`.

### Getting pipeline graphs

For those cases where your pipeline starts to grow too large and you
lose track of what is connected with what, GStreamer has the capability
to output graph files. These are `.dot` files, readable with free
programs like [GraphViz](http://www.graphviz.org), that describe the
topology of your pipeline, along with the caps negotiated in each link.

This is also very handy when using all-in-one elements like `playbin`
or `uridecodebin`, which instantiate several elements inside them. Use
the `.dot` files to learn what pipeline they have created inside (and
learn a bit of GStreamer along the way).

To obtain `.dot` files, simply set
the `GST_DEBUG_DUMP_DOT_DIR` environment variable to point to the
folder where you want the files to be placed. `gst-launch-1.0` will create
a `.dot` file at each state change, so you can see the evolution of the
caps negotiation. Unset the variable to disable this facility. From
within your application, you can use the
`GST_DEBUG_BIN_TO_DOT_FILE()` and
`GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS()` macros to generate `.dot` files
at your convenience.

Here you have an example of the kind of pipelines that `playbin`
generates. It is very complex because `playbin` can handle many
different cases: your manual pipelines normally do not need to be this
long. If your manual pipeline is starting to get very big, consider
using `playbin`.



To download the full-size picture, use the attachments link at the top
of this page (it's the paperclip icon).

## Conclusion

This tutorial has shown:

- How to get more debug information from GStreamer using the
  `GST_DEBUG` environment variable.
- How to print your own debug information into the GStreamer log with
  the `GST_ERROR()` macro and relatives.
- How to get pipeline graphs with the
  `GST_DEBUG_DUMP_DOT_DIR` environment variable.

It has been a pleasure having you here, and see you soon!
# Basic tutorial 3: Dynamic pipelines

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

This tutorial shows the rest of the basic concepts required to use
GStreamer, which allow building the pipeline "on the fly", as
information becomes available, instead of having a monolithic pipeline
defined at the beginning of your application.

After this tutorial, you will have the necessary knowledge to start the
[Playback tutorials](tutorials/playback/index.md). The points reviewed
here will be:

- How to attain finer control when linking elements.

- How to be notified of interesting events so you can react in time.

- The various states in which an element can be.

## Introduction

As you are about to see, the pipeline in this tutorial is not
completely built before it is set to the playing state. This is OK. If
we did not take further action, data would reach the end of the
pipeline and the pipeline would produce an error message and stop. But
we are going to take further action...

In this example we are opening a file which is multiplexed (or *muxed*),
that is, audio and video are stored together inside a *container* file.
The elements responsible for opening such containers are called
*demuxers*, and some examples of container formats are Matroska (MKV),
Quick Time (QT, MOV), Ogg, or Advanced Systems Format (ASF, WMV, WMA).

If a container embeds multiple streams (one video and two audio tracks,
for example), the demuxer will separate them and expose them through
different output ports. In this way, different branches can be created
in the pipeline, dealing with different types of data.

The ports through which GStreamer elements communicate with each other
are called pads (`GstPad`). There exist sink pads, through which data
enters an element, and source pads, through which data exits an element.
It follows naturally that source elements only contain source pads, sink
elements only contain sink pads, and filter elements contain
both.

  

**Figure 1**. GStreamer elements with their pads.

A demuxer contains one sink pad, through which the muxed data arrives,
and multiple source pads, one for each stream found in the container:



**Figure 2**. A demuxer with two source pads.

For completeness, here you have a simplified pipeline containing a
demuxer and two branches, one for audio and one for video. This is
**NOT** the pipeline that will be built in this example:



**Figure 3**. Example pipeline with two branches.

The main complexity when dealing with demuxers is that they cannot
produce any information until they have received some data and have had
a chance to look at the container to see what is inside. That is,
demuxers start with no source pads to which other elements can link, and
thus the pipeline must necessarily terminate at them.

The solution is to build the pipeline from the source down to the
demuxer, and set it to run (play). When the demuxer has received enough
information to know about the number and kind of streams in the
container, it will start creating source pads. This is the right time
for us to finish building the pipeline and attach it to the newly added
demuxer pads.

For simplicity, in this example, we will only link to the audio pad and
ignore the video.

## Dynamic Hello World

Copy this code into a text file named `basic-tutorial-3.c` (or find it
in your GStreamer installation).

**basic-tutorial-3.c**

``` c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *resample;
  GstElement *sink;
} CustomData;

/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.source = gst_element_factory_make ("uridecodebin", "source");
  data.convert = gst_element_factory_make ("audioconvert", "convert");
  data.resample = gst_element_factory_make ("audioresample", "resample");
  data.sink = gst_element_factory_make ("autoaudiosink", "sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.source || !data.convert || !data.resample || !data.sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline. Note that we are NOT linking the source at this
   * point. We will do it later. */
  gst_bin_add_many (GST_BIN (data.pipeline), data.source, data.convert, data.resample, data.sink, NULL);
  if (!gst_element_link_many (data.convert, data.resample, data.sink, NULL)) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.source, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

  /* Connect to the pad-added signal */
  g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("Pipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
          }
          break;
        default:
|
||||
/* We should not reach here */
|
||||
g_printerr ("Unexpected message received.\n");
|
||||
break;
|
||||
}
|
||||
gst_message_unref (msg);
|
||||
}
|
||||
} while (!terminate);
|
||||
|
||||
/* Free resources */
|
||||
gst_object_unref (bus);
|
||||
gst_element_set_state (data.pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (data.pipeline);
|
||||
return 0;
|
||||
}
|
||||
|
||||
/* This function will be called by the pad-added signal */
|
||||
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
|
||||
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
|
||||
GstPadLinkReturn ret;
|
||||
GstCaps *new_pad_caps = NULL;
|
||||
GstStructure *new_pad_struct = NULL;
|
||||
const gchar *new_pad_type = NULL;
|
||||
|
||||
g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));
|
||||
|
||||
/* If our converter is already linked, we have nothing to do here */
|
||||
if (gst_pad_is_linked (sink_pad)) {
|
||||
g_print ("We are already linked. Ignoring.\n");
|
||||
goto exit;
|
||||
}
|
||||
|
||||
/* Check the new pad's type */
|
||||
new_pad_caps = gst_pad_get_current_caps (new_pad);
|
||||
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
|
||||
new_pad_type = gst_structure_get_name (new_pad_struct);
|
||||
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
|
||||
g_print ("It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
|
||||
goto exit;
|
||||
}
|
||||
|
||||
/* Attempt the link */
|
||||
ret = gst_pad_link (new_pad, sink_pad);
|
||||
if (GST_PAD_LINK_FAILED (ret)) {
|
||||
g_print ("Type is '%s' but link failed.\n", new_pad_type);
|
||||
} else {
|
||||
g_print ("Link succeeded (type '%s').\n", new_pad_type);
|
||||
}
|
||||
|
||||
exit:
|
||||
/* Unreference the new pad's caps, if we got them */
|
||||
if (new_pad_caps != NULL)
|
||||
gst_caps_unref (new_pad_caps);
|
||||
|
||||
/* Unreference the sink pad */
|
||||
gst_object_unref (sink_pad);
|
||||
}
|
||||
```
|
||||
|
||||
> 
|
||||
> Need help?
|
||||
>
|
||||
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
|
||||
> ``gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-1.0` ``
|
||||
>
|
||||
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
|
||||
>
|
||||
> This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.
|
||||
>
|
||||
>Required libraries: `gstreamer-1.0`
|
||||
|
||||
## Walkthrough

``` c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *resample;
  GstElement *sink;
} CustomData;
```

So far we have kept all the information we needed (pointers
to `GstElement`s, basically) as local variables. Since this tutorial
(and most real applications) involves callbacks, we will group all our
data in a structure for easier handling.

``` c
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
```

This is a forward reference, to be used later.

``` c
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.resample = gst_element_factory_make ("audioresample", "resample");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
```

We create the elements as usual. `uridecodebin` will internally
instantiate all the necessary elements (sources, demuxers and decoders)
to turn a URI into raw audio and/or video streams. It does half the work
that `playbin` does. Since it contains demuxers, its source pads are
not initially available and we will need to link to them on the fly.

`audioconvert` is useful for converting between different audio formats,
making sure that this example will work on any platform, since the
format produced by the audio decoder might not be the same as the one the
audio sink expects.

`audioresample` is useful for converting between different audio sample rates,
similarly making sure that this example will work on any platform, since the
audio sample rate produced by the audio decoder might not be one that the audio
sink supports.

The `autoaudiosink` is the equivalent of `autovideosink` seen in the
previous tutorial, for audio. It will render the audio stream to the
audio card.

``` c
if (!gst_element_link_many (data.convert, data.resample, data.sink, NULL)) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (data.pipeline);
  return -1;
}
```

Here we link the elements converter, resample and sink, but we **DO NOT** link
them with the source, since at this point it contains no source pads. We
just leave this branch (converter + sink) unlinked, until later on.

``` c
/* Set the URI to play */
g_object_set (data.source, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
```

We set the URI of the file to play via a property, just like we did in
the previous tutorial.

### Signals

``` c
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
```

`GSignals` are a crucial point in GStreamer. They allow you to be
notified (by means of a callback) when something interesting has
happened. Signals are identified by a name, and each `GObject` has its
own signals.

In this line, we are *attaching* to the “pad-added” signal of our source
(an `uridecodebin` element). To do so, we use `g_signal_connect()` and
provide the callback function to be used (`pad_added_handler`) and a
data pointer. GStreamer does nothing with this data pointer, it just
forwards it to the callback so we can share information with it. In this
case, we pass a pointer to the `CustomData` structure we built specially
for this purpose.

The signals that a `GstElement` generates can be found in its
documentation or using the `gst-inspect-1.0` tool as described in [Basic
tutorial 10: GStreamer tools](tutorials/basic/gstreamer-tools.md).

We are now ready to go! Just set the pipeline to the `PLAYING` state and
start listening to the bus for interesting messages (like `ERROR` or `EOS`),
just like in the previous tutorials.

### The callback

When our source element finally has enough information to start
producing data, it will create source pads, and trigger the “pad-added”
signal. At this point our callback will be called:

``` c
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
```

`src` is the `GstElement` which triggered the signal. In this example,
it can only be the `uridecodebin`, since it is the only element whose
signal we have attached to. The first parameter of a signal handler is always the object
that has triggered it.

`new_pad` is the `GstPad` that has just been added to the `src` element.
This is usually the pad to which we want to link.

`data` is the pointer we provided when attaching to the signal. In this
example, we use it to pass the `CustomData` pointer.

``` c
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
```

From `CustomData` we extract the converter element, and then retrieve
its sink pad using `gst_element_get_static_pad ()`. This is the pad to
which we want to link `new_pad`. In the previous tutorial we linked
element against element, and let GStreamer choose the appropriate pads.
Now we are going to link the pads directly.

``` c
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
  g_print ("We are already linked. Ignoring.\n");
  goto exit;
}
```

`uridecodebin` can create as many pads as it sees fit, and for each one,
this callback will be called. These lines of code will prevent us from
trying to link to a new pad once we are already linked.

``` c
/* Check the new pad's type */
new_pad_caps = gst_pad_get_current_caps (new_pad);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
  g_print ("It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
  goto exit;
}
```

Now we will check the type of data this new pad is going to output, because we
are only interested in pads producing audio. We have previously created a
piece of pipeline which deals with audio (an `audioconvert` linked with an
`audioresample` and an `autoaudiosink`), and we will not be able to link it to
a pad producing video, for example.

`gst_pad_get_current_caps()` retrieves the current *capabilities* of the pad
(that is, the kind of data it currently outputs), wrapped in a `GstCaps`
structure. All possible caps a pad can support can be queried with
`gst_pad_query_caps()`. A pad can offer many capabilities, and hence `GstCaps`
can contain many `GstStructure`s, each representing a different capability. The
current caps on a pad will always have a single `GstStructure` and represent a
single media format, or if there are no current caps yet `NULL` will be
returned.

Since, in this case, we know that the pad we want only has one
capability (audio), we retrieve the first `GstStructure` with
`gst_caps_get_structure()`.

Finally, with `gst_structure_get_name()` we recover the name of the
structure, which contains the main description of the format (its *media
type*, actually).

If the name is not `audio/x-raw`, this is not a decoded
audio pad, and we are not interested in it.

Otherwise, attempt the link:

``` c
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
  g_print ("Type is '%s' but link failed.\n", new_pad_type);
} else {
  g_print ("Link succeeded (type '%s').\n", new_pad_type);
}
```

`gst_pad_link()` tries to link two pads. As was the case
with `gst_element_link()`, the link must be specified from source to
sink, and both pads must be owned by elements residing in the same bin
(or pipeline).

And we are done! When a pad of the right kind appears, it will be
linked to the rest of the audio-processing pipeline and execution will
continue until ERROR or EOS. However, we will squeeze a bit more content
from this tutorial by also introducing the concept of State.

### GStreamer States

We already talked a bit about states when we said that playback does not
start until you bring the pipeline to the `PLAYING` state. We will
introduce here the rest of the states and their meaning. There are 4 states
in GStreamer:

| State | Description |
|-----------|--------------------|
| `NULL` | the NULL state or initial state of an element. |
| `READY` | the element is ready to go to PAUSED. |
| `PAUSED` | the element is PAUSED, it is ready to accept and process data. Sink elements however only accept one buffer and then block. |
| `PLAYING` | the element is PLAYING, the clock is running and the data is flowing. |

You can only move between adjacent ones, that is, you can't go from `NULL`
to `PLAYING`: you have to go through the intermediate `READY` and `PAUSED`
states. If you set the pipeline to `PLAYING`, though, GStreamer will make
the intermediate transitions for you.

``` c
case GST_MESSAGE_STATE_CHANGED:
  /* We are only interested in state-changed messages from the pipeline */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
  }
  break;
```

We added this piece of code that listens to bus messages regarding state
changes and prints them on screen to help you understand the
transitions. Every element puts messages on the bus regarding its
current state, so we filter them out and only listen to messages coming
from the pipeline.

Most applications only need to worry about going to `PLAYING` to start
playback, then to `PAUSED` to perform a pause, and then back to `NULL` at
program exit to free all resources.

## Exercise

Dynamic pad linking has traditionally been a difficult topic for a lot
of programmers. Prove that you have achieved its mastery by
instantiating an `autovideosink` (probably with a `videoconvert` in
front) and linking it to the demuxer when the right pad appears. Hint: You
are already printing on screen the type of the video pads.

You should now see (and hear) the same movie as in [Basic tutorial 1:
Hello world!](tutorials/basic/hello-world.md). In
that tutorial you used `playbin`, which is a handy element that
automatically takes care of all the demuxing and pad linking for you.
Most of the [Playback tutorials](tutorials/playback/index.md) are devoted
to `playbin`.

## Conclusion

In this tutorial, you learned:

- How to be notified of events using `GSignals`
- How to connect `GstPad`s directly instead of their parent elements.
- The various states of a GStreamer element.

You also combined these items to build a dynamic pipeline, which was not
defined at program start, but was created as information regarding the
media became available.

You can now continue with the basic tutorials and learn about performing
seeks and time-related queries in [Basic tutorial 4: Time
management](tutorials/basic/time-management.md) or move
to the [Playback tutorials](tutorials/playback/index.md), and gain more
insight about the `playbin` element.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!

# Basic tutorial 10: GStreamer tools

## Goal

GStreamer comes with a set of tools which range from handy to
absolutely essential. There is no code in this tutorial, just sit back
and relax, and we will teach you:

- How to build and run GStreamer pipelines from the command line,
  without using C at all!
- How to find out what GStreamer elements you have available and their
  capabilities.
- How to discover the internal structure of media files.

## Introduction

These tools are available in the bin directory of the GStreamer
binaries. You need to move to this directory to execute them, because
it is not added to the system’s `PATH` environment variable (to avoid
polluting it too much).

Just open a terminal (or console window) and go to the `bin` directory
of your GStreamer installation (Read again the [Installing
GStreamer](installing/index.md) section to find out where this is),
and you are ready to start typing the commands given in this tutorial.

> 
|
||||
>
|
||||
> On Linux, you should use the GStreamer version installed with your
|
||||
> distribution, the tools should be installed with a package named `gstreamer1`
|
||||
> on Fedora style distributions, or `gstreamer1.0-tools` on Debian/Ubuntu style
|
||||
> distributions.
|
||||
|
||||
In order to allow for multiple versions of GStreamer to coexists in the
|
||||
same system, these tools are versioned, this is, a GStreamer version
|
||||
number is appended to their name. This version is based on
|
||||
GStreamer 1.0, so the tools are called `gst-launch-1.0`,
|
||||
`gst-inspect-1.0` and `gst-discoverer-1.0`
|
||||
|
||||
## `gst-launch-1.0`

This tool accepts a textual description of a pipeline, instantiates it,
and sets it to the PLAYING state. It allows you to quickly check if a
given pipeline works, before going through the actual implementation
using GStreamer API calls.

Bear in mind that it can only create simple pipelines. In particular, it
can only simulate the interaction of the pipeline with the application
up to a certain level. In any case, it is extremely handy to test
pipelines quickly, and is used by GStreamer developers around the world
on a daily basis.

Please note that `gst-launch-1.0` is primarily a debugging tool for
developers. You should not build applications on top of it. Instead, use
the `gst_parse_launch()` function of the GStreamer API as an easy way to
construct pipelines from pipeline descriptions.

Although the rules to construct pipeline descriptions are very simple,
the concatenation of multiple elements can quickly make such
descriptions resemble black magic. Fear not, for everyone learns the
`gst-launch-1.0` syntax, eventually.

The command line for `gst-launch-1.0` consists of a list of options followed
by a PIPELINE-DESCRIPTION. Some simplified instructions are given next;
see the complete documentation at [the reference page](tools/gst-launch.md)
for `gst-launch-1.0`.

||||
### Elements

In simple form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (!). Go ahead and type in the following
command:

```
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```

You should see a window with an animated video pattern. Use CTRL+C on
the terminal to stop the program.

This instantiates a new element of type `videotestsrc` (an element which
generates a sample video pattern), a `videoconvert` (an element
which does raw video format conversion, making sure other elements can
understand each other), and an `autovideosink` (a window to which video
is rendered). Then, GStreamer tries to link the output of each element
to the input of the element appearing on its right in the description.
If more than one input or output Pad is available, the Pad Caps are used
to find two compatible Pads.

### Properties

Properties may be appended to elements, in the form
*property=value* (multiple properties can be specified, separated by
spaces). Use the `gst-inspect-1.0` tool (explained next) to find out the
available properties for an element.

```
gst-launch-1.0 videotestsrc pattern=11 ! videoconvert ! autovideosink
```

You should see a static video pattern, made of circles.

### Named elements

Elements can be named using the `name` property; in this way complex
pipelines involving branches can be created. Names allow linking to
elements created previously in the description, and are indispensable to
use elements with multiple output pads, like demuxers or tees, for
example.

Named elements are referred to using their name followed by a dot.

```
gst-launch-1.0 videotestsrc ! videoconvert ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```

You should see two video windows, showing the same sample video pattern.
If you see only one, try to move it, since it is probably on top of the
second window.

This example instantiates a `videotestsrc`, linked to a
`videoconvert`, linked to a `tee` (Remember from [](tutorials/basic/multithreading-and-pad-availability.md) that
a `tee` copies to each of its output pads everything coming through its
input pad). The `tee` is named simply ‘t’ (using the `name` property)
and then linked to a `queue` and an `autovideosink`. The same `tee` is
referred to using ‘t.’ (mind the dot) and then linked to a second
`queue` and a second `autovideosink`.

To learn why the queues are necessary read [](tutorials/basic/multithreading-and-pad-availability.md).

### Pads

Instead of letting GStreamer choose which Pad to use when linking two
elements, you may want to specify the Pads directly. You can do this by
adding a dot plus the Pad name after the name of the element (it must be
a named element). Learn the names of the Pads of an element by using
the `gst-inspect-1.0` tool.

This is useful, for example, when you want to retrieve one particular
stream out of a demuxer:

```
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_0 ! matroskamux ! filesink location=sintel_video.mkv
```

This fetches a media file from the internet using `souphttpsrc`, which
is in webm format (a special kind of Matroska container, see [](tutorials/basic/concepts.md)). We
then open the container using `matroskademux`. This media contains both
audio and video, so `matroskademux` will create two output Pads, named
`video_0` and `audio_0`. We link `video_0` to a `matroskamux` element
to re-pack the video stream into a new container, and finally link it to
a `filesink`, which will write the stream into a file named
"sintel\_video.mkv" (the `location` property specifies the name of the
file).

All in all, we took a webm file, stripped it of audio, and generated a
new matroska file with the video. If we wanted to keep only the audio:

```
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_0 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```

The `vorbisparse` element is required to extract some information from
the stream and put it in the Pad Caps, so the next element,
`matroskamux`, knows how to deal with the stream. In the case of video
this was not necessary, because `matroskademux` already extracted this
information and added it to the Caps.

Note that in the above two examples no media has been decoded or played.
We have just moved from one container to another (demultiplexing and
re-multiplexing again).

### Caps filters

When an element has more than one output pad, it might happen that the
link to the next element is ambiguous: the next element may have more
than one compatible input pad, or its input pad may be compatible with
the Pad Caps of all the output pads. In these cases GStreamer will link
using the first pad that is available, which pretty much amounts to
saying that GStreamer will choose one output pad at random.

Consider the following pipeline:

```
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```

This is the same media file and demuxer as in the previous example. The
input Pad Caps of `filesink` are `ANY`, meaning that it can accept any
kind of media. Which one of the two output pads of `matroskademux` will
be linked against the filesink? `video_0` or `audio_0`? You cannot know.

You can remove this ambiguity, though, by using named pads, as in the
previous sub-section, or by using **Caps Filters**:

```
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux ! video/x-vp8 ! matroskamux ! filesink location=sintel_video.mkv
```

A Caps Filter behaves like a pass-through element which does nothing and
only accepts media with the given Caps, effectively resolving the
ambiguity. In this example, between `matroskademux` and `matroskamux` we
added a `video/x-vp8` Caps Filter to specify that we are interested in
the output pad of `matroskademux` which can produce this kind of video.

To find out the Caps an element accepts and produces, use the
`gst-inspect-1.0` tool. To find out the Caps contained in a particular file,
use the `gst-discoverer-1.0` tool. To find out the Caps an element is
producing for a particular pipeline, run `gst-launch-1.0` as usual, with the
`-v` option to print Caps information.

### Examples

Play a media file using `playbin` (as in [](tutorials/basic/hello-world.md)):

```
gst-launch-1.0 playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm
```

A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin` will create internally):

```
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! videoconvert ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```

A transcoding pipeline, which opens the webm container and decodes both
streams (via `uridecodebin`), then re-encodes the audio and video branches
with a different codec, and puts them back together in an Ogg container
(just for the sake of it).

```
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```

A rescaling pipeline. The `videoscale` element performs a rescaling
operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to 320x200.

```
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw,width=320,height=200 ! videoconvert ! autovideosink
```

This short description of `gst-launch-1.0` should be enough to get you
started. Remember that you have the [complete documentation available
here](tools/gst-launch.md).

## `gst-inspect-1.0`

This tool has three modes of operation:

- Without arguments, it lists all available element types, that is,
  the types you can use to instantiate new elements.
- With a file name as an argument, it treats the file as a GStreamer
  plugin, tries to open it, and lists all the elements described inside.
- With a GStreamer element name as an argument, it lists all
  information regarding that element.

Let's see an example of the third mode:

```
gst-inspect-1.0 vp8dec

Factory Details:
  Rank                     primary (256)
  Long-name                On2 VP8 Decoder
  Klass                    Codec/Decoder/Video
  Description              Decode VP8 video streams
  Author                   David Schleef <ds@entropywave.com>, Sebastian Dröge <sebastian.droege@collabora.co.uk>

Plugin Details:
  Name                     vpx
  Description              VP8 plugin
  Filename                 /usr/lib64/gstreamer-1.0/libgstvpx.so
  Version                  1.6.4
  License                  LGPL
  Source module            gst-plugins-good
  Source release date      2016-04-14
  Binary package           Fedora GStreamer-plugins-good package
  Origin URL               http://download.fedoraproject.org

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstVideoDecoder
                         +----GstVP8Dec

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-vp8

  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw
                 format: I420
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]


Element Flags:
  no flags set

Element Implementation:
  Has change_state() function: gst_video_decoder_change_state

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
  SINK: 'sink'
    Pad Template: 'sink'
  SRC: 'src'
    Pad Template: 'src'

Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: "vp8dec0"
  parent              : The parent of the object
                        flags: readable, writable
                        Object of type "GstObject"
  post-processing     : Enable post processing
                        flags: readable, writable
                        Boolean. Default: false
  post-processing-flags: Flags to control post processing
                        flags: readable, writable
                        Flags "GstVP8DecPostProcessingFlags" Default: 0x00000403, "mfqe+demacroblock+deblock"
                           (0x00000001): deblock          - Deblock
                           (0x00000002): demacroblock     - Demacroblock
                           (0x00000004): addnoise         - Add noise
                           (0x00000400): mfqe             - Multi-frame quality enhancement
  deblocking-level    : Deblocking level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 4
  noise-level         : Noise level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 0
  threads             : Maximum number of decoding threads
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 0
```

The most relevant sections are:

- Pad Templates: This lists all the kinds of Pads this element can have, along with their capabilities. This is where you look to find out if an element can link with another one. In this case, it has only one sink pad template, accepting only `video/x-vp8` (encoded video data in VP8 format), and only one source pad template, producing `video/x-raw` (decoded video data).

- Element Properties: This lists the properties of the element, along with their type and accepted values.

For more information, you can check the [documentation page](tools/gst-inspect.md) of `gst-inspect-1.0`.

## `gst-discoverer-1.0`

This tool is a wrapper around the `GstDiscoverer` object shown in [](tutorials/basic/media-information-gathering.md). It accepts a URI from the command line and prints all information regarding the media that GStreamer can extract. It is useful to find out what container and codecs have been used to produce the media, and therefore what elements you need to put in a pipeline to play it.

Use `gst-discoverer-1.0 --help` to obtain the list of available options, which basically control the verbosity of the output.

Let's see an example:

```
gst-discoverer-1.0 https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm -v

Analyzing https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm
Done discovering https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm
Topology:
  container: video/webm
    audio: audio/x-vorbis, channels=(int)2, rate=(int)48000
      Codec:
        audio/x-vorbis, channels=(int)2, rate=(int)48000
      Additional info:
        None
      Language: en
      Channels: 2
      Sample rate: 48000
      Depth: 0
      Bitrate: 80000
      Max bitrate: 0
      Tags:
        taglist, language-code=(string)en, container-format=(string)Matroska, audio-codec=(string)Vorbis, application-name=(string)ffmpeg2theora-0.24, encoder=(string)"Xiph.Org\ libVorbis\ I\ 20090709", encoder-version=(uint)0, nominal-bitrate=(uint)80000, bitrate=(uint)80000;
    video: video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
      Codec:
        video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
      Additional info:
        None
      Width: 854
      Height: 480
      Depth: 0
      Frame rate: 25/1
      Pixel aspect ratio: 1/1
      Interlaced: false
      Bitrate: 0
      Max bitrate: 0
      Tags:
        taglist, video-codec=(string)"VP8\ video", container-format=(string)Matroska;

Properties:
  Duration: 0:00:52.250000000
  Seekable: yes
  Tags:
      video codec: VP8 video
      language code: en
      container format: Matroska
      application name: ffmpeg2theora-0.24
      encoder: Xiph.Org libVorbis I 20090709
      encoder version: 0
      audio codec: Vorbis
      nominal bitrate: 80000
      bitrate: 80000
```

## Conclusion

This tutorial has shown:

- How to build and run GStreamer pipelines from the command line using the `gst-launch-1.0` tool.

- How to find out what GStreamer elements you have available and their capabilities, using the `gst-inspect-1.0` tool.

- How to discover the internal structure of media files, using `gst-discoverer-1.0`.

It has been a pleasure having you here, and see you soon!

@@ -0,0 +1,354 @@

# Basic tutorial 14: Handy elements

## Goal

This tutorial gives a list of handy GStreamer elements that are worth knowing. They range from powerful all-in-one elements that allow you to build complex pipelines easily (like `playbin`) to little helper elements which are extremely useful when debugging.

For simplicity, the following examples are given using the `gst-launch-1.0` tool (learn about it in [](tutorials/basic/gstreamer-tools.md)). Use the `-v` command line parameter if you want to see the Pad Caps that are being negotiated.

## Bins

These are Bin elements, which you treat as a single element but which take care of instantiating the whole internal pipeline needed to accomplish their task.

### `playbin`

This element has been extensively used throughout the tutorials. It manages all aspects of media playback, from source to display, passing through demuxing and decoding. It is so flexible and has so many options that a whole set of tutorials is devoted to it. See [](tutorials/playback/index.md) for more details.

### `uridecodebin`

This element decodes data from a URI into raw media. It selects a source element that can handle the given URI scheme and connects it to a `decodebin` element. It acts like a demuxer, so it offers as many source pads as streams are found in the media.

``` bash
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! videoconvert ! autovideosink
```

``` bash
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
```

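Since `uridecodebin` exposes one source pad per stream, both branches can also be consumed from a single instance by giving it a name. A sketch combining the two pipelines above into one:

``` bash
# One uridecodebin feeding a video branch and an audio branch;
# each branch gets its own queue so they run in separate threads:
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm name=d ! queue ! videoconvert ! autovideosink d. ! queue ! audioconvert ! autoaudiosink
```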
### `decodebin`

This element automatically constructs a decoding pipeline using available decoders and demuxers via auto-plugging, until raw media is obtained. It is used internally by `uridecodebin`, which is often more convenient to use, as it creates a suitable source element as well. It replaces the old `decodebin` element. It acts like a demuxer, so it offers as many source pads as streams are found in the media.

``` bash
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! decodebin ! autovideosink
```

## File input/output

### `filesrc`

This element reads a local file and produces media with `ANY` Caps. If you want to obtain the correct Caps for the media, explore the stream by using a `typefind` element or by setting the `typefind` property of `filesrc` to `TRUE`.

``` bash
gst-launch-1.0 filesrc location=f:\\media\\sintel\\sintel_trailer-480p.webm ! decodebin ! autovideosink
```

### `filesink`

This element writes to a file all the media it receives. Use the `location` property to specify the file name.

```
gst-launch-1.0 audiotestsrc ! vorbisenc ! oggmux ! filesink location=test.ogg
```

## Network

### `souphttpsrc`

This element receives data as a client over the network via HTTP, using the [libsoup](https://wiki.gnome.org/Projects/libsoup) library. Set the URL to retrieve through the `location` property.

``` bash
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! decodebin ! autovideosink
```

## Test media generation
|
||||
|
||||
These elements are very useful to check if other parts of the pipeline
|
||||
are working, by replacing the source by one of these test sources which
|
||||
are “guaranteed” to work.
|
||||
|
||||
### `videotestsrc`

This element produces a video pattern (selectable among many different options with the `pattern` property). Use it to test video pipelines.

``` bash
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```

### `audiotestsrc`

This element produces an audio wave (selectable among many different options with the `wave` property). Use it to test audio pipelines.

``` bash
gst-launch-1.0 audiotestsrc ! audioconvert ! autoaudiosink
```

## Video adapters

### `videoconvert`

This element converts from one color space (e.g. RGB) to another one (e.g. YUV). It can also convert between different YUV formats (e.g. I420, NV12, YUY2 …) or RGB format arrangements (e.g. RGBA, ARGB, BGRA …).

This is normally your first choice when solving negotiation problems. When it is not needed, because its upstream and downstream elements can already understand each other, it acts in pass-through mode, having minimal impact on performance.

As a rule of thumb, always use `videoconvert` whenever you use elements whose Caps are unknown at design time, like `autovideosink`, or that can vary depending on external factors, like decoding a user-provided file.

``` bash
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```

### `videorate`

This element takes an incoming stream of time-stamped video frames and produces a stream that matches the source pad's frame rate. The correction is performed by dropping and duplicating frames; no fancy algorithm is used to interpolate frames.

This is useful to allow elements requiring different frame rates to link. As with the other adapters, if it is not needed (because there is a frame rate on which both Pads can agree), it acts in pass-through mode and does not impact performance.

It is therefore a good idea to always use it whenever the actual frame rate is unknown at design time, just in case.

``` bash
gst-launch-1.0 videotestsrc ! video/x-raw,framerate=30/1 ! videorate ! video/x-raw,framerate=1/1 ! videoconvert ! autovideosink
```

### `videoscale`

This element resizes video frames. By default the element tries to negotiate to the same size on the source and sink Pads so that no scaling is needed. It is therefore safe to insert this element in a pipeline to get more robust behavior without any cost if no scaling is needed.

This element supports a wide range of color spaces, including various YUV and RGB formats, and is therefore generally able to operate anywhere in a pipeline.

If the video is to be output to a window whose size is controlled by the user, it is a good idea to use a `videoscale` element, since not all video sinks are capable of performing scaling operations.

``` bash
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! videoscale ! video/x-raw,width=178,height=100 ! videoconvert ! autovideosink
```

## Audio adapters

### `audioconvert`

This element converts raw audio buffers between various possible formats. It supports integer to float conversion, width/depth conversion, signedness and endianness conversion, and channel transformations.

Like `videoconvert` does for video, you use this to solve negotiation problems with audio, and it is generally safe to use it liberally, since this element does nothing if it is not needed.

``` bash
gst-launch-1.0 audiotestsrc ! audioconvert ! autoaudiosink
```

### `audioresample`

This element resamples raw audio buffers to different sampling rates using a configurable windowing function to enhance quality.

Again, use it to solve negotiation problems regarding sampling rates, and do not fear to use it generously.

``` bash
gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! audioresample ! audio/x-raw,rate=4000 ! audioconvert ! autoaudiosink
```

### `audiorate`

This element takes an incoming stream of time-stamped raw audio frames and produces a perfect stream by inserting or dropping samples as needed. It does not allow the sample rate to be changed as `videorate` does; it just fills gaps and removes overlapped samples so the output stream is continuous and “clean”.

It is useful in situations where the timestamps are going to be lost (when storing into certain file formats, for example) and the receiver will require all samples to be present. It is cumbersome to exemplify this, so no example is given.


Most of the time, `audiorate` is not what you want.

## Multithreading

### `queue`

Queues have been explained in [](tutorials/basic/multithreading-and-pad-availability.md). Basically, a queue performs two tasks:

- Data is queued until a selected limit is reached. Any attempt to push more buffers into the queue blocks the pushing thread until more space becomes available.

- The queue creates a new thread on the source Pad to decouple the processing on sink and source Pads.

Additionally, `queue` triggers signals when it is about to become empty or full (according to some configurable thresholds), and can be instructed to drop buffers instead of blocking when it is full.

As a rule of thumb, prefer the simpler `queue` element over `queue2` whenever network buffering is not a concern to you. See [](tutorials/basic/multithreading-and-pad-availability.md) for an example.

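As a minimal illustration, inserting a `queue` gives the downstream part of the pipeline its own streaming thread, decoupled from the source:

``` bash
# The queue decouples the source's thread from the sink's thread:
gst-launch-1.0 audiotestsrc ! queue ! audioconvert ! autoaudiosink
```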
### `queue2`

This element is not an evolution of `queue`. It has the same design goals but follows a different implementation approach, which results in different features. Unfortunately, it is often not easy to tell which queue is the best choice.

`queue2` performs the two tasks listed above for `queue` and, additionally, is able to store the received data (or part of it) in a file on disk, for later retrieval. It also replaces the signals with the more general and convenient buffering messages described in [](tutorials/basic/streaming.md).

As a rule of thumb, prefer `queue2` over `queue` whenever network buffering is a concern to you. See [](tutorials/basic/streaming.md) for an example (`queue2` is hidden inside `playbin`).

### `multiqueue`

This element provides queues for multiple streams simultaneously, and eases their management by allowing some queues to grow if no data is being received on other streams, or by allowing some queues to drop data if they are not connected to anything (instead of returning an error, as a simpler queue would do). Additionally, it synchronizes the different streams, ensuring that none of them goes too far ahead of the others.

This is an advanced element. It is found inside `decodebin`, but you will rarely need to instantiate it yourself in a normal playback application.

### `tee`

[](tutorials/basic/multithreading-and-pad-availability.md) already showed how to use a `tee` element, which splits data to multiple pads. Splitting the data flow is useful, for example, when capturing a video where the video is shown on the screen and also encoded and written to a file. Another example is playing music and hooking up a visualization module.

One needs to use separate `queue` elements in each branch to provide separate threads for each branch. Otherwise a blocked dataflow in one branch would stall the other branches.

```
gst-launch-1.0 audiotestsrc ! tee name=t ! queue ! audioconvert ! autoaudiosink t. ! queue ! wavescope ! videoconvert ! autovideosink
```

## Capabilities

### `capsfilter`

[](tutorials/basic/gstreamer-tools.md) already explained how to use Caps filters with `gst-launch-1.0`. When building a pipeline programmatically, Caps filters are implemented with the `capsfilter` element. This element does not modify data as such, but enforces limitations on the data format.

``` bash
gst-launch-1.0 videotestsrc ! video/x-raw, format=GRAY8 ! videoconvert ! autovideosink
```

### `typefind`

This element determines the type of media a stream contains. It applies typefind functions in the order of their rank. Once the type has been detected, it sets its source Pad Caps to the found media type and emits the `have-type` signal.

It is instantiated internally by `decodebin`, and you can also use it yourself to find the media type, although you can normally use the `GstDiscoverer` instead, which provides more information (as seen in [](tutorials/basic/media-information-gathering.md)).

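A quick way to see `typefind` in action is to run it verbosely against a local file (the file name below is only illustrative; point it at any media file on your system):

``` bash
# -v prints the Caps that typefind sets once the media type is detected:
gst-launch-1.0 -v filesrc location=sintel_trailer-480p.webm ! typefind ! fakesink
```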
## Debugging

### `fakesink`

This sink element simply swallows any data fed to it. It is useful when debugging, to replace your normal sinks and rule them out of the equation. It can be very verbose when combined with the `-v` switch of `gst-launch-1.0`, so use the `silent` property to remove any unwanted noise.

```
gst-launch-1.0 audiotestsrc num-buffers=1000 ! fakesink sync=false
```

### `identity`

This is a dummy element that passes incoming data through unmodified. It has several useful diagnostic functions, such as offset and timestamp checking, or buffer dropping. Read its documentation to learn all the things this seemingly harmless element can do.

```
gst-launch-1.0 audiotestsrc ! identity drop-probability=0.1 ! audioconvert ! autoaudiosink
```

## Conclusion

This tutorial has listed a few elements which are worth knowing, due to their usefulness in the day-to-day work with GStreamer. Some are valuable for production pipelines, whereas others are only needed for debugging purposes.

It has been a pleasure having you here, and see you soon!

@@ -0,0 +1,271 @@

---
short-description: The mandatory 'Hello world' example
...

{{ ALERT_JS.md }}

# Basic tutorial 1: Hello world!

## Goal

Nothing better to get a first impression about a software library than to print “Hello World” on the screen!

But since we are dealing with multimedia frameworks, we are going to play a video instead.

{{ C.md }}
Do not be scared by the amount of code below: there are only 4 lines which do *real* work. The rest is cleanup code, and, in C, this is always a bit verbose.
{{ END_LANG.md }}

Without further ado, get ready for your first GStreamer application...

## Hello world

{{ C+JS_FALLBACK.md }}
Copy this code into a text file named `basic-tutorial-1.c` (or find it in your GStreamer installation).

**basic-tutorial-1.c**

{{ tutorials/basic-tutorial-1.c }}

Compile it as described in [Installing on Linux], [Installing on Mac OS X] or [Installing on Windows]. If you get compilation errors, double-check the instructions given in those sections.

If everything built fine, fire up the executable! You should see a window pop up, containing a video being played straight from the Internet, along with audio. Congratulations!

> ![Information] Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux], [Mac OS X] or [Windows], or use this specific command on Linux:
>
> `` gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux][1], [Mac OS X][2] or [Windows][3].
>
> Required libraries: `gstreamer-1.0`
{{ END_LANG.md }}

{{ PY.md }}
**basic-tutorial-1.py**

{{ tutorials/python/basic-tutorial-1.py }}

Just run the file with `python3 basic-tutorial-1.py`
{{ END_LANG.md }}

This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. Also, there is no latency management (buffering), so on slow connections the movie might stop after a few seconds. See how [Basic tutorial 12: Streaming] solves this issue.

## Walkthrough

Let's review these lines of code and see what they do:

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-1.c[9:11] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-1.py[15:17] }}
{{ END_LANG.md }}

This must always be your first GStreamer command. Among other things, [gst_init]\():

- Initializes all internal structures

- Checks what plug-ins are available

- Executes any command-line option intended for GStreamer

If you always pass your command-line parameters `argc` and `argv` to [gst_init]\(), your application will automatically benefit from the GStreamer standard command-line options (more on this in [Basic tutorial 10: GStreamer tools]).

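For instance, once `argc` and `argv` are forwarded to [gst_init]\(), standard options such as `--gst-version` or `--gst-debug-level` are recognized without any extra code. A sketch, assuming the tutorial binary has been built:

``` bash
# Standard GStreamer options become available automatically:
./basic-tutorial-1 --gst-version
./basic-tutorial-1 --gst-debug-level=2
```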
{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-1.c[13:17] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-1.py[18:22] }}
{{ END_LANG.md }}

This line is the heart of this tutorial, and exemplifies **two** key points: [gst_parse_launch]\() and [playbin].

### [gst_parse_launch]

GStreamer is a framework designed to handle multimedia flows. Media travels from the “source” elements (the producers) down to the “sink” elements (the consumers), passing through a series of intermediate elements performing all kinds of tasks. The set of all the interconnected elements is called a “pipeline”.

In GStreamer you usually build the pipeline by manually assembling the individual elements, but, when the pipeline is simple enough and you do not need any advanced features, you can take the shortcut: [gst_parse_launch]\().

This function takes a textual representation of a pipeline and turns it into an actual pipeline, which is very handy. In fact, this function is so handy there is a tool built completely around it which you will get very acquainted with (see [Basic tutorial 10: GStreamer tools] to learn about [gst-launch-1.0] and its syntax).

### [playbin]
|
||||
|
||||
So, what kind of pipeline are we asking [gst_parse_launch]\() to build for
|
||||
us? Here enters the second key point: We are building a pipeline
|
||||
composed of a single element called [playbin].
|
||||
|
||||
[playbin] is a special element which acts as a source and as a sink, and
|
||||
is a whole pipeline. Internally, it creates and connects all the
|
||||
necessary elements to play your media, so you do not have to worry about
|
||||
it.
|
||||
|
||||
It does not allow the control granularity that a manual pipeline does,
|
||||
but, still, it permits enough customization to suffice for a wide range
|
||||
of applications. Including this tutorial.
|
||||
|
||||
In this example, we are only passing one parameter to [playbin], which
|
||||
is the URI of the media we want to play. Try changing it to something
|
||||
else! Whether it is an `http://` or `file://` URI, [playbin] will
|
||||
instantiate the appropriate GStreamer source transparently!
|
||||
|
||||
If you mistype the URI, or the file does not exist, or you are missing a
|
||||
plug-in, GStreamer provides several notification mechanisms, but the
|
||||
only thing we are doing in this example is exiting on error, so do not
|
||||
expect much feedback.
|
||||
|
||||
{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-1.c[18:20] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-1.py[23:25] }}
{{ END_LANG.md }}

This line highlights another interesting concept: the state. Every GStreamer element has an associated state, which you can more or less think of as the Play/Pause button in your regular DVD player. For now, suffice it to say that playback will not start unless you set the pipeline to the `PLAYING` state.

In this line, [gst_element_set_state]\() is setting `pipeline` (our only element, remember) to the `PLAYING` state, thus initiating playback.

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-1.c[21:26] }}
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-1.py[26:32] }}
{{ END_LANG.md }}

These lines will wait until an error occurs or the end of the stream is found. [gst_element_get_bus]\() retrieves the pipeline's bus, and [gst_bus_timed_pop_filtered]\() will block until you receive either an `ERROR` or an `EOS` (End-Of-Stream) message through that bus. Do not worry much about this line; the GStreamer bus is explained in [Basic tutorial 2: GStreamer concepts].

And that's it! From this point onwards, GStreamer takes care of everything. Execution will end when the media reaches its end (EOS) or an error is encountered (try closing the video window, or unplugging the network cable). The application can always be stopped by pressing Control-C in the console.

### Cleanup

Before terminating the application, though, there are a couple of things we need to do to tidy up correctly after ourselves.

{{ C+JS_FALLBACK.md }}
{{ tutorials/basic-tutorial-1.c[27:33] }}

Always read the documentation of the functions you use, to know if you should free the objects they return after using them.

In this case, [gst_bus_timed_pop_filtered]\() returned a message which needs to be freed with [gst_message_unref]\() (more about messages in [Basic tutorial 2: GStreamer concepts]).

[gst_element_get_bus]\() added a reference to the bus that must be freed with [gst_object_unref]\(). Setting the pipeline to the NULL state will make sure it frees any resources it has allocated (more about states in [Basic tutorial 3: Dynamic pipelines]). Finally, unreferencing the pipeline will destroy it, and all its contents.
{{ END_LANG.md }}

{{ PY.md }}
{{ tutorials/python/basic-tutorial-1.py[33:35] }}
The pipeline state should always be set back to [GST_STATE_NULL] before quitting.
{{ END_LANG.md }}

## Conclusion

And so ends your first tutorial with GStreamer. We hope its brevity serves as an example of how powerful this framework is!

Let's recap a bit. Today we have learned:

- How to initialize GStreamer using [gst_init]\().

- How to quickly build a pipeline from a textual description using [gst_parse_launch]\().

- How to create an automatic playback pipeline using [playbin].

- How to signal GStreamer to start playback using [gst_element_set_state]\().

- How to sit back and relax, while GStreamer takes care of everything, using [gst_element_get_bus]\() and [gst_bus_timed_pop_filtered]\().

The next tutorial will keep introducing more basic GStreamer elements, and show you how to build a pipeline manually.

It has been a pleasure having you here, and see you soon!

[Installing on Linux]: installing/on-linux.md
[Installing on Mac OS X]: installing/on-mac-osx.md
[Installing on Windows]: installing/on-windows.md
[Information]: images/icons/emoticons/information.svg
[Linux]: installing/on-linux.md#InstallingonLinux-Build
[Mac OS X]: installing/on-mac-osx.md#InstallingonMacOSX-Build
[Windows]: installing/on-windows.md#InstallingonWindows-Build
[1]: installing/on-linux.md#InstallingonLinux-Run
[2]: installing/on-mac-osx.md#InstallingonMacOSX-Run
[3]: installing/on-windows.md#InstallingonWindows-Run
[Basic tutorial 12: Streaming]: tutorials/basic/streaming.md
[Basic tutorial 10: GStreamer tools]: tutorials/basic/gstreamer-tools.md
[Basic tutorial 2: GStreamer concepts]: tutorials/basic/concepts.md
[Basic tutorial 3: Dynamic pipelines]: tutorials/basic/dynamic-pipelines.md
[gst_bus_timed_pop_filtered]: gst_bus_timed_pop_filtered
[gst_element_get_bus]: gst_element_get_bus
[gst_element_set_state]: gst_element_set_state
[gst_init]: gst_init
[gst_message_unref]: gst_message_unref
[gst_object_unref]: gst_object_unref
[gst_parse_launch]: gst_parse_launch
[playbin]: playbin
[gst-launch-1.0]: tools/gst-launch.md
[GST_STATE_NULL]: GST_STATE_NULL

@@ -0,0 +1,8 @@
---
short-description: General topics required to understand the rest of the tutorials
...

# Basic tutorials

These tutorials describe general topics required to understand the rest
of the tutorials.
@@ -0,0 +1,481 @@
# Basic tutorial 6: Media formats and Pad Capabilities

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

Pad Capabilities are a fundamental element of GStreamer, although most
of the time they are invisible because the framework handles them
automatically. This somewhat theoretical tutorial shows:

- What Pad Capabilities are.

- How to retrieve them.

- When to retrieve them.

- Why you need to know about them.

## Introduction

### Pads

As we have already seen, Pads allow information to enter and leave an
element. The *Capabilities* (or *Caps*, for short) of a Pad, then,
specify what kind of information can travel through the Pad. For
example, “RGB video with a resolution of 320x200 pixels and 30 frames
per second”, or “audio with 16 bits per sample, 5.1 channels at 44100
samples per second”, or even compressed formats like MP3 or H.264.

Pads can support multiple Capabilities (for example, a video sink can
support video in different types of RGB or YUV formats) and Capabilities can be
specified as *ranges* (for example, an audio sink can support sample
rates from 1 to 48000 samples per second). However, the actual
information traveling from Pad to Pad must have only one well-specified
type. Through a process known as *negotiation*, two linked Pads agree on
a common type, and thus the Capabilities of the Pads become *fixed*
(they have only one type and do not contain ranges). The walkthrough of
the sample code below should make all this clear.

**In order for two elements to be linked together, they must share a
common subset of Capabilities** (otherwise they could not possibly
understand each other). This is the main goal of Capabilities.

As an application developer, you will usually build pipelines by linking
elements together (to a lesser extent if you use all-in-one elements
like `playbin`). In this case, you need to know the *Pad Caps* (as they
are familiarly referred to) of your elements, or, at least, know what
they are when GStreamer refuses to link two elements with a negotiation
error.

### Pad templates

Pads are created from *Pad Templates*, which indicate all possible
Capabilities a Pad could ever have. Templates are useful to create several
similar Pads, and also allow early refusal of connections between
elements: if the Capabilities of their Pad Templates do not have a
common subset (their *intersection* is empty), there is no need to
negotiate further.

Pad Templates can be viewed as the first step in the negotiation
process. As the process evolves, actual Pads are instantiated and their
Capabilities refined until they are fixed (or negotiation fails).

### Capabilities examples

```
SINK template: 'sink'
  Availability: Always
  Capabilities:
    audio/x-raw
               format: S16LE
                 rate: [ 1, 2147483647 ]
             channels: [ 1, 2 ]
    audio/x-raw
               format: U8
                 rate: [ 1, 2147483647 ]
             channels: [ 1, 2 ]
```

This pad is a sink which is always available on the element (we will not
talk about availability for now). It supports two kinds of media, both
raw audio in integer format (`audio/x-raw`): signed 16-bit little-endian
and unsigned 8-bit. The square brackets indicate a range: for instance,
the number of channels varies from 1 to 2.

```
SRC template: 'src'
  Availability: Always
  Capabilities:
    video/x-raw
                width: [ 1, 2147483647 ]
               height: [ 1, 2147483647 ]
            framerate: [ 0/1, 2147483647/1 ]
               format: { I420, NV12, NV21, YV12, YUY2, Y42B, Y444, YUV9, YVU9, Y41B, Y800, Y8, GREY, Y16, UYVY, YVYU, IYU1, v308, AYUV, A420 }
```

`video/x-raw` indicates that this source pad outputs raw video. It
supports a wide range of dimensions and framerates, and a set of YUV
formats (the curly braces indicate a *list*). All these formats
indicate different packing and subsampling of the image planes.

### Last remarks

You can use the `gst-inspect-1.0` tool described in [Basic tutorial 10:
GStreamer tools](tutorials/basic/gstreamer-tools.md) to
learn about the Caps of any GStreamer element.

Bear in mind that some elements query the underlying hardware for
supported formats and offer their Pad Caps accordingly (they usually do
this when entering the READY state or higher). Therefore, the caps shown
can vary from platform to platform, or even from one execution to the
next (even though this case is rare).

This tutorial instantiates two elements (this time, through their
factories), shows their Pad Templates, links them and sets the pipeline
to play. On each state change, the Capabilities of the sink element's
Pad are shown, so you can observe how the negotiation proceeds until the
Pad Caps are fixed.

## A trivial Pad Capabilities Example

Copy this code into a text file named `basic-tutorial-6.c` (or find it
in your GStreamer installation).

**basic-tutorial-6.c**

``` c
#include <gst/gst.h>

/* Functions below print the Capabilities in a human-friendly format */
static gboolean print_field (GQuark field, const GValue * value, gpointer pfx) {
  gchar *str = gst_value_serialize (value);

  g_print ("%s %15s: %s\n", (gchar *) pfx, g_quark_to_string (field), str);
  g_free (str);
  return TRUE;
}

static void print_caps (const GstCaps * caps, const gchar * pfx) {
  guint i;

  g_return_if_fail (caps != NULL);

  if (gst_caps_is_any (caps)) {
    g_print ("%sANY\n", pfx);
    return;
  }
  if (gst_caps_is_empty (caps)) {
    g_print ("%sEMPTY\n", pfx);
    return;
  }

  for (i = 0; i < gst_caps_get_size (caps); i++) {
    GstStructure *structure = gst_caps_get_structure (caps, i);

    g_print ("%s%s\n", pfx, gst_structure_get_name (structure));
    gst_structure_foreach (structure, print_field, (gpointer) pfx);
  }
}

/* Prints information about a Pad Template, including its Capabilities */
static void print_pad_templates_information (GstElementFactory * factory) {
  const GList *pads;
  GstStaticPadTemplate *padtemplate;

  g_print ("Pad Templates for %s:\n", gst_element_factory_get_longname (factory));
  if (!gst_element_factory_get_num_pad_templates (factory)) {
    g_print ("  none\n");
    return;
  }

  pads = gst_element_factory_get_static_pad_templates (factory);
  while (pads) {
    padtemplate = pads->data;
    pads = g_list_next (pads);

    if (padtemplate->direction == GST_PAD_SRC)
      g_print ("  SRC template: '%s'\n", padtemplate->name_template);
    else if (padtemplate->direction == GST_PAD_SINK)
      g_print ("  SINK template: '%s'\n", padtemplate->name_template);
    else
      g_print ("  UNKNOWN!!! template: '%s'\n", padtemplate->name_template);

    if (padtemplate->presence == GST_PAD_ALWAYS)
      g_print ("    Availability: Always\n");
    else if (padtemplate->presence == GST_PAD_SOMETIMES)
      g_print ("    Availability: Sometimes\n");
    else if (padtemplate->presence == GST_PAD_REQUEST)
      g_print ("    Availability: On request\n");
    else
      g_print ("    Availability: UNKNOWN!!!\n");

    if (padtemplate->static_caps.string) {
      GstCaps *caps;
      g_print ("    Capabilities:\n");
      caps = gst_static_caps_get (&padtemplate->static_caps);
      print_caps (caps, "      ");
      gst_caps_unref (caps);
    }

    g_print ("\n");
  }
}

/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
  GstPad *pad = NULL;
  GstCaps *caps = NULL;

  /* Retrieve pad */
  pad = gst_element_get_static_pad (element, pad_name);
  if (!pad) {
    g_printerr ("Could not retrieve pad '%s'\n", pad_name);
    return;
  }

  /* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
  caps = gst_pad_get_current_caps (pad);
  if (!caps)
    caps = gst_pad_query_caps (pad, NULL);

  /* Print and free */
  g_print ("Caps for the %s pad:\n", pad_name);
  print_caps (caps, "      ");
  gst_caps_unref (caps);
  gst_object_unref (pad);
}

int main(int argc, char *argv[]) {
  GstElement *pipeline, *source, *sink;
  GstElementFactory *source_factory, *sink_factory;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the element factories */
  source_factory = gst_element_factory_find ("audiotestsrc");
  sink_factory = gst_element_factory_find ("autoaudiosink");
  if (!source_factory || !sink_factory) {
    g_printerr ("Not all element factories could be created.\n");
    return -1;
  }

  /* Print information about the pad templates of these factories */
  print_pad_templates_information (source_factory);
  print_pad_templates_information (sink_factory);

  /* Ask the factories to instantiate actual elements */
  source = gst_element_factory_create (source_factory, "source");
  sink = gst_element_factory_create (sink_factory, "sink");

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !source || !sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
  if (gst_element_link (source, sink) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Print initial negotiated caps (in NULL state) */
  g_print ("In NULL state:\n");
  print_pad_capabilities (sink, "sink");

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state (check the bus for error messages).\n");
  }

  /* Wait until error, EOS or State Change */
  bus = gst_element_get_bus (pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS |
        GST_MESSAGE_STATE_CHANGED);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("\nPipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
            /* Print the current capabilities of the sink element */
            print_pad_capabilities (sink, "sink");
          }
          break;
        default:
          /* We should not reach here because we only asked for ERRORs, EOS and STATE_CHANGED */
          g_printerr ("Unexpected message received.\n");
          break;
      }
      gst_message_unref (msg);
    }
  } while (!terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  gst_object_unref (source_factory);
  gst_object_unref (sink_factory);
  return 0;
}
```

> ![](images/icons/emoticons/information.svg)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-6.c -o basic-tutorial-6 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial simply displays information regarding the Pad Capabilities at different points in time.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

The `print_field`, `print_caps` and `print_pad_templates_information`
functions simply display the capabilities structures in a
human-friendly format. If you want to learn about the internal
organization of the `GstCaps` structure, read the GStreamer
documentation regarding Pad Caps.

``` c
/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
  GstPad *pad = NULL;
  GstCaps *caps = NULL;

  /* Retrieve pad */
  pad = gst_element_get_static_pad (element, pad_name);
  if (!pad) {
    g_printerr ("Could not retrieve pad '%s'\n", pad_name);
    return;
  }

  /* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
  caps = gst_pad_get_current_caps (pad);
  if (!caps)
    caps = gst_pad_query_caps (pad, NULL);

  /* Print and free */
  g_print ("Caps for the %s pad:\n", pad_name);
  print_caps (caps, "      ");
  gst_caps_unref (caps);
  gst_object_unref (pad);
}
```

`gst_element_get_static_pad()` retrieves the named Pad from the given
element. This Pad is *static* because it is always present in the
element. To know more about Pad availability read the GStreamer
documentation about Pads.

Then we call `gst_pad_get_current_caps()` to retrieve the Pad's
current Capabilities, which can be fixed or not, depending on the state
of the negotiation process. They could even be non-existent, in which
case we call `gst_pad_query_caps()` to retrieve the currently
acceptable Pad Capabilities. The currently acceptable Caps will be the
Pad Template's Caps in the NULL state, but might change in later states,
as the actual hardware Capabilities might be queried.

We then print these Capabilities.

``` c
/* Create the element factories */
source_factory = gst_element_factory_find ("audiotestsrc");
sink_factory = gst_element_factory_find ("autoaudiosink");
if (!source_factory || !sink_factory) {
  g_printerr ("Not all element factories could be created.\n");
  return -1;
}

/* Print information about the pad templates of these factories */
print_pad_templates_information (source_factory);
print_pad_templates_information (sink_factory);

/* Ask the factories to instantiate actual elements */
source = gst_element_factory_create (source_factory, "source");
sink = gst_element_factory_create (sink_factory, "sink");
```

In the previous tutorials we created the elements directly using
`gst_element_factory_make()` and skipped talking about factories, but we
will talk about them now. A `GstElementFactory` is in charge of
instantiating a particular type of element, identified by its factory
name.

You can use `gst_element_factory_find()` to create a factory of type
“videotestsrc”, and then use it to instantiate multiple “videotestsrc”
elements using `gst_element_factory_create()`.
`gst_element_factory_make()` is really a shortcut for
`gst_element_factory_find()` + `gst_element_factory_create()`.

The Pad Templates can already be accessed through the factories, so they
are printed as soon as the factories are created.

We skip the pipeline creation and start, and go to the State-Changed
message handling:

``` c
case GST_MESSAGE_STATE_CHANGED:
  /* We are only interested in state-changed messages from the pipeline */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (pipeline)) {
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    g_print ("\nPipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
    /* Print the current capabilities of the sink element */
    print_pad_capabilities (sink, "sink");
  }
  break;
```

This simply prints the current Pad Caps every time the state of the
pipeline changes. You should see, in the output, how the initial caps
(the Pad Template's Caps) are progressively refined until they are
completely fixed (they contain a single type with no ranges).

## Conclusion

This tutorial has shown:

- What Pad Capabilities and Pad Template Capabilities are.

- How to retrieve them
  with `gst_pad_get_current_caps()` or `gst_pad_query_caps()`.

- That they have different meanings depending on the state of the
  pipeline (initially they indicate all the possible Capabilities,
  later they indicate the currently negotiated Caps for the Pad).

- That Pad Caps are important to know beforehand if two elements can
  be linked together.

- That Pad Caps can be found using the `gst-inspect-1.0` tool described
  in [Basic tutorial 10: GStreamer
  tools](tutorials/basic/gstreamer-tools.md).

The next tutorial shows how data can be manually injected into and
extracted from the GStreamer pipeline.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!

@@ -0,0 +1,542 @@

# Basic tutorial 9: Media information gathering

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

Sometimes you might want to quickly find out what kind of media a file
(or URI) contains, or whether you will be able to play the media at all.
You can build a pipeline, set it to run, and watch the bus messages, but
GStreamer has a utility that does just that for you. This tutorial
shows:

- How to recover information regarding a URI

- How to find out if a URI is playable

## Introduction

`GstDiscoverer` is a utility object found in the `pbutils` library
(Plug-in Base utilities) that accepts a URI or list of URIs, and returns
information about them. It can work in synchronous or asynchronous
modes.

In synchronous mode, there is only a single function to call,
`gst_discoverer_discover_uri()`, which blocks until the information is
ready. Due to this blocking, it is usually less interesting for
GUI-based applications, so the asynchronous mode is used, as described
in this tutorial.

The recovered information includes codec descriptions, stream topology
(number of streams and sub-streams) and available metadata (like the
audio language).

As an example, this is the result of discovering
https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm

```
Duration: 0:00:52.250000000
Tags:
  video codec: On2 VP8
  language code: en
  container format: Matroska
  application name: ffmpeg2theora-0.24
  encoder: Xiph.Org libVorbis I 20090709
  encoder version: 0
  audio codec: Vorbis
  nominal bitrate: 80000
  bitrate: 80000
Seekable: yes
Stream information:
  container: WebM
    audio: Vorbis
      Tags:
        language code: en
        container format: Matroska
        audio codec: Vorbis
        application name: ffmpeg2theora-0.24
        encoder: Xiph.Org libVorbis I 20090709
        encoder version: 0
        nominal bitrate: 80000
        bitrate: 80000
    video: VP8
      Tags:
        video codec: VP8 video
        container format: Matroska
```

The following code tries to discover the URI provided through the
command line, and outputs the retrieved information (if no URI is
provided it uses a default one).
This is a simplified version of what the `gst-discoverer-1.0` tool does
([Basic tutorial 10: GStreamer
tools](tutorials/basic/gstreamer-tools.md)): an application that only
displays data, but does not perform any playback.

## The GStreamer Discoverer

Copy this code into a text file named `basic-tutorial-9.c` (or find it
in your GStreamer installation).

**basic-tutorial-9.c**

``` c
#include <string.h>
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstDiscoverer *discoverer;
  GMainLoop *loop;
} CustomData;

/* Print a tag in a human-readable format (name: value) */
static void print_tag_foreach (const GstTagList *tags, const gchar *tag, gpointer user_data) {
  GValue val = { 0, };
  gchar *str;
  gint depth = GPOINTER_TO_INT (user_data);

  gst_tag_list_copy_value (&val, tags, tag);

  if (G_VALUE_HOLDS_STRING (&val))
    str = g_value_dup_string (&val);
  else
    str = gst_value_serialize (&val);

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_tag_get_nick (tag), str);
  g_free (str);

  g_value_unset (&val);
}

/* Print information regarding a stream */
static void print_stream_info (GstDiscovererStreamInfo *info, gint depth) {
  gchar *desc = NULL;
  GstCaps *caps;
  const GstTagList *tags;

  caps = gst_discoverer_stream_info_get_caps (info);

  if (caps) {
    if (gst_caps_is_fixed (caps))
      desc = gst_pb_utils_get_codec_description (caps);
    else
      desc = gst_caps_to_string (caps);
    gst_caps_unref (caps);
  }

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_discoverer_stream_info_get_stream_type_nick (info), (desc ? desc : ""));

  if (desc) {
    g_free (desc);
    desc = NULL;
  }

  tags = gst_discoverer_stream_info_get_tags (info);
  if (tags) {
    g_print ("%*sTags:\n", 2 * (depth + 1), " ");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (depth + 2));
  }
}

/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
  GstDiscovererStreamInfo *next;

  if (!info)
    return;

  print_stream_info (info, depth);

  next = gst_discoverer_stream_info_get_next (info);
  if (next) {
    print_topology (next, depth + 1);
    gst_discoverer_stream_info_unref (next);
  } else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
    GList *tmp, *streams;

    streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
    for (tmp = streams; tmp; tmp = tmp->next) {
      GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
      print_topology (tmpinf, depth + 1);
    }
    gst_discoverer_stream_info_list_free (streams);
  }
}
/* This function is called every time the discoverer has information regarding
 * one of the URIs we provided.*/
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
  GstDiscovererResult result;
  const gchar *uri;
  const GstTagList *tags;
  GstDiscovererStreamInfo *sinfo;

  uri = gst_discoverer_info_get_uri (info);
  result = gst_discoverer_info_get_result (info);
  switch (result) {
    case GST_DISCOVERER_URI_INVALID:
      g_print ("Invalid URI '%s'\n", uri);
      break;
    case GST_DISCOVERER_ERROR:
      g_print ("Discoverer error: %s\n", err->message);
      break;
    case GST_DISCOVERER_TIMEOUT:
      g_print ("Timeout\n");
      break;
    case GST_DISCOVERER_BUSY:
      g_print ("Busy\n");
      break;
    case GST_DISCOVERER_MISSING_PLUGINS:{
      const GstStructure *s;
      gchar *str;

      s = gst_discoverer_info_get_misc (info);
      str = gst_structure_to_string (s);

      g_print ("Missing plugins: %s\n", str);
      g_free (str);
      break;
    }
    case GST_DISCOVERER_OK:
      g_print ("Discovered '%s'\n", uri);
      break;
  }

  if (result != GST_DISCOVERER_OK) {
    g_printerr ("This URI cannot be played\n");
    return;
  }

  /* If we got no error, show the retrieved information */

  g_print ("\nDuration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (gst_discoverer_info_get_duration (info)));

  tags = gst_discoverer_info_get_tags (info);
  if (tags) {
    g_print ("Tags:\n");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (1));
  }

  g_print ("Seekable: %s\n", (gst_discoverer_info_get_seekable (info) ? "yes" : "no"));

  g_print ("\n");

  sinfo = gst_discoverer_info_get_stream_info (info);
  if (!sinfo)
    return;

  g_print ("Stream information:\n");

  print_topology (sinfo, 1);

  gst_discoverer_stream_info_unref (sinfo);

  g_print ("\n");
}

/* This function is called when the discoverer has finished examining
 * all the URIs we provided.*/
static void on_finished_cb (GstDiscoverer *discoverer, CustomData *data) {
  g_print ("Finished discovering\n");

  g_main_loop_quit (data->loop);
}

int main (int argc, char **argv) {
  CustomData data;
  GError *err = NULL;
  gchar *uri = "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm";

  /* if a URI was provided, use it instead of the default one */
  if (argc > 1) {
    uri = argv[1];
  }

  /* Initialize custom data structure */
  memset (&data, 0, sizeof (data));

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  g_print ("Discovering '%s'\n", uri);

  /* Instantiate the Discoverer */
  data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
  if (!data.discoverer) {
    g_print ("Error creating discoverer instance: %s\n", err->message);
    g_clear_error (&err);
    return -1;
  }

  /* Connect to the interesting signals */
  g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
  g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);

  /* Start the discoverer process (nothing to do yet) */
  gst_discoverer_start (data.discoverer);

  /* Add a request to process asynchronously the URI passed through the command line */
  if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
    g_print ("Failed to start discovering URI '%s'\n", uri);
    g_object_unref (data.discoverer);
    return -1;
  }

  /* Create a GLib Main Loop and set it to run, so we can wait for the signals */
  data.loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.loop);

  /* Stop the discoverer process */
  gst_discoverer_stop (data.discoverer);

  /* Free resources */
  g_object_unref (data.discoverer);
  g_main_loop_unref (data.loop);

  return 0;
}
```

> ![](images/icons/emoticons/information.svg)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-9.c -o basic-tutorial-9 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-pbutils-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens the URI passed as the first parameter in the command line (or a default URI if none is provided) and outputs information about it on the screen. If the media is located on the Internet, the application might take a bit to react depending on your connection speed.
>
> Required libraries: `gstreamer-pbutils-1.0` `gstreamer-1.0`

## Walkthrough

These are the main steps to use the `GstDiscoverer`:

``` c
/* Instantiate the Discoverer */
data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
if (!data.discoverer) {
  g_print ("Error creating discoverer instance: %s\n", err->message);
  g_clear_error (&err);
  return -1;
}
```

`gst_discoverer_new()` creates a new Discoverer object. The first
parameter is the timeout per file, in nanoseconds (use the
`GST_SECOND` macro for simplicity).

``` c
/* Connect to the interesting signals */
g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
```

Connect to the interesting signals, as usual. We discuss them in the
snippet for their callbacks.

``` c
/* Start the discoverer process (nothing to do yet) */
gst_discoverer_start (data.discoverer);
```

`gst_discoverer_start()` launches the discovering process, but we have
not provided any URI to discover yet. This is done next:

``` c
/* Add a request to process asynchronously the URI passed through the command line */
if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
  g_print ("Failed to start discovering URI '%s'\n", uri);
  g_object_unref (data.discoverer);
  return -1;
}
```

`gst_discoverer_discover_uri_async()` enqueues the provided URI for
discovery. Multiple URIs can be enqueued with this function. As the
discovery process for each of them finishes, the registered callback
functions will be called.

``` c
/* Create a GLib Main Loop and set it to run, so we can wait for the signals */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
```

The usual GLib main loop is instantiated and executed. We will get out
of it when `g_main_loop_quit()` is called from the
`on_finished_cb` callback.

``` c
/* Stop the discoverer process */
gst_discoverer_stop (data.discoverer);
```

Once we are done with the discoverer, we stop it with
`gst_discoverer_stop()` and unref it with `g_object_unref()`.

Let's now review the callbacks we have registered:

``` c
/* This function is called every time the discoverer has information regarding
 * one of the URIs we provided. */
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
  GstDiscovererResult result;
  const gchar *uri;
  const GstTagList *tags;
  GstDiscovererStreamInfo *sinfo;

  uri = gst_discoverer_info_get_uri (info);
  result = gst_discoverer_info_get_result (info);
```

We got here because the Discoverer has finished working on one URI, and
provides us with a `GstDiscovererInfo` structure containing all the information.

The first step is to retrieve the particular URI this call refers to (in
case we had multiple discovery processes running, which is not the case in
this example) with `gst_discoverer_info_get_uri()` and the discovery
result with `gst_discoverer_info_get_result()`.

``` c
  switch (result) {
    case GST_DISCOVERER_URI_INVALID:
      g_print ("Invalid URI '%s'\n", uri);
      break;
    case GST_DISCOVERER_ERROR:
      g_print ("Discoverer error: %s\n", err->message);
      break;
    case GST_DISCOVERER_TIMEOUT:
      g_print ("Timeout\n");
      break;
    case GST_DISCOVERER_BUSY:
      g_print ("Busy\n");
      break;
    case GST_DISCOVERER_MISSING_PLUGINS:{
      const GstStructure *s;
      gchar *str;

      s = gst_discoverer_info_get_misc (info);
      str = gst_structure_to_string (s);

      g_print ("Missing plugins: %s\n", str);
      g_free (str);
      break;
    }
    case GST_DISCOVERER_OK:
      g_print ("Discovered '%s'\n", uri);
      break;
  }

  if (result != GST_DISCOVERER_OK) {
    g_printerr ("This URI cannot be played\n");
    return;
  }
```

As the code shows, any result other than `GST_DISCOVERER_OK` means that
there has been some kind of problem, and this URI cannot be played. The
reasons can vary, but the enum values are quite explicit
(`GST_DISCOVERER_BUSY` can only happen when in synchronous mode, which
is not used in this example).

If no error happened, information can be retrieved from the
`GstDiscovererInfo` structure with the different
`gst_discoverer_info_get_*` methods (like
`gst_discoverer_info_get_duration()`, for example).

Bits of information which are made of lists, like tags and stream info,
need some extra parsing:

``` c
  tags = gst_discoverer_info_get_tags (info);
  if (tags) {
    g_print ("Tags:\n");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (1));
  }
```

Tags are metadata (labels) attached to the media. They can be examined
with `gst_tag_list_foreach()`, which will call `print_tag_foreach` for
each tag found (the list could also be traversed manually, or a
specific tag could be searched for with
`gst_tag_list_get_string()`). The code for `print_tag_foreach` is pretty
much self-explanatory.

``` c
  sinfo = gst_discoverer_info_get_stream_info (info);
  if (!sinfo)
    return;

  g_print ("Stream information:\n");

  print_topology (sinfo, 1);

  gst_discoverer_stream_info_unref (sinfo);
```

`gst_discoverer_info_get_stream_info()` returns
a `GstDiscovererStreamInfo` structure that is parsed in
the `print_topology` function, and then discarded
with `gst_discoverer_stream_info_unref()`.

``` c
/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
  GstDiscovererStreamInfo *next;

  if (!info)
    return;

  print_stream_info (info, depth);

  next = gst_discoverer_stream_info_get_next (info);
  if (next) {
    print_topology (next, depth + 1);
    gst_discoverer_stream_info_unref (next);
  } else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
    GList *tmp, *streams;

    streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
    for (tmp = streams; tmp; tmp = tmp->next) {
      GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
      print_topology (tmpinf, depth + 1);
    }
    gst_discoverer_stream_info_list_free (streams);
  }
}
```

The `print_stream_info` function's code is also pretty much
self-explanatory: it prints the stream's capabilities and then the
associated caps, using `print_tag_foreach` too.

Then, `print_topology` looks for the next element to display. If
`gst_discoverer_stream_info_get_next()` returns a non-NULL stream info,
it refers to our descendant, which should be displayed. Otherwise, if
we are a container, we recursively call `print_topology` on each of our
children obtained with `gst_discoverer_container_info_get_streams()`.
Otherwise, we are a final stream and do not need to recurse (this part
of the Discoverer API is admittedly a bit obscure).

## Conclusion

This tutorial has shown:

- How to recover information regarding a URI using the `GstDiscoverer`.

- How to find out if a URI is playable by looking at the return code
  obtained with `gst_discoverer_info_get_result()`.

It has been a pleasure having you here, and see you soon!
@@ -0,0 +1,331 @@

# Basic tutorial 7: Multithreading and Pad Availability

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

GStreamer handles multithreading automatically, but, under some
circumstances, you might need to decouple threads manually. This
tutorial shows how to do this and, in addition, completes the exposition
about Pad Availability. More precisely, this document explains:

- How to create new threads of execution for some parts of the
  pipeline

- What Pad Availability is

- How to replicate streams

## Introduction

### Multithreading

GStreamer is a multithreaded framework. This means that, internally, it
creates and destroys threads as it needs them, for example, to decouple
streaming from the application thread. Moreover, plugins are also free
to create threads for their own processing, for example, a video decoder
could create 4 threads to take full advantage of a CPU with 4 cores.

On top of this, when building the pipeline an application can specify
explicitly that a *branch* (a part of the pipeline) runs on a different
thread (for example, to have the audio and video decoders executing
simultaneously).

This is accomplished using the `queue` element, which works as follows.
The sink pad just enqueues data and returns control. On a different
thread, data is dequeued and pushed downstream. This element is also
used for buffering, as seen later in the streaming tutorials. The size
of the queue can be controlled through properties.

### The example pipeline

This example builds the following pipeline:

The source is a synthetic audio signal (a continuous tone) which is
split using a `tee` element (it sends through its source pads everything
it receives through its sink pad). One branch then sends the signal to
the audio card, and the other renders a video of the waveform and sends
it to the screen.

As seen in the picture, queues create a new thread, so this pipeline
runs in 3 threads. Pipelines with more than one sink usually need to be
multithreaded, because, to be synchronized, sinks usually block
execution until all other sinks are ready, and they cannot get ready if
there is only one thread, blocked by the first sink.

### Request pads

In [Basic tutorial 3: Dynamic
pipelines](tutorials/basic/dynamic-pipelines.md) we saw
an element (`uridecodebin`) which had no pads to begin with; they
appeared as data started to flow and the element learned about the
media. These are called **Sometimes Pads**, in contrast with the
regular pads, which are always available and are called **Always Pads**.

The third kind of pad is the **Request Pad**, which is created on
demand. The classical example is the `tee` element, which has one sink
pad and no initial source pads: they need to be requested and then
`tee` adds them. In this way, an input stream can be replicated any
number of times. The disadvantage is that linking elements with Request
Pads is not as automatic as linking Always Pads, as the walkthrough for
this example will show.

Also, to request (or release) pads in the `PLAYING` or `PAUSED` states, you
need to take additional precautions (Pad blocking) which are not described
in this tutorial. It is safe to request (or release) pads in the `NULL` or
`READY` states, though.

Without further delay, let's see the code.

## Simple multithreaded example

Copy this code into a text file named `basic-tutorial-7.c` (or find it
in your GStreamer installation).

**basic-tutorial-7.c**

``` c
#include <gst/gst.h>

int main(int argc, char *argv[]) {
  GstElement *pipeline, *audio_source, *tee, *audio_queue, *audio_convert, *audio_resample, *audio_sink;
  GstElement *video_queue, *visual, *video_convert, *video_sink;
  GstBus *bus;
  GstMessage *msg;
  GstPad *tee_audio_pad, *tee_video_pad;
  GstPad *queue_audio_pad, *queue_video_pad;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
  tee = gst_element_factory_make ("tee", "tee");
  audio_queue = gst_element_factory_make ("queue", "audio_queue");
  audio_convert = gst_element_factory_make ("audioconvert", "audio_convert");
  audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
  audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  video_queue = gst_element_factory_make ("queue", "video_queue");
  visual = gst_element_factory_make ("wavescope", "visual");
  video_convert = gst_element_factory_make ("videoconvert", "csp");
  video_sink = gst_element_factory_make ("autovideosink", "video_sink");

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !audio_source || !tee || !audio_queue || !audio_convert || !audio_resample || !audio_sink ||
      !video_queue || !visual || !video_convert || !video_sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Configure elements */
  g_object_set (audio_source, "freq", 215.0f, NULL);
  g_object_set (visual, "shader", 0, "style", 1, NULL);

  /* Link all elements that can be automatically linked because they have "Always" pads */
  gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_resample, audio_sink,
      video_queue, visual, video_convert, video_sink, NULL);
  if (gst_element_link_many (audio_source, tee, NULL) != TRUE ||
      gst_element_link_many (audio_queue, audio_convert, audio_resample, audio_sink, NULL) != TRUE ||
      gst_element_link_many (video_queue, visual, video_convert, video_sink, NULL) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Manually link the Tee, which has "Request" pads */
  tee_audio_pad = gst_element_request_pad_simple (tee, "src_%u");
  g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
  queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
  tee_video_pad = gst_element_request_pad_simple (tee, "src_%u");
  g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
  queue_video_pad = gst_element_get_static_pad (video_queue, "sink");
  if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK) {
    g_printerr ("Tee could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }
  gst_object_unref (queue_audio_pad);
  gst_object_unref (queue_video_pad);

  /* Start playing the pipeline */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Release the request pads from the Tee, and unref them */
  gst_element_release_request_pad (tee, tee_audio_pad);
  gst_element_release_request_pad (tee, tee_video_pad);
  gst_object_unref (tee_audio_pad);
  gst_object_unref (tee_video_pad);

  /* Free resources */
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);

  gst_object_unref (pipeline);
  return 0;
}
```

> 
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-7.c -o basic-tutorial-7 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial plays an audible tone through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid but, due to the way the window is refreshed, it might not appear so.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

``` c
  /* Create the elements */
  audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
  tee = gst_element_factory_make ("tee", "tee");
  audio_queue = gst_element_factory_make ("queue", "audio_queue");
  audio_convert = gst_element_factory_make ("audioconvert", "audio_convert");
  audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
  audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  video_queue = gst_element_factory_make ("queue", "video_queue");
  visual = gst_element_factory_make ("wavescope", "visual");
  video_convert = gst_element_factory_make ("videoconvert", "video_convert");
  video_sink = gst_element_factory_make ("autovideosink", "video_sink");
```

All the elements in the above picture are instantiated here:

`audiotestsrc` produces a synthetic tone. `wavescope` consumes an audio
signal and renders a waveform as if it were an (admittedly cheap)
oscilloscope. We have already worked with the `autoaudiosink` and
`autovideosink`.

The conversion elements (`audioconvert`, `audioresample` and
`videoconvert`) are necessary to guarantee that the pipeline can be
linked. Indeed, the Capabilities of the audio and video sinks depend on
the hardware, and you do not know at design time whether they will match the
Caps produced by `audiotestsrc` and `wavescope`. If the Caps
match, though, these elements act in “pass-through” mode and do not
modify the signal, having negligible impact on performance.

``` c
  /* Configure elements */
  g_object_set (audio_source, "freq", 215.0f, NULL);
  g_object_set (visual, "shader", 0, "style", 1, NULL);
```

Small adjustments for better demonstration: the “freq” property of
`audiotestsrc` controls the frequency of the wave (215Hz makes the wave
appear almost stationary in the window), and this style and shader for
`wavescope` make the wave continuous. Use the `gst-inspect-1.0` tool
described in [Basic tutorial 10: GStreamer
tools](tutorials/basic/gstreamer-tools.md) to learn all
the properties of these elements.

``` c
  /* Link all elements that can be automatically linked because they have "Always" pads */
  gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_resample, audio_sink,
      video_queue, visual, video_convert, video_sink, NULL);
  if (gst_element_link_many (audio_source, tee, NULL) != TRUE ||
      gst_element_link_many (audio_queue, audio_convert, audio_resample, audio_sink, NULL) != TRUE ||
      gst_element_link_many (video_queue, visual, video_convert, video_sink, NULL) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }
```

This code block adds all elements to the pipeline and then links the
ones that can be automatically linked (the ones with Always Pads, as the
comment says).

> 
> `gst_element_link_many()` can actually link elements with Request Pads. It internally requests the Pads, so you do not have to worry about whether the elements being linked have Always or Request Pads. Strange as it might seem, this is actually inconvenient, because you still need to release the requested Pads afterwards, and, if a Pad was requested automatically by `gst_element_link_many()`, it is easy to forget. Stay out of trouble by always requesting Request Pads manually, as shown in the next code block.

``` c
  /* Manually link the Tee, which has "Request" pads */
  tee_audio_pad = gst_element_request_pad_simple (tee, "src_%u");
  g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
  queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
  tee_video_pad = gst_element_request_pad_simple (tee, "src_%u");
  g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
  queue_video_pad = gst_element_get_static_pad (video_queue, "sink");
  if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK) {
    g_printerr ("Tee could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }
  gst_object_unref (queue_audio_pad);
  gst_object_unref (queue_video_pad);
```

To link Request Pads, they need to be obtained by “requesting” them from
the element. An element might be able to produce different kinds of
Request Pads, so, when requesting them, the desired Pad Template name must be
provided.
In the documentation for the `tee` element we see that it has two pad
templates named “sink” (for its sink Pads) and “src_%u” (for the Request
Pads). We request two Pads from the tee (for the
audio and video branches) with `gst_element_request_pad_simple()`.

We then obtain the Pads from the downstream elements to which these
Request Pads need to be linked. These are normal Always Pads, so we
obtain them with `gst_element_get_static_pad()`.

Finally, we link the pads with `gst_pad_link()`. This is the function
that `gst_element_link()` and `gst_element_link_many()` use internally.

The sink Pads we have obtained need to be released with
`gst_object_unref()`. The Request Pads will be released when we no
longer need them, at the end of the program.

We then set the pipeline to playing as usual, and wait until an error
message or an EOS is produced. The only thing left to do is clean up the
requested Pads:

``` c
  /* Release the request pads from the Tee, and unref them */
  gst_element_release_request_pad (tee, tee_audio_pad);
  gst_element_release_request_pad (tee, tee_video_pad);
  gst_object_unref (tee_audio_pad);
  gst_object_unref (tee_video_pad);
```

`gst_element_release_request_pad()` releases the pad from the `tee`, but
it still needs to be unreferenced (freed) with `gst_object_unref()`.

## Conclusion

This tutorial has shown:

- How to make parts of a pipeline run on a different thread by using
  `queue` elements.

- What a Request Pad is and how to link elements with Request Pads,
  with `gst_element_request_pad_simple()`, `gst_pad_link()` and
  `gst_element_release_request_pad()`.

- How to have the same stream available in different branches by using
  `tee` elements.

The next tutorial builds on top of this one to show how data can be
manually injected into and extracted from a running pipeline.

It has been a pleasure having you here, and see you soon!
@@ -0,0 +1,179 @@
# Basic tutorial 16: Platform-specific elements

## Goal

Even though GStreamer is a multiplatform framework, not all the elements
are available on all platforms. For example, the video sinks
depend heavily on the underlying windowing system, and a different one
needs to be selected depending on the platform. You normally do not need
to worry about this when using elements like `playbin` or
`autovideosink`, but, for those cases when you need to use one of the
sinks that are only available on specific platforms, this tutorial points
out some of their peculiarities.

## Cross Platform

### `glimagesink`

This video sink is based on
[OpenGL](http://en.wikipedia.org/wiki/OpenGL) or [OpenGL ES](https://en.wikipedia.org/wiki/OpenGL_ES). It supports rescaling
and filtering of the scaled image to alleviate aliasing. It implements
the VideoOverlay interface, so the video window can be re-parented
(embedded inside other windows). This is the video sink recommended on
most platforms. In particular, on Android and iOS, it is the only
available video sink. It can be decomposed into
`glupload ! glcolorconvert ! glimagesinkelement` to insert further OpenGL
hardware accelerated processing into the pipeline.

## Linux

### `ximagesink`

A standard RGB-only X-based video sink. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside
other windows). It does not support scaling or color formats other
than RGB; these have to be handled by other means (using the
`videoscale` element, for example).

### `xvimagesink`

An X-based video sink, using the [X Video
Extension](http://en.wikipedia.org/wiki/X_video_extension) (Xv). It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows). It can perform scaling
efficiently, on the GPU. It is only available if the hardware and
corresponding drivers support the Xv extension.

### `alsasink`

This audio sink outputs to the sound card via
[ALSA](http://www.alsa-project.org/) (Advanced Linux Sound
Architecture). This sink is available on almost every Linux platform. It
is often seen as a “low level” interface to the sound card, and can be
complicated to configure (see the comment on
[](tutorials/playback/digital-audio-pass-through.md)).

### `pulsesink`

This sink plays audio to a [PulseAudio](http://www.pulseaudio.org/)
server. It is a higher level abstraction of the sound card than ALSA,
and is therefore easier to use and offers more advanced features. It has
been known to be unstable on some older Linux distributions, though.

## Mac OS X

### `osxvideosink`

This is the video sink available to GStreamer on Mac OS X. It is also
possible to draw using `glimagesink` using OpenGL.

### `osxaudiosink`

This is the only audio sink available to GStreamer on Mac OS X.

## Windows

### `directdrawsink`

This is the oldest of the Windows video sinks, based on [Direct
Draw](http://en.wikipedia.org/wiki/DirectDraw). It requires DirectX 7,
so it is available on almost every current Windows platform. It supports
rescaling and filtering of the scaled image to alleviate aliasing.

### `dshowvideosink`

This video sink is based on [Direct
Show](http://en.wikipedia.org/wiki/Direct_Show). It can use different
rendering back-ends, like
[EVR](http://en.wikipedia.org/wiki/Enhanced_Video_Renderer),
[VMR9](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters)
or
[VMR7](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters),
EVR only being available on Windows Vista or more recent. It supports
rescaling and filtering of the scaled image to alleviate aliasing. It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows).

### `d3dvideosink`

This video sink is based on
[Direct3D](http://en.wikipedia.org/wiki/Direct3D) and it’s the most
recent Windows video sink. It supports rescaling and filtering of the
scaled image to alleviate aliasing. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside other
windows).

### `directsoundsink`

This is the default audio sink for Windows, based on [Direct
Sound](http://en.wikipedia.org/wiki/DirectSound), which is available in
all Windows versions.

### `dshowdecwrapper`

[Direct Show](http://en.wikipedia.org/wiki/Direct_Show) is a multimedia
framework similar to GStreamer. They are different enough, though, so
that their pipelines cannot be interconnected. However, through this
element, GStreamer can benefit from the decoding elements present in
Direct Show. `dshowdecwrapper` wraps multiple Direct Show decoders so
they can be embedded in a GStreamer pipeline. Use the `gst-inspect-1.0` tool
(see [](tutorials/basic/gstreamer-tools.md)) to see the
available decoders.

## Android

### `openslessink`

This is the only audio sink available to GStreamer on Android. It is
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).

### `openslessrc`

This is the only audio source available to GStreamer on Android. It is
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).

### `androidmedia`

[android.media.MediaCodec](http://developer.android.com/reference/android/media/MediaCodec.html)
is an Android-specific API to access the codecs that are available on
the device, including hardware codecs. It has been available since API level
16 (Jelly Bean), and GStreamer can use it via the androidmedia plugin
for audio and video decoding. On Android, attaching the hardware
decoder to the `glimagesink` element can produce a high-performance
zero-copy decodebin pipeline.
|
||||
|
||||
### `ahcsrc`
|
||||
|
||||
This video source can capture from the cameras on Android devices, it is part
|
||||
of the androidmedia plugin and uses the [android.hardware.Camera API](https://developer.android.com/reference/android/hardware/Camera.html).
|
||||
|
||||
## iOS

### `osxaudiosink`

This is the only audio sink available to GStreamer on iOS.

### `iosassetsrc`

Source element to read iOS assets, that is, documents stored in the
Library (like photos, music and videos). It can be instantiated
automatically by `playbin` when URIs use the
`assets-library://` scheme.

### `iosavassetsrc`

Source element to read and decode iOS audiovisual assets, that is,
documents stored in the Library (like photos, music and videos). It can
be instantiated automatically by `playbin` when URIs use the
`ipod-library://` scheme. Decoding is performed by the system, so
dedicated hardware will be used if available.

## Conclusion

This tutorial has shown a few specific details about some GStreamer
elements which are not available on all platforms. You do not have to
worry about them when using multiplatform elements like `playbin` or
`autovideosink`, but it is good to know their individual quirks when
instantiating them manually.

It has been a pleasure having you here, and see you soon!

# Basic tutorial 13: Playback speed

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

Fast-forward, reverse-playback and slow-motion are all techniques
collectively known as *trick modes*, and they all have in common that
they modify the normal playback rate. This tutorial shows how to achieve
these effects and adds frame-stepping into the deal. In particular, it
shows:

- How to change the playback rate, faster and slower than normal,
  forward and backwards.
- How to advance a video frame-by-frame.

## Introduction

Fast-forward is the technique that plays a media at a speed higher than
its normal (intended) speed, whereas slow-motion uses a speed lower than
the intended one. Reverse playback does the same thing but backwards,
from the end of the stream to the beginning.

All that these techniques do is change the playback rate, which is a
variable equal to 1.0 for normal playback, greater than 1.0 (in absolute
value) for fast modes, lower than 1.0 (in absolute value) for slow modes,
positive for forward playback and negative for reverse playback.

GStreamer provides two mechanisms to change the playback rate: Step
Events and Seek Events. Step Events allow skipping a given amount of
media besides changing the subsequent playback rate (only to positive
values). Seek Events, additionally, allow jumping to any position in the
stream and setting positive and negative playback rates.

In [](tutorials/basic/time-management.md) seek
events have already been shown, using a helper function to hide their
complexity. This tutorial explains a bit more how to use these events.

Step Events are a more convenient way of changing the playback rate,
due to the reduced number of parameters needed to create them;
however, they have some downsides, so Seek Events are used in this
tutorial instead. Step Events only affect the sink (at the end of the
pipeline), so they will only work if the rest of the pipeline can
support going at a different speed; Seek Events, in contrast, go all the
way through the pipeline, so every element can react to them. The upside
of Step Events is that they act much faster. Step Events are also
unable to change the playback direction.

To use these events, they are created and then passed onto the pipeline,
where they propagate upstream until they reach an element that can
handle them. If an event is passed onto a bin element like `playbin`,
it will simply feed the event to all its sinks, which will result in
multiple seeks being performed. The common approach is to retrieve one
of `playbin`’s sinks through the `video-sink` or
`audio-sink` properties and feed the event directly into the sink.

Frame stepping is a technique that allows playing a video frame by
frame. It is implemented by pausing the pipeline, and then sending Step
Events to skip one frame each time.

## A trick mode player

Copy this code into a text file named `basic-tutorial-13.c`.

**basic-tutorial-13.c**

{{ tutorials/basic-tutorial-13.c }}

> ![information](images/icons/emoticons/information.svg)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-13.c -o basic-tutorial-13 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The console shows the available commands, composed of a single upper-case or lower-case letter, which you should input followed by the Enter key.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

There is nothing new in the initialization code in the main function: a
`playbin` pipeline is instantiated, an I/O watch is installed to track
keystrokes and a GLib main loop is executed.

Then, in the keyboard handler function:

``` c
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
    return TRUE;
  }

  switch (g_ascii_tolower (str[0])) {
  case 'p':
    data->playing = !data->playing;
    gst_element_set_state (data->pipeline, data->playing ? GST_STATE_PLAYING : GST_STATE_PAUSED);
    g_print ("Setting state to %s\n", data->playing ? "PLAYING" : "PAUSE");
    break;
```

The PLAYING/PAUSED toggle is handled with `gst_element_set_state()` as in
previous tutorials.

``` c
case 's':
  if (g_ascii_isupper (str[0])) {
    data->rate *= 2.0;
  } else {
    data->rate /= 2.0;
  }
  send_seek_event (data);
  break;
case 'd':
  data->rate *= -1.0;
  send_seek_event (data);
  break;
```

Use ‘S’ and ‘s’ to double or halve the current playback rate, and ‘d’ to
reverse the current playback direction. In both cases, the
`rate` variable is updated and `send_seek_event` is called. Let’s
review this function.

``` c
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
  gint64 position;
  GstEvent *seek_event;

  /* Obtain the current position, needed for the seek event */
  if (!gst_element_query_position (data->pipeline, GST_FORMAT_TIME, &position)) {
    g_printerr ("Unable to retrieve current position.\n");
    return;
  }
```

This function creates a new Seek Event and sends it to the pipeline to
update the rate. First, the current position is recovered with
`gst_element_query_position()`. This is needed because the Seek Event
jumps to another position in the stream and, since we do not actually
want to move, we jump to the current position. Using a Step Event would
be simpler, but this event is not currently fully functional, as
explained in the Introduction.

``` c
/* Create the seek event */
if (data->rate > 0) {
  seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
      GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_END, 0);
} else {
  seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
      GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
}
```

The Seek Event is created with `gst_event_new_seek()`. Its parameters
are, basically, the new rate, the new start position and the new stop
position. Regardless of the playback direction, the start position must
be smaller than the stop position, so the two playback directions are
treated differently.

``` c
if (data->video_sink == NULL) {
  /* If we have not done so, obtain the sink through which we will send the seek events */
  g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
}
```

As explained in the Introduction, to avoid performing multiple Seeks,
the Event is sent to only one sink, in this case, the video sink. It is
obtained from `playbin` through the `video-sink` property. It is read at
this time instead of at initialization time because the actual sink may
change depending on the media contents, and this won’t be known until
the pipeline is `PLAYING` and some media has been read.

``` c
/* Send the event */
gst_element_send_event (data->video_sink, seek_event);
```

The new Event is finally sent to the selected sink with
`gst_element_send_event()`.

Back to the keyboard handler, we still miss the frame stepping code,
which is really simple:

``` c
case 'n':
  if (data->video_sink == NULL) {
    /* If we have not done so, obtain the sink through which we will send the step events */
    g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
  }

  gst_element_send_event (data->video_sink,
      gst_event_new_step (GST_FORMAT_BUFFERS, 1, ABS (data->rate), TRUE, FALSE));
  g_print ("Stepping one frame\n");
  break;
```

A new Step Event is created with `gst_event_new_step()`, whose
parameters basically specify the amount to skip (1 frame in the example)
and the new rate (which we do not change).

The video sink is grabbed from `playbin` in case we didn’t have it yet,
just like before.

And with this we are done. When testing this tutorial, keep in mind that
backward playback is not optimal in many elements.

> ![warning](images/icons/emoticons/warning.svg)
>
> Changing the playback rate might only work with local files. If you cannot modify it, try changing the URI passed to `playbin` in line 114 to a local URI, starting with `file:///`.

## Conclusion

This tutorial has shown:

- How to change the playback rate using a Seek Event, created with
  `gst_event_new_seek()` and fed to the pipeline
  with `gst_element_send_event()`.
- How to advance a video frame-by-frame by using Step Events, created
  with `gst_event_new_step()`.

It has been a pleasure having you here, and see you soon!

# Basic tutorial 8: Short-cutting the pipeline

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

Pipelines constructed with GStreamer do not need to be completely
closed. Data can be injected into the pipeline and extracted from it at
any time, in a variety of ways. This tutorial shows:

- How to inject external data into a general GStreamer pipeline.
- How to extract data from a general GStreamer pipeline.
- How to access and manipulate this data.

[](tutorials/playback/short-cutting-the-pipeline.md) explains
how to achieve the same goals in a playbin-based pipeline.

## Introduction

Applications can interact with the data flowing through a GStreamer
pipeline in several ways. This tutorial describes the easiest one, since
it uses elements that have been created for this sole purpose.

The element used to inject application data into a GStreamer pipeline is
`appsrc`, and its counterpart, used to extract GStreamer data back to
the application, is `appsink`. To avoid confusing the names, think of it
from GStreamer's point of view: `appsrc` is just a regular source that
provides data magically fallen from the sky (provided by the
application, actually). `appsink` is a regular sink, where the data
flowing through a GStreamer pipeline goes to die (it is recovered by the
application, actually).

`appsrc` and `appsink` are so versatile that they offer their own API
(see their documentation), which can be accessed by linking against the
`gstreamer-app` library. In this tutorial, however, we will use a
simpler approach and control them through signals.

`appsrc` can work in a variety of modes: in **pull** mode, it requests
data from the application every time it needs it. In **push** mode, the
application pushes data at its own pace. Furthermore, in push mode, the
application can choose to be blocked in the push function when enough
data has already been provided, or it can listen to the
`enough-data` and `need-data` signals to control flow. This example
implements the latter approach. Information regarding the other methods
can be found in the `appsrc` documentation.

### Buffers

Data travels through a GStreamer pipeline in chunks called **buffers**.
Since this example produces and consumes data, we need to know about
`GstBuffer`s.

Source Pads produce buffers that are consumed by Sink Pads; GStreamer
takes these buffers and passes them from element to element.

A buffer simply represents a unit of data; do not assume that all
buffers will have the same size, or represent the same amount of time.
Neither should you assume that if a single buffer enters an element, a
single buffer will come out. Elements are free to do with the received
buffers as they please. `GstBuffer`s may also contain more than one
actual memory buffer. Actual memory buffers are abstracted away using
`GstMemory` objects, and a `GstBuffer` can contain multiple `GstMemory` objects.

Every buffer has an attached time-stamp and duration that describe at
which moment the content of the buffer should be decoded, rendered or
displayed. Time stamping is a very complex and delicate subject, but
this simplified view should suffice for now.

As an example, a `filesrc` (a GStreamer element that reads files)
produces buffers with the “ANY” caps and no time-stamping information.
After demuxing (see [](tutorials/basic/dynamic-pipelines.md))
buffers can have some specific caps, for example “video/x-h264”. After
decoding, each buffer will contain a single video frame with raw caps
(for example, “video/x-raw-yuv”) and very precise time stamps indicating
when that frame should be displayed.

### This tutorial

This tutorial expands [](tutorials/basic/multithreading-and-pad-availability.md) in
two ways: firstly, the `audiotestsrc` is replaced by an `appsrc` that
will generate the audio data. Secondly, a new branch is added to the
`tee` so data going into the audio sink and the wave display is also
replicated into an `appsink`. The `appsink` uploads the information back
into the application, which then just notifies the user that data has
been received, but it could obviously perform more complex tasks.

![](images/tutorials/basic-tutorial-8.png)

## A crude waveform generator

Copy this code into a text file named `basic-tutorial-8.c` (or find it
in your GStreamer installation).

``` c
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
  GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
  GstElement *app_queue, *app_sink;

  guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
  gfloat a, b, c, d;     /* For waveform generation */

  guint sourceid;        /* To control the GSource */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  GstMapInfo map;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychedelic waveforms */
  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  raw = (gint16 *)map.data;
  data->c += data->d;
  data->d -= data->c / 1000;
  freq = 1100 + 1000 * data->d;
  for (i = 0; i < num_samples; i++) {
    data->a += data->b;
    data->b -= data->a / freq;
    raw[i] = (gint16)(500 * data->a);
  }
  gst_buffer_unmap (buffer, &map);
  data->num_samples += num_samples;

  /* Push the buffer into the appsrc */
  g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

  /* Free the buffer now that we are done with it */
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* We got some error, stop sending data */
    return FALSE;
  }

  return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}

/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample;

  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }

  return GST_FLOW_ERROR;
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  g_main_loop_quit (data->main_loop);
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
  GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
  GstAudioInfo info;
  GstCaps *audio_caps;
  GstBus *bus;

  /* Initialize custom data structure */
  memset (&data, 0, sizeof (data));
  data.b = 1; /* For waveform generation */
  data.d = 1;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.app_source = gst_element_factory_make ("appsrc", "audio_source");
  data.tee = gst_element_factory_make ("tee", "tee");
  data.audio_queue = gst_element_factory_make ("queue", "audio_queue");
  data.audio_convert1 = gst_element_factory_make ("audioconvert", "audio_convert1");
  data.audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
  data.audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  data.video_queue = gst_element_factory_make ("queue", "video_queue");
  data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
  data.visual = gst_element_factory_make ("wavescope", "visual");
  data.video_convert = gst_element_factory_make ("videoconvert", "video_convert");
  data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
  data.app_queue = gst_element_factory_make ("queue", "app_queue");
  data.app_sink = gst_element_factory_make ("appsink", "app_sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
      !data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
      !data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Configure wavescope */
  g_object_set (data.visual, "shader", 0, "style", 0, NULL);

  /* Configure appsrc */
  gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
  audio_caps = gst_audio_info_to_caps (&info);
  g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
  g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

  /* Configure appsink */
  g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
  g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
  gst_caps_unref (audio_caps);

  /* Link all elements that can be automatically linked because they have "Always" pads */
  gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
      data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
      data.app_sink, NULL);
  if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
      gst_element_link_many (data.audio_queue, data.audio_convert1, data.audio_resample, data.audio_sink, NULL) != TRUE ||
      gst_element_link_many (data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, NULL) != TRUE ||
      gst_element_link_many (data.app_queue, data.app_sink, NULL) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Manually link the Tee, which has "Request" pads */
  tee_audio_pad = gst_element_request_pad_simple (data.tee, "src_%u");
  g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
  queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");
  tee_video_pad = gst_element_request_pad_simple (data.tee, "src_%u");
  g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
  queue_video_pad = gst_element_get_static_pad (data.video_queue, "sink");
  tee_app_pad = gst_element_request_pad_simple (data.tee, "src_%u");
  g_print ("Obtained request pad %s for app branch.\n", gst_pad_get_name (tee_app_pad));
  queue_app_pad = gst_element_get_static_pad (data.app_queue, "sink");
  if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_app_pad, queue_app_pad) != GST_PAD_LINK_OK) {
    g_printerr ("Tee could not be linked\n");
    gst_object_unref (data.pipeline);
    return -1;
  }
  gst_object_unref (queue_audio_pad);
  gst_object_unref (queue_video_pad);
  gst_object_unref (queue_app_pad);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.pipeline);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  gst_object_unref (bus);

  /* Start playing the pipeline */
  gst_element_set_state (data.pipeline, GST_STATE_PLAYING);

  /* Create a GLib Main Loop and set it to run */
  data.main_loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.main_loop);

  /* Release the request pads from the Tee, and unref them */
  gst_element_release_request_pad (data.tee, tee_audio_pad);
  gst_element_release_request_pad (data.tee, tee_video_pad);
  gst_element_release_request_pad (data.tee, tee_app_pad);
  gst_object_unref (tee_audio_pad);
  gst_object_unref (tee_video_pad);
  gst_object_unref (tee_app_pad);

  /* Free resources */
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}
```

> 
|
||||
> Need help?
|
||||
>
|
||||
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
|
||||
>
|
||||
> `` gcc basic-tutorial-8.c -o basic-tutorial-8 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0` ``
|
||||
>
|
||||
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
|
||||
>
|
||||
> This tutorial plays an audible tone for varying frequency through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window might not appear so.
|
||||
>
|
||||
> Required libraries: `gstreamer-1.0`
|
||||
|
||||
## Walkthrough

The code to create the pipeline (Lines 131 to 205) is an enlarged
version of [Basic tutorial 7: Multithreading and Pad
Availability](tutorials/basic/multithreading-and-pad-availability.md).
It involves instantiating all the elements, linking the elements with
Always Pads, and manually linking the Request Pads of the `tee` element.

Regarding the configuration of the `appsrc` and `appsink` elements:

``` c
/* Configure appsrc */
gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
audio_caps = gst_audio_info_to_caps (&info);
g_object_set (data.app_source, "caps", audio_caps, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
```

The first property that needs to be set on the `appsrc` is `caps`. It
specifies the kind of data that the element is going to produce, so
GStreamer can check if linking with downstream elements is possible
(that is, whether the downstream elements will understand this kind of
data). This property must be a `GstCaps` object, which is easily built
from a string with `gst_caps_from_string()`.

We then connect to the `need-data` and `enough-data` signals. These are
fired by `appsrc` when its internal queue of data is running low or
almost full, respectively. We will use these signals to start and stop
(respectively) our signal generation process.

``` c
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
gst_caps_unref (audio_caps);
```

Regarding the `appsink` configuration, we connect to the
`new-sample` signal, which is emitted every time the sink receives a
buffer. Also, the signal emission needs to be enabled through the
`emit-signals` property, because, by default, it is disabled.

Starting the pipeline, waiting for messages and final cleanup is done as
usual. Let's review the callbacks we have just
registered:

``` c
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}
```

This function is called when the internal queue of `appsrc` is about to
starve (run out of data). The only thing we do here is register a GLib
idle function with `g_idle_add()` that feeds data to `appsrc` until it
is full again. A GLib idle function is a method that GLib will call from
its main loop whenever it is “idle”, that is, when it has no
higher-priority tasks to perform. It requires a GLib `GMainLoop` to be
instantiated and running, obviously.

This is only one of the multiple approaches that `appsrc` allows. In
particular, buffers do not need to be fed into `appsrc` from the main
thread using GLib, and you do not need to use the `need-data` and
`enough-data` signals to synchronize with `appsrc` (although this is
arguably the most convenient approach).

We take note of the sourceid that `g_idle_add()` returns, so we can
disable it later.

``` c
|
||||
/* This callback triggers when appsrc has enough data and we can stop sending.
|
||||
* We remove the idle handler from the mainloop */
|
||||
static void stop_feed (GstElement *source, CustomData *data) {
|
||||
if (data->sourceid != 0) {
|
||||
g_print ("Stop feeding\n");
|
||||
g_source_remove (data->sourceid);
|
||||
data->sourceid = 0;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
This function is called when the internal queue of `appsrc` is full
|
||||
enough so we stop pushing data. Here we simply remove the idle function
|
||||
by using `g_source_remove()` (The idle function is implemented as a
|
||||
`GSource`).
|
||||
|
||||
``` c
|
||||
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
|
||||
* The ide handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
|
||||
* and is removed when appsrc has enough data (enough-data signal).
|
||||
*/
|
||||
static gboolean push_data (CustomData *data) {
|
||||
GstBuffer *buffer;
|
||||
GstFlowReturn ret;
|
||||
int i;
|
||||
gint16 *raw;
|
||||
gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
|
||||
gfloat freq;
|
||||
|
||||
/* Create a new empty buffer */
|
||||
buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);
|
||||
|
||||
/* Set its timestamp and duration */
|
||||
GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
|
||||
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);
|
||||
|
||||
/* Generate some psychodelic waveforms */
|
||||
raw = (gint16 *)GST_BUFFER_DATA (buffer);
|
||||
```
|
||||
|
||||
This is the function that feeds `appsrc`. It will be called by GLib at
|
||||
times and rates which are out of our control, but we know that we will
|
||||
disable it when its job is done (when the queue in `appsrc` is full).
|
||||
|
||||
Its first task is to create a new buffer with a given size (in this
|
||||
example, it is arbitrarily set to 1024 bytes) with
|
||||
`gst_buffer_new_and_alloc()`.
|
||||
|
||||
We count the number of samples that we have generated so far with the
|
||||
`CustomData.num_samples` variable, so we can time-stamp this buffer
|
||||
using the `GST_BUFFER_TIMESTAMP` macro in `GstBuffer`.
|
||||
|
||||
Since we are producing buffers of the same size, their duration is the
|
||||
same and is set using the `GST_BUFFER_DURATION` in `GstBuffer`.
|
||||
|
||||
`gst_util_uint64_scale()` is a utility function that scales (multiply
|
||||
and divide) numbers which can be large, without fear of overflows.

The bytes that form the buffer can be accessed with `GST_BUFFER_DATA` in
`GstBuffer` (be careful not to write past the end of the buffer: you
allocated it, so you know its size).

We will skip over the waveform generation, since it is outside the scope
of this tutorial (it is simply a funny way of generating a pretty
psychedelic wave).

``` c
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
```

Once we have the buffer ready, we pass it to `appsrc` with the
`push-buffer` action signal (see information box at the end of [](tutorials/playback/playbin-usage.md)), and then
`gst_buffer_unref()` it since we no longer need it.

``` c
/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample;
  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }
  return GST_FLOW_ERROR;
}
```

Finally, this is the function that gets called when the
`appsink` receives a buffer. We use the `pull-sample` action signal to
retrieve the buffer and then just print some indicator on the screen. We
can retrieve the data pointer using the `GST_BUFFER_DATA` macro and the
data size using the `GST_BUFFER_SIZE` macro in `GstBuffer`. Remember
that this buffer does not have to match the buffer that we produced in
the `push_data` function: any element in the path could have altered the
buffers in some way (not in this example: there is only a `tee` in the
path between `appsrc` and `appsink`, and it does not change the content
of the buffers).

We then `gst_sample_unref()` the retrieved sample, and this tutorial is
done.

## Conclusion

This tutorial has shown how applications can:

- Inject data into a pipeline using the `appsrc` element.
- Retrieve data from a pipeline using the `appsink` element.
- Manipulate this data by accessing the `GstBuffer`.

In a playbin-based pipeline, the same goals are achieved in a slightly
different way. [](tutorials/playback/short-cutting-the-pipeline.md) shows
how to do it.

It has been a pleasure having you here, and see you soon!
# Basic tutorial 12: Streaming

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

Playing media straight from the Internet without storing it locally is
known as Streaming. We have been doing it throughout the tutorials
whenever we used a URI starting with `http://`. This tutorial shows a
couple of additional points to keep in mind when streaming. In
particular:

- How to enable buffering (to alleviate network problems)
- How to recover from interruptions (lost clock)

## Introduction

When streaming, media chunks are decoded and queued for presentation as
soon as they arrive from the network. This means that if a chunk is
delayed (which is not an uncommon situation at all on the Internet) the
presentation queue might run dry and media playback could stall.

The universal solution is to build a “buffer”, that is, to allow a
certain number of media chunks to be queued before starting playback.
This way, playback starts a bit later, but, if some chunks are late,
playback is not impacted because there are more chunks waiting in the
queue.

As it turns out, this solution is already implemented in GStreamer, but
the previous tutorials have not been benefiting from it. Some elements,
like the `queue2` and `multiqueue` found inside `playbin`, are capable
of building this buffer and posting bus messages regarding the buffer
level (the state of the queue). An application wanting more network
resilience should then listen to these messages and pause playback if
the buffer level is not high enough (usually, whenever it is below
100%).

To achieve synchronization among multiple sinks (for example an audio
and a video sink) a global clock is used. This clock is selected by
GStreamer among all elements which can provide one. Under some
circumstances, for example, an RTP source switching streams or changing
the output device, this clock can be lost and a new one needs to be
selected. This happens mostly when dealing with streaming, so the
process is explained in this tutorial.

When the clock is lost, the application receives a message on the bus;
to select a new one, the application just needs to set the pipeline to
`PAUSED` and then to `PLAYING` again.

## A network-resilient example

Copy this code into a text file named `basic-tutorial-12.c`.

**basic-tutorial-12.c**

``` c
#include <gst/gst.h>
#include <string.h>

typedef struct _CustomData {
  gboolean is_live;
  GstElement *pipeline;
  GMainLoop *loop;
} CustomData;

static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err;
      gchar *debug;

      gst_message_parse_error (msg, &err, &debug);
      g_print ("Error: %s\n", err->message);
      g_error_free (err);
      g_free (debug);

      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    }
    case GST_MESSAGE_EOS:
      /* end-of-stream */
      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_BUFFERING: {
      gint percent = 0;

      /* If the stream is live, we do not care about buffering. */
      if (data->is_live) break;

      gst_message_parse_buffering (msg, &percent);
      g_print ("Buffering (%3d%%)\r", percent);
      /* Wait until buffering is complete before start/resume playing */
      if (percent < 100)
        gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      else
        gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    }
    case GST_MESSAGE_CLOCK_LOST:
      /* Get a new clock */
      gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    default:
      /* Unhandled message */
      break;
  }
}

int main(int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstStateChangeReturn ret;
  GMainLoop *main_loop;
  CustomData data;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  } else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
    data.is_live = TRUE;
  }

  main_loop = g_main_loop_new (NULL, FALSE);
  data.loop = main_loop;
  data.pipeline = pipeline;

  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message", G_CALLBACK (cb_message), &data);

  g_main_loop_run (main_loop);

  /* Free resources */
  g_main_loop_unref (main_loop);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

> 
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-12.c -o basic-tutorial-12 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. In the console window, you should see a buffering message, and playback should only start when the buffering reaches 100%. This percentage might not change at all if your connection is fast enough and buffering is not required.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

The only special thing this tutorial does is react to certain messages;
therefore, the initialization code is very simple and should be
self-explanatory by now. The only new bit is the detection of live
streams:

``` c
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
  g_printerr ("Unable to set the pipeline to the playing state.\n");
  gst_object_unref (pipeline);
  return -1;
} else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
  data.is_live = TRUE;
}
```

Live streams cannot be paused, so they behave in the `PAUSED` state as
if they were in the `PLAYING` state. Setting live streams to `PAUSED`
succeeds, but returns `GST_STATE_CHANGE_NO_PREROLL` instead of
`GST_STATE_CHANGE_SUCCESS` to indicate that this is a live stream. We
receive the `NO_PREROLL` return code even though we are trying to set
the pipeline to `PLAYING`, because state changes happen progressively
(from `NULL` to `READY`, to `PAUSED` and then to `PLAYING`).

We care about live streams because we want to disable buffering for
them, so we take note of the result of `gst_element_set_state()` in the
`is_live` variable.

Let’s now review the interesting parts of the message parsing callback:

``` c
case GST_MESSAGE_BUFFERING: {
  gint percent = 0;

  /* If the stream is live, we do not care about buffering. */
  if (data->is_live) break;

  gst_message_parse_buffering (msg, &percent);
  g_print ("Buffering (%3d%%)\r", percent);
  /* Wait until buffering is complete before start/resume playing */
  if (percent < 100)
    gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
  else
    gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
  break;
}
```

First, if this is a live source, ignore buffering messages.

We parse the buffering message with `gst_message_parse_buffering()` to
retrieve the buffering level.

Then, we print the buffering level on the console and set the pipeline
to `PAUSED` if it is below 100%. Otherwise, we set the pipeline to
`PLAYING`.

At startup, we will see the buffering level rise up to 100% before
playback starts, which is what we wanted to achieve. If, later on, the
network becomes slow or unresponsive and our buffer depletes, we will
receive new buffering messages with levels below 100%, so we will pause
the pipeline again until enough of a buffer has been built up.

``` c
case GST_MESSAGE_CLOCK_LOST:
  /* Get a new clock */
  gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
  gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
  break;
```

For the second network issue, the loss of clock, we simply set the
pipeline to `PAUSED` and back to `PLAYING`, so a new clock is selected,
waiting for new media chunks to be received if necessary.

## Conclusion

This tutorial has described how to add network resilience to your
application with two very simple precautions:

- Taking care of buffering messages sent by the pipeline
- Taking care of clock loss

Handling these messages improves the application’s response to network
problems, increasing the overall playback smoothness.

It has been a pleasure having you here, and see you soon!
# Basic tutorial 4: Time management

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

This tutorial shows how to use GStreamer time-related facilities. In
particular:

- How to query the pipeline for information like stream position or
  duration.
- How to seek (jump) to a different position (time) inside the
  stream.

## Introduction

`GstQuery` is a mechanism that allows asking an element or pad for a
piece of information. In this example we ask the pipeline if seeking is
allowed (some sources, like live streams, do not allow seeking). If it
is allowed, then, once the movie has been running for ten seconds, we
skip to a different position using a seek.

In the previous tutorials, once we had the pipeline set up and running,
our main function just sat and waited to receive an `ERROR` or an `EOS`
through the bus. Here, we modify this function to periodically wake up
and query the pipeline for the stream position, so we can print it on
the screen. This is similar to what a media player would do, updating
the user interface on a periodic basis.

Finally, the stream duration is queried and updated whenever it changes.

## Seeking example

Copy this code into a text file named `basic-tutorial-4.c` (or find it
in your GStreamer installation).

**basic-tutorial-4.c**

``` c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin;   /* Our one and only element */
  gboolean playing;      /* Are we in the PLAYING state? */
  gboolean terminate;    /* Should we terminate execution? */
  gboolean seek_enabled; /* Is seeking enabled for this media? */
  gboolean seek_done;    /* Have we performed the seek already? */
  gint64 duration;       /* How long does this media last, in nanoseconds */
} CustomData;

/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;

  data.playing = FALSE;
  data.terminate = FALSE;
  data.seek_enabled = FALSE;
  data.seek_done = FALSE;
  data.duration = GST_CLOCK_TIME_NONE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

  /* Start playing */
  ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.playbin);
  do {
    msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION);

    /* Parse message */
    if (msg != NULL) {
      handle_message (&data, msg);
    } else {
      /* We got no message, this means the timeout expired */
      if (data.playing) {
        gint64 current = -1;

        /* Query the current position of the stream */
        if (!gst_element_query_position (data.playbin, GST_FORMAT_TIME, &current)) {
          g_printerr ("Could not query current position.\n");
        }

        /* If we didn't know it yet, query the stream duration */
        if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
          if (!gst_element_query_duration (data.playbin, GST_FORMAT_TIME, &data.duration)) {
            g_printerr ("Could not query current duration.\n");
          }
        }

        /* Print current position and total duration */
        g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
            GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));

        /* If seeking is enabled, we have not done it yet, and the time is right, seek */
        if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
          g_print ("\nReached 10s, performing seek...\n");
          gst_element_seek_simple (data.playbin, GST_FORMAT_TIME,
              GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
          data.seek_done = TRUE;
        }
      }
    }
  } while (!data.terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (data.playbin, GST_STATE_NULL);
  gst_object_unref (data.playbin);
  return 0;
}

static void handle_message (CustomData *data, GstMessage *msg) {
  GError *err;
  gchar *debug_info;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:
      gst_message_parse_error (msg, &err, &debug_info);
      g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
      g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
      g_clear_error (&err);
      g_free (debug_info);
      data->terminate = TRUE;
      break;
    case GST_MESSAGE_EOS:
      g_print ("\nEnd-Of-Stream reached.\n");
      data->terminate = TRUE;
      break;
    case GST_MESSAGE_DURATION:
      /* The duration has changed, mark the current one as invalid */
      data->duration = GST_CLOCK_TIME_NONE;
      break;
    case GST_MESSAGE_STATE_CHANGED: {
      GstState old_state, new_state, pending_state;
      gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
      if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
        g_print ("Pipeline state changed from %s to %s:\n",
            gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));

        /* Remember whether we are in the PLAYING state or not */
        data->playing = (new_state == GST_STATE_PLAYING);

        if (data->playing) {
          /* We just moved to PLAYING. Check if seeking is possible */
          GstQuery *query;
          gint64 start, end;
          query = gst_query_new_seeking (GST_FORMAT_TIME);
          if (gst_element_query (data->playbin, query)) {
            gst_query_parse_seeking (query, NULL, &data->seek_enabled, &start, &end);
            if (data->seek_enabled) {
              g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
                  GST_TIME_ARGS (start), GST_TIME_ARGS (end));
            } else {
              g_print ("Seeking is DISABLED for this stream.\n");
            }
          } else {
            g_printerr ("Seeking query failed.\n");
          }
          gst_query_unref (query);
        }
      }
    } break;
    default:
      /* We should not reach here */
      g_printerr ("Unexpected message received.\n");
      break;
  }
  gst_message_unref (msg);
}
```

> 
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-4.c -o basic-tutorial-4 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. 10 seconds into the movie, it skips to a new position.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

``` c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin;   /* Our one and only element */
  gboolean playing;      /* Are we in the PLAYING state? */
  gboolean terminate;    /* Should we terminate execution? */
  gboolean seek_enabled; /* Is seeking enabled for this media? */
  gboolean seek_done;    /* Have we performed the seek already? */
  gint64 duration;       /* How long does this media last, in nanoseconds */
} CustomData;

/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);
```

We start by defining a structure to contain all our information, so we
can pass it around to other functions. In particular, in this example we
move the message handling code to its own function,
`handle_message`, because it is growing a bit too big.

We then build a pipeline composed of a single element, a
`playbin`, which we already saw in [Basic tutorial 1: Hello
world!](tutorials/basic/hello-world.md). However,
`playbin` is in itself a pipeline, and in this case it is the only
element in the pipeline, so we use the `playbin` element directly. We
will skip the details: the URI of the clip is given to `playbin` via
the `uri` property and the pipeline is set to the playing state.

``` c
msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
    GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION);
```

Previously we did not provide a timeout to
`gst_bus_timed_pop_filtered()`, meaning that it didn't return until a
message was received. Now we use a timeout of 100 milliseconds, so, if
no message is received during one tenth of a second, the function
returns `NULL`. We are going to use this logic to update our “UI”.

Note that the desired timeout must be specified as a `GstClockTime`,
hence, in nanoseconds. Numbers expressing other time units should
therefore be multiplied by macros like `GST_SECOND` or `GST_MSECOND`,
which also makes your code more readable.

If we got a message, we process it in the `handle_message` function
(next subsection), otherwise:

### User interface refreshing

``` c
/* We got no message, this means the timeout expired */
if (data.playing) {
```

If the pipeline is in `PLAYING` state, it is time to refresh the screen.
We don't want to do anything if we are not in `PLAYING` state, because
most queries would fail.

We get here approximately 10 times per second, a good enough refresh
rate for our UI. We are going to print on screen the current media
position, which we can learn by querying the pipeline. This involves a
few steps that will be shown in the next subsection, but, since position
and duration are common enough queries, `GstElement` offers easier,
ready-made alternatives:

``` c
/* Query the current position of the stream */
if (!gst_element_query_position (data.playbin, GST_FORMAT_TIME, &current)) {
  g_printerr ("Could not query current position.\n");
}
```

`gst_element_query_position()` hides the management of the query object
and directly provides us with the result.

``` c
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
  if (!gst_element_query_duration (data.playbin, GST_FORMAT_TIME, &data.duration)) {
    g_printerr ("Could not query current duration.\n");
  }
}
```

Now is a good moment to learn the length of the stream, with
another `GstElement` helper function: `gst_element_query_duration()`.

``` c
/* Print current position and total duration */
g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
    GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));
```

Note the usage of the `GST_TIME_FORMAT` and `GST_TIME_ARGS` macros to
provide a user-friendly representation of GStreamer times.

``` c
/* If seeking is enabled, we have not done it yet, and the time is right, seek */
if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
  g_print ("\nReached 10s, performing seek...\n");
  gst_element_seek_simple (data.playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
  data.seek_done = TRUE;
}
```

Now we perform the seek, “simply” by
calling `gst_element_seek_simple()` on the pipeline. A lot of the
intricacies of seeking are hidden in this method, which is a good
thing!

Let's review the parameters:

`GST_FORMAT_TIME` indicates that we are specifying the destination in
time units. Other seek formats use different units.

Then come the `GstSeekFlags`; let's review the most common:

`GST_SEEK_FLAG_FLUSH`: This discards all data currently in the pipeline
before doing the seek. Playback might pause briefly while the pipeline
is refilled and the new data starts to show up, but this greatly
increases the “responsiveness” of the application. If this flag is not
provided, “stale” data might be shown for a while until the new
position appears at the end of the pipeline.

`GST_SEEK_FLAG_KEY_UNIT`: With most encoded video streams, seeking to
arbitrary positions is not possible, only to certain frames called key
frames. When this flag is used, the seek will actually move to the
closest key frame and start producing data straight away. If this flag
is not used, the pipeline will move internally to the closest key frame
(it has no other alternative) but data will not be shown until it
reaches the requested position. This last alternative is more accurate,
but might take longer.

`GST_SEEK_FLAG_ACCURATE`: Some media clips do not provide enough
indexing information, meaning that seeking to arbitrary positions is
time-consuming. In these cases, GStreamer usually estimates the position
to seek to, and usually works just fine. If this precision is not good
enough for your case (you see seeks not going to the exact time you
asked for), then provide this flag. Be warned that it might take longer
to calculate the seeking position (very long, on some files).

Finally, we provide the position to seek to. Since we asked
for `GST_FORMAT_TIME`, the value must be in nanoseconds, so we express
the time in seconds, for simplicity, and then multiply by `GST_SECOND`.
|
||||
|
||||
### Message Pump

The `handle_message` function processes all messages received through the pipeline's bus. `ERROR` and `EOS` handling is the same as in previous tutorials, so we skip to the interesting part:

``` c
    case GST_MESSAGE_DURATION:
      /* The duration has changed, mark the current one as invalid */
      data->duration = GST_CLOCK_TIME_NONE;
      break;
```

This message is posted on the bus whenever the duration of the stream changes. Here we simply mark the current duration as invalid, so it gets re-queried later.

``` c
    case GST_MESSAGE_STATE_CHANGED: {
      GstState old_state, new_state, pending_state;
      gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
      if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->pipeline)) {
        g_print ("Pipeline state changed from %s to %s:\n",
            gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));

        /* Remember whether we are in the PLAYING state or not */
        data->playing = (new_state == GST_STATE_PLAYING);
```

Seeks and time queries generally only get a valid reply when in the `PAUSED` or `PLAYING` state, since all elements have had a chance to receive information and configure themselves. Here, we use the `playing` variable to keep track of whether the pipeline is in the `PLAYING` state. Also, if we have just entered the `PLAYING` state, we do our first query. We ask the pipeline if seeking is allowed on this stream:

``` c
        if (data->playing) {
          /* We just moved to PLAYING. Check if seeking is possible */
          GstQuery *query;
          gint64 start, end;
          query = gst_query_new_seeking (GST_FORMAT_TIME);
          if (gst_element_query (data->pipeline, query)) {
            gst_query_parse_seeking (query, NULL, &data->seek_enabled, &start, &end);
            if (data->seek_enabled) {
              g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
                  GST_TIME_ARGS (start), GST_TIME_ARGS (end));
            } else {
              g_print ("Seeking is DISABLED for this stream.\n");
            }
          } else {
            g_printerr ("Seeking query failed.\n");
          }
          gst_query_unref (query);
        }
```

`gst_query_new_seeking()` creates a new query object of the "seeking" type, with `GST_FORMAT_TIME` format. This indicates that we are interested in seeking by specifying the new time to which we want to move. We could also ask for `GST_FORMAT_BYTES`, and then seek to a particular byte position inside the source file, but this is normally less useful.

This query object is then passed to the pipeline with `gst_element_query()`. The result is stored in the same query, and can be easily retrieved with `gst_query_parse_seeking()`. It extracts a boolean indicating whether seeking is allowed, and the range in which seeking is possible.

Don't forget to unref the query object when you are done with it.

And that's it! With this knowledge you can build a media player that periodically updates a slider based on the current stream position and allows seeking by moving the slider!

## Conclusion

This tutorial has shown:

- How to query the pipeline for information using `GstQuery`

- How to obtain common information like position and duration using `gst_element_query_position()` and `gst_element_query_duration()`

- How to seek to an arbitrary position in the stream using `gst_element_seek_simple()`

- In which states all these operations can be performed.

The next tutorial shows how to integrate GStreamer with a Graphical User Interface toolkit.

Remember that attached to this page you should find the complete source code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

# Basic tutorial 5: GUI toolkit integration

{{ ALERT_PY.md }}

{{ ALERT_JS.md }}

## Goal

This tutorial shows how to integrate GStreamer in a Graphical User Interface (GUI) toolkit like [GTK+](http://www.gtk.org). Basically, GStreamer takes care of media playback while the GUI toolkit handles user interaction. The most interesting parts are those in which both libraries have to interact: instructing GStreamer to output video to a GTK+ window and forwarding user actions to GStreamer.

In particular, you will learn:

- How to tell GStreamer to output video to a particular window (instead of creating its own window).

- How to continuously refresh the GUI with information from GStreamer.

- How to update the GUI from the multiple threads of GStreamer, an operation forbidden in most GUI toolkits.

- A mechanism to subscribe only to the messages you are interested in, instead of being notified of all of them.

## Introduction

We are going to build a media player using the [GTK+](http://www.gtk.org/) toolkit, but the concepts apply to other toolkits like [Qt](http://qt-project.org/). A minimal knowledge of [GTK+](http://www.gtk.org/) will help in understanding this tutorial.

The main point is telling GStreamer to output the video to a window of our choice. The specific mechanism depends on the operating system (or rather, on the windowing system), but GStreamer provides a layer of abstraction for the sake of platform independence. This independence comes through the `GstVideoOverlay` interface, which allows the application to tell a video sink the handle of the window that should receive the rendering.

> 
> **GObject interfaces**
>
> A `GObject` *interface* (which GStreamer uses) is a set of functions that an element can implement. If it does, it is said to support that particular interface. For example, video sinks usually create their own windows to display video, but, if they are also capable of rendering to an external window, they can choose to implement the `GstVideoOverlay` interface and provide functions to specify this external window. From the application developer's point of view, if a certain interface is supported, you can use it and forget about which kind of element is implementing it. Moreover, if you are using `playbin`, it automatically exposes some of the interfaces supported by its internal elements: you can use your interface functions directly on `playbin` without knowing who implements them!

Another issue is that GUI toolkits usually only allow manipulation of the graphical “widgets” through the main (or application) thread, whereas GStreamer usually spawns multiple threads to take care of different tasks. Calling [GTK+](http://www.gtk.org/) functions from within callbacks will usually fail, because callbacks execute in the calling thread, which is not necessarily the main thread. This problem can be solved by posting a message on the GStreamer bus from the callback: the messages will be received by the main thread, which will then react accordingly.

Finally, so far we have registered a `handle_message` function that got called every time a message appeared on the bus, which forced us to parse every message to see if it was of interest to us. In this tutorial a different method is used, which registers a callback for each kind of message, so there is less parsing and less code overall.

## A media player in GTK+

Let's write a very simple media player based on `playbin`, this time with a GUI!

Copy this code into a text file named `basic-tutorial-5.c` (or find it in your GStreamer installation).

**basic-tutorial-5.c**

``` c
#include <string.h>

#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin;            /* Our one and only pipeline */

  GtkWidget *slider;              /* Slider widget to keep track of current position */
  GtkWidget *streams_list;        /* Text widget to display info about the streams */
  gulong slider_update_signal_id; /* Signal ID for the slider update signal */

  GstState state;                 /* Current state of the pipeline */
  gint64 duration;                /* Duration of the clip, in nanoseconds */
} CustomData;

/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstVideoOverlay!");

  /* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}

/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
  stop_cb (NULL, data);
  gtk_main_quit ();
}

/* This function is called everytime the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
  if (data->state < GST_STATE_PAUSED) {
    GtkAllocation allocation;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
  }

  return FALSE;
}

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}

/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
  GtkWidget *main_window;  /* The uppermost window, containing all other windows */
  GtkWidget *video_window; /* The drawing area where the video will be shown */
  GtkWidget *main_box;     /* VBox to hold main_hbox and the controls */
  GtkWidget *main_hbox;    /* HBox to hold the video_window and the stream info text widget */
  GtkWidget *controls;     /* HBox to hold the buttons and the slider */
  GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */

  main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);

  video_window = gtk_drawing_area_new ();
  gtk_widget_set_double_buffered (video_window, FALSE);
  g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
  g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);

  play_button = gtk_button_new_from_icon_name ("media-playback-start", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);

  pause_button = gtk_button_new_from_icon_name ("media-playback-pause", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);

  stop_button = gtk_button_new_from_icon_name ("media-playback-stop", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);

  data->slider = gtk_scale_new_with_range (GTK_ORIENTATION_HORIZONTAL, 0, 100, 1);
  gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
  data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);

  data->streams_list = gtk_text_view_new ();
  gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);

  controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);

  main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);

  main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
  gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
  gtk_container_add (GTK_CONTAINER (main_window), main_box);
  gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);

  gtk_widget_show_all (main_window);
}

/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
    }
  }

  if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}

/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
      gst_message_new_application (GST_OBJECT (playbin),
          gst_structure_new_empty ("tags-changed")));
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  /* Set the pipeline to READY (which stops playback) */
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when an End-Of-Stream message is posted on the bus.
 * We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  g_print ("End-Of-Stream reached.\n");
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
    data->state = new_state;
    g_print ("State set to %s\n", gst_element_state_get_name (new_state));
    if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
      /* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
      refresh_ui (data);
    }
  }
}

/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str, *total_str;
  guint rate;
  gint n_video, n_audio, n_text;
  GtkTextBuffer *text;

  /* Clean current contents of the widget */
  text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
  gtk_text_buffer_set_text (text, "", -1);

  /* Read some properties */
  g_object_get (data->playbin, "n-video", &n_video, NULL);
  g_object_get (data->playbin, "n-audio", &n_audio, NULL);
  g_object_get (data->playbin, "n-text", &n_text, NULL);

  for (i = 0; i < n_video; i++) {
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("video stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      total_str = g_strdup_printf ("  codec: %s\n", str ? str : "unknown");
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_audio; i++) {
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\naudio stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        total_str = g_strdup_printf ("  codec: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        total_str = g_strdup_printf ("  bitrate: %d\n", rate);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
      }
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_text; i++) {
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

  /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);

  /* Create the GUI */
  create_ui (&data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);

  /* Start playing */
  ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin);
    return -1;
  }

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

  /* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
  gtk_main ();

  /* Free resources */
  gst_element_set_state (data.playbin, GST_STATE_NULL);
  gst_object_unref (data.playbin);
  return 0;
}
```

> 
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-5.c -o basic-tutorial-5 `pkg-config --cflags --libs gstreamer-video-1.0 gtk+-3.0 gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a GTK+ window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The window has some GTK+ buttons to Pause, Stop and Play the movie, and a slider to show the current position of the stream, which can be dragged to change it. Also, information about the stream is shown in a column at the right edge of the window.
>
> Bear in mind that there is no latency management (buffering), so on slow connections the movie might stop after a few seconds. See how [](tutorials/basic/streaming.md) solves this issue.
>
> Required libraries: `gstreamer-video-1.0 gtk+-3.0 gstreamer-1.0`

## Walkthrough

Regarding this tutorial's structure, we are not going to use forward function definitions anymore: functions will be defined before they are used. Also, for clarity of explanation, the order in which the snippets of code are presented will not always match the program order. Use the line numbers to locate the snippets in the complete code.

``` c
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif
```

The first thing worth noticing is that we are no longer completely platform-independent. We need to include the appropriate GDK headers for the windowing system we are going to use. Fortunately, there are not that many supported windowing systems, so these three lines often suffice: X11 for Linux, Win32 for Windows and Quartz for Mac OS X.

This tutorial is composed mostly of callback functions, which will be called from GStreamer or GTK+, so let's review the `main` function, which registers all these callbacks.

``` c
int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
```

Standard GStreamer initialization and playbin pipeline creation, along with GTK+ initialization. Not much new.

``` c
  /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
```

We are interested in being notified when new tags (metadata) appear on the stream. For simplicity, we are going to handle all kinds of tags (video, audio and text) from the same callback, `tags_cb`.

``` c
  /* Create the GUI */
  create_ui (&data);
```

All GTK+ widget creation and signal registration happens in this function. It contains only GTK-related function calls, so we will skip over its definition. The signals to which it registers convey user commands, as shown below when reviewing the callbacks.

``` c
  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);
```

In [](tutorials/playback/playbin-usage.md), `gst_bus_add_watch()` is used to register a function that receives every message posted to the GStreamer bus. We can achieve a finer granularity by using signals instead, which allow us to register only for the messages we are interested in. By calling `gst_bus_add_signal_watch()` we instruct the bus to emit a signal every time it receives a message. This signal has the name `message::detail`, where *`detail`* is the kind of message that triggered the signal emission. For example, when the bus receives the EOS message, it emits a signal with the name `message::eos`.

This tutorial uses signal details to register only for the messages we care about. If we had registered for the plain `message` signal, we would be notified of every single message, just like `gst_bus_add_watch()` would do.

Keep in mind that, in order for the bus watches to work (be it a
|
||||
`gst_bus_add_watch()` or a `gst_bus_add_signal_watch()`), there must be
|
||||
GLib `Main Loop` running. In this case, it is hidden inside the
|
||||
[GTK+](http://www.gtk.org/) main loop.
|
||||
|
||||
``` c
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```

Before transferring control to GTK+, we use `g_timeout_add_seconds()`
to register yet another callback, this time with a timeout, so it
gets called every second. We are going to use it to refresh the GUI from
the `refresh_ui` function.

After this, we are done with the setup and can start the GTK+ main loop.
We will regain control from our callbacks when interesting things
happen. Let's review the callbacks. Each callback has a different
signature, depending on who will call it. You can look up the signature
(the meaning of the parameters and the return value) in the
documentation of the signal.

``` c
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handle (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstVideoOverlay!");

  /* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif

  /* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
```

The code comments speak for themselves. At this point in the life cycle of
the application, we know the handle (be it an X11 `XID`, a Windows
`HWND` or a Quartz `NSView`) of the window where GStreamer should
render the video. We simply retrieve it from the windowing system and
pass it to `playbin` through the `GstVideoOverlay` interface using
`gst_video_overlay_set_window_handle()`. `playbin` will locate the video
sink and pass the handle to it, so it does not create its own window
and uses this one instead.

Not much more to see here; `playbin` and `GstVideoOverlay` really simplify
this process a lot!

``` c
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin, GST_STATE_READY);
}
```

These three little callbacks are associated with the PLAY, PAUSE and
STOP buttons in the GUI. They simply set the pipeline to the
corresponding state. Note that in the STOP case we set the pipeline to
`READY`. We could have brought the pipeline all the way down to the
`NULL` state, but the transition would then be a little slower, since some
resources (like the audio device) would need to be released and
re-acquired.

``` c
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
  stop_cb (NULL, data);
  gtk_main_quit ();
}
```

`gtk_main_quit()` will eventually cause the call to `gtk_main()`
in `main` to terminate, which, in this case, finishes the program. Here,
we call it when the main window is closed, after stopping the pipeline
(just for the sake of tidiness).

``` c
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states. In the other states,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
  if (data->state < GST_STATE_PAUSED) {
    GtkAllocation allocation;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
  }

  return FALSE;
}
```

When there is data flow (in the `PAUSED` and `PLAYING` states) the video
sink takes care of refreshing the content of the video window. In the
other cases, however, it will not, so we have to do it ourselves. In this
example, we just fill the window with a black rectangle.

``` c
/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}
```

This is an example of how a complex GUI element like a seeker bar (or
slider that allows seeking) can be very easily implemented thanks to
GStreamer and GTK+ working together. If the slider has been dragged to a
new position, we tell GStreamer to seek to that position
with `gst_element_seek_simple()` (as seen in [Basic tutorial 4: Time
management](tutorials/basic/time-management.md)). The
slider has been set up so its value represents seconds.

It is worth mentioning that some performance (and responsiveness) can be
gained by doing some throttling, that is, not responding to every single
user request to seek. Since the seek operation is bound to take some
time, it is often nicer to wait half a second (for example) after a seek
before allowing another one. Otherwise, the application might look
unresponsive if the user drags the slider frantically, which would not
allow any seek to complete before a new one is queued.

``` c
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;
```

This function will move the slider to reflect the current position of
the media. First off, if we are not in the `PAUSED` or `PLAYING` state,
we have nothing to do here (plus, position and duration queries will
normally fail).

``` c
  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
    }
  }
```

We recover the duration of the clip if we didn't know it, so we can set
the range for the slider.

``` c
  if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
```

We query the current pipeline position and set the position of the
slider accordingly. This would normally trigger the emission of the
`value-changed` signal, which we use to know when the user is dragging
the slider. Since we do not want seeks happening unless the user
requested them, we disable the `value-changed` signal emission during
this operation with `g_signal_handler_block()` and
`g_signal_handler_unblock()`.

Returning `TRUE` from this function keeps it getting called in the
future. If we return `FALSE`, the timer will be removed.

``` c
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
      gst_message_new_application (GST_OBJECT (playbin),
          gst_structure_new_empty ("tags-changed")));
}
```

This is one of the key points of this tutorial. This function will be
called when new tags are found in the media, **from a streaming
thread**, that is, from a thread other than the application (or main)
thread. What we want to do here is update a GTK+ widget to reflect
this new information, but **GTK+ does not allow operating from threads
other than the main one**.

The solution is to make `playbin` post a message on the bus and return
to the calling thread. When appropriate, the main thread will pick up
this message and update GTK+.

`gst_element_post_message()` makes a GStreamer element post the given
message to the bus. `gst_message_new_application()` creates a new
message of the `APPLICATION` type. GStreamer messages have different
types, and this particular type is reserved to the application: it will
go through the bus unaffected by GStreamer. The list of types can be
found in the `GstMessageType` documentation.

Messages can deliver additional information through their embedded
`GstStructure`, which is a very flexible data container. Here, we create
a new empty structure with `gst_structure_new_empty()` and name it
`tags-changed`, to avoid confusion in case we wanted to send other
application messages.

Later, once in the main thread, the bus will receive this message and
emit the `message::application` signal, which we have associated with the
`application_cb` function:

``` c
/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}
```

Once we have made sure it is the `tags-changed` message, we call the
`analyze_streams` function, which is also used in [](tutorials/playback/playbin-usage.md) and is
explained in more detail there. It basically recovers the tags from the
stream and writes them in a text widget in the GUI.

The `error_cb`, `eos_cb` and `state_changed_cb` callbacks are not really
worth explaining, since they do the same as in all previous tutorials,
but from their own functions now.

||||
And this is it! The amount of code in this tutorial might seem daunting,
but the required concepts are few and simple. If you have followed the
previous tutorials and have a little knowledge of GTK+, you probably
understood this one and can now enjoy your very own media player!



## Exercise

If this media player is not good enough for you, try to change the text
widget that displays the information about the streams into a proper
list view (or tree view). Then, when the user selects a different
stream, make GStreamer switch streams! To switch streams, you will need
to read [](tutorials/playback/playbin-usage.md).

## Conclusion

This tutorial has shown:

- How to output the video to a particular window handle
  using `gst_video_overlay_set_window_handle()`.

- How to refresh the GUI periodically by registering a timeout
  callback with `g_timeout_add_seconds()`.

- How to convey information to the main thread by means of application
  messages through the bus with `gst_element_post_message()`.

- How to be notified only of interesting messages by making the bus
  emit signals with `gst_bus_add_signal_watch()` and discriminating
  among all message types using the signal details.

This allows you to build a somewhat complete media player with a proper
Graphical User Interface.

The following basic tutorials keep focusing on other individual
GStreamer topics.

It has been a pleasure having you here, and see you soon!