Sunday 30 January 2011

Analyse Xingmux Plugin

xingmux adds a Xing header to MP3 files. The header contains information about the duration and size
of the file, plus a seek table, and is very useful for getting an almost correct duration and better
seeking on VBR MP3 files.

This element will remove any existing Xing, LAME or VBRI headers from the beginning of the file.

Pad templates and the element details are registered with the plugin during gst_xing_mux_base_init().

GStreamer uses a type system to ensure that the data passed between elements is in a recognized format

Pad templates are registered in the gst_xing_mux_base_init () function. Pads are created from these
templates in the gst_xing_mux_init () function using gst_pad_new_from_template (). The
template can be retrieved from the element class using gst_element_class_get_pad_template ().

Methods 1 and 2 below do the same thing, so what is the difference? Method 1 looks up the pad
template that was registered with the element class, while method 2 builds the pad directly from
the static template with gst_pad_new_from_static_template (); both end up creating equivalent pads.


1.
GstElementClass *klass = GST_ELEMENT_CLASS (xingmux_class);
gst_pad_new_from_template (gst_element_class_get_pad_template (klass,"sink"), "sink");
gst_pad_new_from_template (gst_element_class_get_pad_template (klass,"src"), "src");

2.
gst_pad_new_from_static_template (&gst_xing_mux_sink_template, "sink");
gst_pad_new_from_static_template (&gst_xing_mux_src_template, "src");

xingmux has a sink pad and a src pad; it supports MPEG version 1 audio, layers 1 to 3,
with any number of channels at any sample rate.


static GstStaticPadTemplate gst_xing_mux_sink_template =
GST_STATIC_PAD_TEMPLATE ("sink",
    GST_PAD_SINK,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("audio/mpeg, "
        "mpegversion = (int) 1, " "layer = (int) [ 1, 3 ]"));

The class is initialised only once, in gst_xing_mux_class_init()
(specifying what signals, arguments and virtual functions the class has, and setting up global state).

The primary and most important way of controlling how an element behaves, is through GObject
properties. GObject properties are defined in the _class_init () function.

gst_xing_mux_init()


  • initialize the new element
  • instantiate pads and add them to element
  • set pad callback functions
  • initialize instance structure

    gst_pad_set_setcaps_function () is called during caps negotiation. Caps negotiation is the
    process where the linked pads decide on the stream type that will be transferred between them;
    in other words, the elements configure themselves and each other for streaming a particular
    media format over their pads. Here it is set up with:

    gst_pad_set_setcaps_function (xing->sinkpad, GST_DEBUG_FUNCPTR (gst_pad_proxy_setcaps));

    If at all possible, your element should derive from one of the new base classes (Pre-made base classes).
    If you use a base class, you will rarely have to handle state changes yourself. All you have to do is
    override the base class’s start() and stop() virtual functions (might be called differently depending on the
    base class) and the base class will take care of everything for you.

    Do not g_assert for unhandled state changes; this is taken care of by the GstElement base class.

Test pipeline: gst-launch-0.10 filesrc location=test.mp3 ! xingmux ! filesink location=test2.mp3

Additional: the function plugin_init() registers the element when the plugin is loaded; it is in the source file plugin.c, in the same folder gst-plugins-ugly/gst/mpegaudioparse.

MeeGo: playing rmvb, avi, mp4 media files

My notebook is an Asus EeePC 1001px running MeeGo v1.1.
If you cannot play Big_Buck_Bunny.ogv:

Check whether your notebook's hardware graphics acceleration is enabled:
glxinfo | grep "renderer string"
My EeePC's output is: "OpenGL renderer string: Mesa DRI Intel(R) IGD GEM 20100330 DEVELOPMENT x86/MMX/SSE2".
If your output is "OpenGL renderer string: Software Rasterizer", acceleration is not fully enabled; your graphics driver might not be capable. That may be why your notebook cannot play video.

For mp3, mp4, avi, rmvb and so on in MeeGo, there are no non-free codec GStreamer packages on the MeeGo official site. You can use this command to see the GStreamer-related packages:
zypper se gst
On my notebook there is only gst-plugins-bad-free, no gst-plugins-bad. We downloaded the GStreamer source from the GStreamer official website, built it, and installed it on our EeePC. After that, the non-free formats played quite smoothly.

GStreamer Time & Sync

GstClock


First, some introductory terms:

absolute_time: the current time obtained from the GstClock (a monotonically increasing time)

running_time: the total time spent in the PLAYING state

base_time: defined as the absolute_time minus the running_time at the moment the pipeline is set to PLAYING

stream_time: the position in the stream (between 0 and the total duration)

GStreamer can use different clocks:


  • system time (via g_get_current_time (), with microsecond accuracy)
  • soundcard and other device clocks (better; some elements can provide one via get_clock)
  • a network source based on packets received + timestamps in those packets (a typical example is an RTP source)

Time is always expressed in nanoseconds, so conversions between the different time representations are needed.
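Since times arrive in different units (seconds from time(), microseconds from g_get_current_time() or gettimeofday()), they have to be scaled to nanoseconds. A minimal sketch of that conversion (the helper name is mine, not a GStreamer API):

```c
#include <stdint.h>

#define NSEC_PER_SEC  1000000000ULL
#define NSEC_PER_USEC 1000ULL

/* Convert a seconds + microseconds pair, as returned by gettimeofday(),
 * into a single nanosecond value (the unit GStreamer uses internally). */
static uint64_t
to_nanoseconds (uint64_t sec, uint64_t usec)
{
  return sec * NSEC_PER_SEC + usec * NSEC_PER_USEC;
}
```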

pipeline state and running_time:

  • NULL/READY, the running_time is undefined
  • PAUSED, the running_time remains at the time when it was last PAUSED
  • PLAYING, the running_time is the delta between the absolute_time and the base_time
  • after a flushing seek, the running_time is set to 0 (implemented by distributing a new base_time)

    running_time = absolute_time - base_time (in the PLAYING state)

GstBuffer

The GstBuffer timestamps and the preceding NEWSEGMENT event define a transformation of the buffer timestamps to running_time

B: GstBuffer

  • B.timestamp = buffer timestamp (GST_BUFFER_TIMESTAMP)

NS: NEWSEGMENT event preceding the buffers

  • NS.start: start field in the NEWSEGMENT event
  • NS.stop: stop field in the NEWSEGMENT event
  • NS.rate: rate field of NEWSEGMENT event
  • NS.abs_rate: absolute value of the rate field of the NEWSEGMENT event (by default a pipeline plays from position 0 to
    the total duration of the media at a rate of 1.0)
  • NS.time: time field in the NEWSEGMENT event
  • NS.accum: total accumulated time of all previous NEWSEGMENT events. This field is kept in the GstSegment structure


    if (NS.rate > 0.0)
        B.running_time = (B.timestamp - NS.start) / NS.abs_rate + NS.accum
    else
        B.running_time = (NS.stop - B.timestamp) / NS.abs_rate + NS.accum
    

  • And we also got:

    stream_time = (B.timestamp - NS.start) * NS.abs_applied_rate + NS.time

  • This formula is typically used in sinks to report the current position in
    an accurate and efficient way:

    stream_time = (absolute_time - base_time - NS.accum) * NS.abs_rate * NS.abs_applied_rate + NS.time
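The running_time and stream_time formulas above can be written out as small C helpers. This is an illustrative sketch only: the NewSegment struct below is a hypothetical stand-in for the NEWSEGMENT event fields, not the real GstSegment API, and all times are in nanoseconds.

```c
#include <stdint.h>

/* Hypothetical stand-in for the NEWSEGMENT fields used in the formulas. */
typedef struct {
  double   rate;             /* NS.rate */
  double   abs_rate;         /* NS.abs_rate */
  double   abs_applied_rate; /* NS.abs_applied_rate */
  uint64_t start;            /* NS.start */
  uint64_t stop;             /* NS.stop */
  uint64_t time;             /* NS.time */
  uint64_t accum;            /* NS.accum */
} NewSegment;

/* B.running_time from a buffer timestamp and the preceding NEWSEGMENT. */
static uint64_t
buffer_running_time (const NewSegment *ns, uint64_t timestamp)
{
  if (ns->rate > 0.0)
    return (uint64_t) ((timestamp - ns->start) / ns->abs_rate) + ns->accum;
  else
    return (uint64_t) ((ns->stop - timestamp) / ns->abs_rate) + ns->accum;
}

/* stream_time = (B.timestamp - NS.start) * NS.abs_applied_rate + NS.time */
static uint64_t
buffer_stream_time (const NewSegment *ns, uint64_t timestamp)
{
  return (uint64_t) ((timestamp - ns->start) * ns->abs_applied_rate) + ns->time;
}
```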

Synchronisation

There are two ways to get the running_time:

  • using the clock and the element's base_time with:

    C.running_time = absolute_time - base_time

  • using the buffer timestamp and the preceding NEWSEGMENT event (assuming a positive playback rate):

    B.running_time = (B.timestamp - NS.start) / NS.abs_rate + NS.accum

  • For synchronisation the following must hold:

    B.running_time = C.running_time

  • expanding:

    B.running_time = absolute_time - base_time

  • or:

    absolute_time = B.running_time + base_time

  • The absolute_time when a buffer with B.running_time should be played is noted with B.sync_time. Thus:

    B.sync_time = B.running_time + base_time

Friday 21 January 2011

Specifying the language for ctags

Today, while using ctags to look up a function, it actually jumped into an HTML file.
So I read the man page and found:

By  default,  ctags automatically selects the language of a source file, ignoring those files whose language cannot be determined

Use this command to see the supported languages:
ctags --list-languages
Ant
Asm
Asp
Awk
Basic
BETA
C
C++
C#
Cobol
DosBatch
Eiffel
Erlang
Flex
Fortran
HTML
Java
JavaScript
Lisp
Lua
Make
MatLab
OCaml
Pascal
Perl
PHP
Python
REXX
Ruby
Scheme
Sh
SLang
SML
SQL
Tcl
Tex
Vera
Verilog
VHDL
Vim
YACC

That's a lot of languages; apparently ctags was treating both the C files and the HTML files as source code.

[The  reason  that  .h  extensions  are  mapped  to C++ files rather than C files is because it is common to use .h extensions in C++, and no harm
 results in treating them as C++ files.]
From the above, .h files are classified as C++.

So I regenerated the tags file with the following command:
ctags -R --langmap=C:.c.h --languages=C *

Problem solved; reading the man page is the way to go.

Sunday 9 January 2011

gstreamer dependencies for plugin mad

configure:24899: *** checking feature: mad mp3 decoder ***
configure:24903: *** for plug-ins: mad ***
configure:24946: checking for ID3TAG
configure:24954: $PKG_CONFIG --exists --print-errors "$which"
Package id3tag was not found in the pkg-config search path.
Perhaps you should add the directory containing `id3tag.pc'
to the PKG_CONFIG_PATH environment variable
No package 'id3tag' found
configure:24957: $? = 1
configure:24972: $PKG_CONFIG --exists --print-errors "$which"
Package id3tag was not found in the pkg-config search path.
Perhaps you should add the directory containing `id3tag.pc'
to the PKG_CONFIG_PATH environment variable
No package 'id3tag' found
configure:24975: $? = 1
No package 'id3tag' found
configure:25003: result: no
configure:25010: No package 'id3tag' found
configure:25036: checking id3tag.h usability
configure:25036: gcc -std=gnu99 -c -g -O2 conftest.c >&5
conftest.c:83: fatal error: id3tag.h: No such file or directory
compilation terminated.

sudo apt-get install libid3tag0-dev

configure:24899: *** checking feature: mad mp3 decoder ***
configure:24903: *** for plug-ins: mad ***
configure:24946: checking for ID3TAG
configure:24954: $PKG_CONFIG --exists --print-errors "$which"
configure:24957: $? = 0
configure:24972: $PKG_CONFIG --exists --print-errors "$which"
configure:24975: $? = 0
configure:25027: result: yes
configure:25096: checking for MAD
configure:25104: $PKG_CONFIG --exists --print-errors "$which"
Package mad was not found in the pkg-config search path.
Perhaps you should add the directory containing `mad.pc'
to the PKG_CONFIG_PATH environment variable
No package 'mad' found
configure:25107: $? = 1
configure:25122: $PKG_CONFIG --exists --print-errors "$which"
Package mad was not found in the pkg-config search path.
Perhaps you should add the directory containing `mad.pc'
to the PKG_CONFIG_PATH environment variable
No package 'mad' found
configure:25125: $? = 1
No package 'mad' found
configure:25153: result: no
configure:25160: No package 'mad' found
configure:25187: checking mad.h usability
configure:25187: gcc -std=gnu99 -c -g -O2 conftest.c >&5
conftest.c:83: fatal error: mad.h: No such file or directory
compilation terminated.

sudo apt-get install libmad0-dev


So after these two commands, you can build the mad plugin:
sudo apt-get install libid3tag0-dev
sudo apt-get install libmad0-dev

Wednesday 5 January 2011

Time in Linux programming

The kernel measures the passage of time in three different ways:

  1. Wall time ---> measuring absolute time
  2. Process time
  3. Monotonic time ---> calculating relative time

    Unix systems represent absolute time as the number of elapsed seconds since the epoch, which is defined as 00:00:00 UTC on the morning of 1 January 1970.
    So, even absolute time is, at a low level, relative time.

    The following table shows some useful functions for working with time:

Name                              Second     Microsecond       Nanosecond
Getting the current time of day   time()     gettimeofday()    clock_gettime()
Setting the current time of day   stime()    settimeofday()    clock_settime()
Sleeping                          sleep()    usleep()          nanosleep()

Now Microsoft and the iPod nano come to mind; did those names come from here (micro, nano)? Interesting.

With a 32-bit long type, time_t will let us have the Y2K mess all over again --- in 2038!

And "Come 22:14:07 on Monday, 18 January 2038, most systems and software will be 64-bit," said Robert Love. Will that happen? Let's wait and see.
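That date is easy to check: a signed 32-bit time_t tops out at 2^31 - 1 seconds past the epoch, and feeding that value to gmtime() gives the last representable second in UTC (Love's 22:14:07 on 18 January is the same instant expressed in US Eastern time):

```c
#include <time.h>

/* Returns the last second a signed 32-bit time_t can represent,
 * broken down in UTC: 2038-01-19 03:14:07. */
static struct tm
last_32bit_second (void)
{
  time_t t = (time_t) 2147483647;  /* 2^31 - 1 seconds since the epoch */
  return *gmtime (&t);
}
```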

POSIX defines clocks for four of the standard Linux time sources:

  • CLOCK_MONOTONIC
  • CLOCK_PROCESS_CPUTIME_ID
  • CLOCK_REALTIME
  • CLOCK_THREAD_CPUTIME_ID

Tuning the system clock

An example: make and its use of timestamps

make looks at the file modification timestamps of the source file versus the object file. If the source file is newer than the object file, make rebuilds the source file into an updated object file. If the source file is not newer than the object, however, no action is taken.
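That check can be sketched with stat(): compare the st_mtime of the two files (needs_rebuild is a made-up helper name, not make's actual implementation):

```c
#include <sys/stat.h>

/* Mimic make's rule: rebuild if the object is missing or the source
 * is strictly newer than the object. */
static int
needs_rebuild (const char *source, const char *object)
{
  struct stat src, obj;

  if (stat (source, &src) != 0)
    return 0;                  /* no source file: nothing to rebuild from */
  if (stat (object, &obj) != 0)
    return 1;                  /* object missing: must build */
  return src.st_mtime > obj.st_mtime;
}
```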

Tuesday 4 January 2011

pulseaudio module-detect analysis

source code path:
src/modules/module-detect.c

1. Run the function "int pa__init(pa_module*m)".
"pa_bool_t just_one = FALSE;" allows more than one device.

The function pa_modargs_new() is in "pulsecore/modargs.c".
"modargs.c" parses the input arguments, reads each argument's value (as u32, bool, s32, ...) and stores them in a pa_hashmap list.

ALSA, OSS and Solaris each have different device nodes, so they have different code paths.
For WIN32 it does nothing except load the module "module-waveout".

Here, only the ALSA path is analysed.

#ifdef HAVE_ALSA --> used for ALSA
2. The "detect_alsa" function:

Reads "/proc/asound/devices" to get the device info
("/dev/sndstat", "/proc/sndstat" and "/proc/asound/oss/sndstat" are the three files used for OSS).
On my PC, the file contains:

------------>
2: : timer
3: : sequencer
4: [ 0- 1]: digital audio playback
5: [ 0- 0]: digital audio playback
6: [ 0- 0]: digital audio capture
7: [ 0- 2]: hardware dependent
8: [ 0] : control
<------------

The file above tells us that my PC has one control channel, two PCM playback devices (DACs), a PCM capture device (ADC), a hardware-dependent device, a MIDI sequencer, and a timer.

The call "sscanf(line, " %*i: [%u- %u]: ", &device, &subdevice)" is quite interesting.
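It packs the whole line parse into one format string: %*i reads and discards the leading index, and the two %u conversions pull the bracketed numbers out. A standalone sketch (parse_device_line is my name for it, not pulseaudio's):

```c
#include <stdio.h>

/* Apply module-detect's format string to one line of
 * /proc/asound/devices; returns 1 when both numbers were extracted. */
static int
parse_device_line (const char *line, unsigned *device, unsigned *subdevice)
{
  return sscanf (line, " %*i: [%u- %u]: ", device, subdevice) == 2;
}
```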

3. Uses the function pa_module_load() to load the two modules "module-alsa-sink" and "module-alsa-source".

Finally, "module-detect" unloads itself.

Monday 3 January 2011

GStreamer basic knowledge, 2nd pass

This is my second time reading the GStreamer Application Development Manual.

Record something important


  • Element:

    An element is an object that can send and/or receive data; it is also the most important class of objects in GStreamer.
Type                               Details                                   Pads
Source elements                    do not accept data, only generate data    one source pad
Filters and filter-like elements   operate on the data                       at least one input and one output pad
Sink elements                      accept data and do not produce anything   one sink pad
  • Create an element(two methods)
    1. gst_element_factory_make
    2. gst_element_factory_find + gst_element_factory_create

    • gst_init has to be called first
    • When you don’t need the element anymore, you need to unref it using gst_object_unref ()

      The following example is the simplest way to create an element using gst_element_factory_make ():

      #include <gst/gst.h>

      int main (int argc, char *argv[])
      {
        GstElement *element;

        /* init GStreamer */
        gst_init (&argc, &argv);

        /* create element */
        element = gst_element_factory_make ("fakesrc", "source");
        if (!element) {
          g_print ("Failed to create element of type 'fakesrc'\n");
          return -1;
        }

        gst_object_unref (GST_OBJECT (element));
        return 0;
      }

  • Pad:

    A pad is an element's input or output.

    A pad can have any of three availabilities: always, sometimes and on request.

  • Bin & Pipeline:

    A bin is a container for elements.

    A pipeline is a special subtype of a bin that allows scheduling of the contained elements.

  • Bus:

    A bus is a simple system that takes care of forwarding messages from the pipeline threads to an application in its own thread context

    Every pipeline contains a bus by default

  • Use a bus(two methods)
    1. Run a main loop and attach some kind of watch to the bus using gst_bus_add_watch () or gst_bus_add_signal_watch ()
    2. Get the messages on the bus using gst_bus_peek () and/or gst_bus_poll ()

      Message types:
Name                                           Details
Error, warning and information notifications   used by elements if a message should be shown to the user about the state of the pipeline
End-of-stream notification                     emitted when the stream has ended
Tags                                           emitted when metadata was found in the stream
State changes                                  emitted after a successful state change
Buffering                                      emitted during caching of network streams
Element messages                               special messages that are unique to certain elements and usually represent additional features
Application-specific messages                  any information on those can be extracted by getting the message structure (see above) and reading its fields
  • Buffers & Events:

    The data flowing through a pipeline consists of a combination of buffers and events
    • Buffers contain the actual media data
    • Events contain control information, such as seeking information and end-of-stream notifiers

  • A buffer consists of:

Number   Name        Description
1        Pointer     a pointer to a piece of memory
2        Size        the size of the memory
3        Timestamp   a timestamp for the buffer
4        Refcount    a refcount indicating how many elements are using this buffer; it is used to destroy the buffer when no element holds a reference
5        Flags       buffer flags

Get more information in the GStreamer Application Development Manual.