\input texinfo @c -*- texinfo -*-

@settitle Video Hook Documentation
@titlepage
@sp 7
@center @titlefont{Video Hook Documentation}
@sp 3
@end titlepage


@chapter Introduction


The video hook functionality is designed (mostly) for live video. It allows
the video to be modified or examined between the decoder and the encoder.

Any number of hook modules can be placed inline, and they are run in the
order that they were specified on the ffmpeg command line.

Four modules are provided and are described below. They are all intended to
be used as a base for your own modules.

Modules are loaded using the @option{-vhook} option to ffmpeg. The value of this
parameter is a space-separated list of arguments: the first is the module name,
and the rest are passed as arguments to the Configure function of the module.
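
The general form of an invocation therefore looks like the following. The
module path and file extension shown here are placeholders; they depend on
your platform and on where the hook modules were built.

@example
ffmpeg -i input.avi -vhook '/path/to/module.so arg1 arg2' output.avi
@end example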

@section null.c

This does nothing, or almost nothing: it converts the input image to RGB24 and
then converts it back again. It is meant as a sample that you can use to test
your setup.
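
For example, assuming the module was built as vhook/null.so (the exact path
and extension depend on your build), a simple pass-through run could look
like this:

@example
ffmpeg -i input.avi -vhook vhook/null.so -acodec copy output.avi
@end example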

@section fish.c

This implements a 'fish detector'. Essentially it converts the image into HSV
color space and tests whether more than a certain percentage of the pixels fall
into a specific HSV cuboid. If so, the image is saved into a file for
processing by other code.

Why use HSV? It turns out that HSV cuboids represent a more compact range of
colors than would an RGB cuboid.
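
The module is loaded like any other hook. The invocation below is only a
sketch: it assumes the module was built as vhook/fish.so and that its built-in
defaults are acceptable; the options it actually understands are defined in
its Configure function.

@example
ffmpeg -i input.avi -vhook vhook/fish.so -acodec copy output.avi
@end example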

@section imlib2.c

This module implements a text overlay for a video image. Currently it
supports a fixed overlay or reading the text from a file. The string
is passed through strftime so that it is easy to imprint the date and
time onto the image.

You may also overlay an image (even a semi-transparent one), as TV stations do.
You may move either the text or the image around your video to create
scrolling credits, for example.

Fonts are searched for in the directories listed in the FONTPATH environment
variable.

Options:
@multitable @columnfractions .2 .8
@item @option{-c <color>}     @tab The color of the text
@item @option{-F <fontname>}  @tab The font face and size
@item @option{-t <text>}      @tab The text
@item @option{-f <filename>}  @tab The filename to read text from
@item @option{-x <expression>} @tab X coordinate of text or image
@item @option{-y <expression>} @tab Y coordinate of text or image
@item @option{-i <filename>}  @tab The filename to read an image from
@end multitable

Expressions are functions of these variables:
@multitable @columnfractions .2 .8
@item @var{N} @tab frame number (starting at zero)
@item @var{H} @tab frame height
@item @var{W} @tab frame width
@item @var{h} @tab image height
@item @var{w} @tab image width
@item @var{X} @tab previous x coordinate of text or image
@item @var{Y} @tab previous y coordinate of text or image
@end multitable

You may also use the constants @var{PI} and @var{E}, as well as the math
functions available in the FFmpeg formula evaluator (@url{ffmpeg-doc.html#SEC13}),
except @var{bits2qp(bits)} and @var{qp2bits(qp)}.

Usage examples:

@example
   # Remember to set the path to your fonts
   FONTPATH="/cygdrive/c/WINDOWS/Fonts/"
   FONTPATH="$FONTPATH:/usr/share/imlib2/data/fonts/"
   FONTPATH="$FONTPATH:/usr/X11R6/lib/X11/fonts/TTF/"
   export FONTPATH

   # Bulb dancing in a Lissajous pattern
   ffmpeg -i input.avi -vhook \
     'vhook/imlib2.dll -x W*(0.5+0.25*sin(N/47*PI))-w/2 -y H*(0.5+0.50*cos(N/97*PI))-h/2 -i /usr/share/imlib2/data/images/bulb.png' \
     -acodec copy -sameq output.avi

   # Text scrolling
   ffmpeg -i input.avi -vhook \
     'vhook/imlib2.dll -c red -F Vera.ttf/20 -x 150+0.5*N -y 70+0.25*N -t Hello' \
     -acodec copy -sameq output.avi
@end example
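
Two further sketches, using only the options and variables documented above
(the module path and font file are the same assumptions as in the examples
above):

@example
   # Current time stamp in the top-left corner
   ffmpeg -i input.avi -vhook \
     'vhook/imlib2.dll -c white -F Vera.ttf/16 -x 10 -y 10 -t %H:%M:%S' \
     -acodec copy -sameq output.avi

   # Logo image pinned to the center of the frame
   ffmpeg -i input.avi -vhook \
     'vhook/imlib2.dll -x (W-w)/2 -y (H-h)/2 -i /path/to/logo.png' \
     -acodec copy -sameq output.avi
@end example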

@section ppm.c

This module is basically a launch point for a PPM pipe: you can use any
executable (or script) that reads a PPM stream on stdin, writes a PPM stream
to stdout, and flushes its output after each frame.

Usage example:

@example
ffmpeg -i input -vhook "/path/to/ppm.so some-ppm-filter args" output
@end example
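
As a first sanity check you can use a plain pass-through command. The sketch
below assumes that cat passes the stream through unmodified and quickly enough
for your frame rate; the module path is a placeholder as above.

@example
ffmpeg -i input -vhook "/path/to/ppm.so cat" output
@end example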

@bye