Discussion:
Cinema automation cues
Ricardo Costa
15 years ago
Hey there everyone,

I'm connecting my desktop to my digital video projector to run features and
shorts programs. Is there any way to tag or embed a theora video file with
any sort of cues at certain points in the program or individual file
(feature start, end credits, feature end, etc.) that can then be turned into
pulses for theater automation systems to cue lights up, lights down, sound
format switch, etc.? Thanks for any help.
--
Ricardo M. Costa
Film & HD Video Projection
(323) 491-7346
Benjamin M. Schwartz
15 years ago
Post by Ricardo Costa
Is there any way to tag or embed a theora video
file with any sort of cues at certain points in the program or
individual file (feature start, end credits, feature end, etc.) that can
then be turned into pulses for theater automation systems to cue lights
up, lights down, sound format switch, etc.? Thanks for any help.
Theora (or more accurately, the Ogg container) does not have a standard
system for coding these sorts of timed events. You can easily create one
yourself, though.

My recommendation is to use a separate file containing a labeled list of
timestamps within the video. If it is important that only one file be
used, you can encapsulate the contents of that file inside the Ogg file in
a Skeleton "message header". There are utilities and libraries available
for easily reading and writing these message headers.
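For illustration, such a separate cue file could be as simple as labeled timestamps, one per line. A minimal sketch of reading it, assuming a made-up "seconds, then a tab, then a label" line format (nothing here is standardized by Ogg or Theora):

```python
def parse_cues(text):
    """Parse lines like '6750.5<TAB>credits_start' into (time, label) pairs.

    The file format is an assumption for illustration, not anything
    defined by the Ogg or Theora specifications.
    """
    cues = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank lines and comments
        when, label = line.split('\t', 1)
        cues.append((float(when), label))
    return sorted(cues)  # sort by timestamp
```

Your playback-side code would then walk this list as the clock advances and fire each cue as its timestamp is reached.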

--Ben
o***@googlemail.com
15 years ago
Another way would be to embed these commands as text, time-interleaved
with the video. This could be done the way subtitles are, but with
textual commands instead of subtitle text. A custom category would tell
a player that the stream is not subtitles. I could show a possible
example if wanted.

In either case (embedded versus separate text file), you'd have to have
some code to interpret these commands and act on them when you read
them back. The format of these commands is up to you.
Ricardo Costa
15 years ago
So, would it be better to work with this on the player end, with whatever
playlist formats are available there? That seems logical for events
between files in a playlist (between shorts, or between the previews and
the feature), but for something like "lights-up-to-halflight" at the start
of the end credits of a feature, I didn't think a playlist would be
useful, since that's an event in the middle of a file (even if it is right
towards the very end).

Know what I mean? I'm on Linux, Ubuntu to be precise. What players would you
recommend to handle advanced options like this? I realize we might start
migrating off of the ogg topic right now, so my apologies ahead of time.

- Ricardo

On Fri, Jul 16, 2010 at 6:08 PM, Benjamin M. Schwartz <
...
--
Ricardo M. Costa
Film & HD Video Projection
(323) 491-7346
o***@googlemail.com
15 years ago
Post by Ricardo Costa
Know what I mean? I'm on Linux, Ubuntu to be precise. What players would you
recommend to handle advanced options like this? I realize we might start
migrating off of the ogg topic right now, so my apologies ahead of time.
GStreamer. It can do everything, even if it can be a right pain to debug.

Here, you could use gst-launch to demux embedded text commands to
stdout as playback progresses, and have a shell script read those and
run whatever program controls the light system.
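To sketch just the dispatch side of that (the gst-launch pipeline itself is omitted; the cue labels and the `lighting-control`/`sound-control` tools below are hypothetical stand-ins for whatever actually drives your automation):

```python
import subprocess

# Hypothetical mapping from cue labels to external automation commands.
# Both the labels and the CLI tools are placeholders, not real programs.
COMMANDS = {
    'lights_down':         ['lighting-control', 'down'],
    'lights_up_halflight': ['lighting-control', 'half'],
    'sound_format_switch': ['sound-control', 'switch'],
}

def command_for(label):
    """Return the automation command for a cue label, or None if unknown."""
    return COMMANDS.get(label.strip())

def run_cues(stream):
    """Read cue labels line by line (e.g. from the demuxer's stdout)
    and fire the matching external command as each one arrives."""
    for line in stream:
        cmd = command_for(line)
        if cmd is not None:
            subprocess.call(cmd)

# Usage would look like: run_cues(sys.stdin)
```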

If you go the playlist (or external file) way, then you'd typically
run the player and the parsing/timing code separately. You'd have to
keep the two in sync, though, if you're doing anything other than plain
play-forward.

For the text file embedded in a Skeleton message header, you'd use the
second method, but with a preprocessing step to extract that info (e.g.,
using oggz-info, which I think will print out these headers).
Silvia Pfeiffer
15 years ago
...
Why not use Ogg Kate and an srt file to include commands instead of
subtitles - then after decoding with libkate (or from gstreamer), you
can simply execute your commands with some other program.

srt could also be used as the external file type, but then you have to
do the synchronisation between the media file and the srt file
yourself.
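
Repurposed this way, an srt file might carry cue labels where the subtitle text would normally go; the labels here are illustrative only (the last one borrows Ricardo's own phrasing):

```
1
00:00:00,000 --> 00:00:01,000
feature_start

2
01:52:30,500 --> 01:52:31,000
lights-up-to-halflight
```

The player (or your dispatch code) would then treat each "subtitle" as a command to execute rather than text to display.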

(I think that's what oggk also means, but I wasn't quite sure, so
thought it worth clarifying).

Cheers,
Silvia.
o***@googlemail.com
15 years ago
Yes, it's what I meant. The neatest way would be to make your own codec,
though that'd be a fair amount more work (and you'd need to make a
plugin for whatever player you were using, too).
...