Live Motion Graphics

A place to discuss Webcasting
nerddigestive
Posts: 3
Joined: Sun Mar 30, 2014 4:33 am

Live Motion Graphics

Post by nerddigestive » Sun Mar 30, 2014 4:35 am

I am a motion graphics artist and I use an ATEM switcher and control desk by Blackmagic. I need to somehow set up a video feed that allows me to make a motion graphic in After Effects, render it, and then add text on the fly or receive live input such as from a Twitter feed. What software do I need to do this, or must I make my own using something like Pure Data?

PID_Stop
Posts: 476
Joined: Thu Apr 01, 2010 11:58 am
Location: Syracuse, New York

Re: Live Motion Graphics

Post by PID_Stop » Mon Mar 31, 2014 12:14 pm

First, welcome to the forum!

I'm not exactly clear on what you're trying to do here -- if I'm reading your post correctly, it sounds like you're trying to display text that changes in real time over an animated background that you compose with After Effects, right? If it's that simple, I would think the easiest solution would be to use PowerPoint: build the motion graphic onto a slide, and set it up to display the text from an RSS feed on top of the motion graphic. It should be a fairly trivial exercise to write an app that puts your text into RSS format (assuming that it isn't already).
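For what it's worth, here's a rough sketch of the kind of text-to-RSS app I mean, in Python using only the standard library. The feed title, file names, and the idea of reading one message per line from a text file are just placeholders for illustration:

Code:

# Minimal sketch: wrap lines of plain text into an RSS 2.0 feed that an
# RSS-aware display (PowerPoint, a web widget, etc.) can poll.
# Assumptions: input is a plain text file with one message per line;
# the feed title and file paths are placeholders.
import xml.etree.ElementTree as ET
from email.utils import formatdate

def text_to_rss(lines, feed_title="Live graphic text"):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    ET.SubElement(channel, "link").text = "http://localhost/"
    ET.SubElement(channel, "description").text = "Text pushed to the live graphic"
    for line in lines:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = line
        ET.SubElement(item, "pubDate").text = formatdate()  # RFC 2822 timestamp
    return ET.tostring(rss, encoding="unicode")

if __name__ == "__main__":
    with open("messages.txt", encoding="utf-8") as f:
        lines = [l.strip() for l in f if l.strip()]
    with open("feed.xml", "w", encoding="utf-8") as f:
        f.write(text_to_rss(lines))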

On the other hand, if the motion graphic is something like a banner for a lower-third CG, and you're keying that and text over another live video feed through the ATEM, things get more complex pretty quickly. You should be able to store the motion graphic in an ATEM media store (assuming that you have a model other than the Television Studio, which I think only supports stills). But you still need to generate the text in a way that lets you key it, which implies some sort of character generator that feeds a key and fill signal to the ATEM.

One approach might be to add something like a Ross MC1 card downstream of the ATEM: it's essentially a mixer/keyer card with some enhancements to make it a master control switcher. You can run it as a standalone device (it's usually paired with a Ross NK routing switcher for master control applications), and it incorporates several internal frame buffers with keyers. Ross xPression character generator software can push composited text to one or more of the frame buffers, so the finished text will key over your live video (and banner). Depending on the version of xPression you get, you can link text fields to various sorts of data sources... if you get the Developer version, you can write Visual Basic code to explicitly control the CG functions in really powerful ways. That's actually how we air school closings at our regional facility: we have a single computer running xPression Developer and the VB app I wrote, which pushes real-time closings displays to the MC1 master control cards at each of our stations.

Doubtless there are other approaches... if we have a better idea of what you're looking to do, we can likely help you narrow things down to a practical solution.

Regards,

-- Jeff

nerddigestive
Posts: 3
Joined: Sun Mar 30, 2014 4:33 am

Re: Live Motion Graphics

Post by nerddigestive » Tue Apr 01, 2014 11:41 am

Thank you :)

Most of the graphics will be lower thirds, so I am thinking of trialling a system using CasparCG as a CG server and client setup. I don't know if you know of it, but it essentially uses Flash to allow the text to be dynamic and scriptable. I will probably send this to the ATEM and then either use a separate fill and key source or simply chroma key it. Does this sound like a vaguely sensible system?

PID_Stop
Posts: 476
Joined: Thu Apr 01, 2010 11:58 am
Location: Syracuse, New York

Re: Live Motion Graphics

Post by PID_Stop » Tue Apr 01, 2014 12:15 pm

nerddigestive wrote:Thank you :)

Most of the graphics will be lower thirds, so I am thinking of trialling a system using CasparCG as a CG server and client setup. I don't know if you know of it, but it essentially uses Flash to allow the text to be dynamic and scriptable. I will probably send this to the ATEM and then either use a separate fill and key source or simply chroma key it. Does this sound like a vaguely sensible system?
That sounds like a practical solution. I would suggest using the separate key / fill signals rather than chromakey, for several reasons. From an operator's perspective, setting up a chromakey is more cumbersome than a traditional external key. From a content point of view, using chromakey means you can't include the background color in your graphic because it will disappear. We have some experience with Blackmagic Decklink cards, and they should do just fine for what you need. The ATEM has a frame synchronizer on every input, so you don't need to worry about feeding a reference signal to the Decklink card.
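In case it helps, here's roughly what driving a CasparCG Flash template looks like from a client over its AMCP protocol, as a Python sketch. I'm assuming the default AMCP port (5250); the channel/layer numbers, the template name, and the "f0" field ID are placeholders that your own template and server config would define:

Code:

# Rough sketch of an AMCP client that loads a Flash lower-third template
# and updates its text field on the fly. Host/port are CasparCG defaults;
# channel/layer, template name, and the "f0" field ID are placeholders.
import socket
from xml.sax.saxutils import escape

HOST, PORT = "127.0.0.1", 5250   # default AMCP port

def amcp(sock, command):
    """Send one AMCP command and return the server's response."""
    sock.sendall((command + "\r\n").encode("utf-8"))
    return sock.recv(4096).decode("utf-8", errors="replace")

def template_data(text):
    """Build the templateData XML a Flash template expects, with AMCP-escaped quotes."""
    value = escape(text, {'"': '&quot;'})
    return ('<templateData><componentData id=\\"f0\\">'
            '<data id=\\"text\\" value=\\"' + value + '\\"/>'
            '</componentData></templateData>')

with socket.create_connection((HOST, PORT)) as sock:
    # Load and play the template on channel 1, layer 10, CG layer 1
    print(amcp(sock, 'CG 1-10 ADD 1 "lower_third" 1 "%s"' % template_data("Hello world")))
    # Later: update the text live (e.g. from a Twitter feed)
    print(amcp(sock, 'CG 1-10 UPDATE 1 "%s"' % template_data("Breaking: new text")))
    # ...and take it off air
    print(amcp(sock, 'CG 1-10 STOP 1'))

If I remember correctly, setting the DeckLink consumer's keyer to external gives you fill on one output and key on the other, which is exactly what the ATEM wants for a traditional linear key -- but double-check that against the CasparCG documentation for your particular card.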

Good luck!

-- Jeff

nerddigestive
Posts: 3
Joined: Sun Mar 30, 2014 4:33 am

Re: Live Motion Graphics

Post by nerddigestive » Wed Apr 02, 2014 6:31 am

PID_Stop wrote:That sounds like a practical solution. I would suggest using the separate key / fill signals rather than chromakey, for several reasons. From an operator's perspective, setting up a chromakey is more cumbersome than a traditional external key. From a content point of view, using chromakey means you can't include the background color in your graphic because it will disappear. We have some experience with Blackmagic Decklink cards, and they should do just fine for what you need. The ATEM has a frame synchronizer on every input, so you don't need to worry about feeding a reference signal to the Decklink card.

Good luck!
I think we have a DeckLink card in the machine we would run it on -- we certainly have some kind of input/output card! It isn't possible to just run it out of a normal graphics card with multiple monitor outputs, is it, just while we test it before we rig it into our system?

~ Ed

PID_Stop
Posts: 476
Joined: Thu Apr 01, 2010 11:58 am
Location: Syracuse, New York

Re: Live Motion Graphics

Post by PID_Stop » Wed Apr 02, 2014 12:11 pm

nerddigestive wrote:I think we have a DeckLink card in the machine we would run it on -- we certainly have some kind of input/output card! It isn't possible to just run it out of a normal graphics card with multiple monitor outputs, is it, just while we test it before we rig it into our system?

~ Ed
That should work for initial testing, but you will ultimately need a DeckLink or Bluefish card to feed the signal to your ATEM. CasparCG uses somewhat unintuitive terms: in particular, a "producer" is a part of the system that generates content (reasonable enough), while a "consumer" is the physical device that makes the content available for display (confusing, I think). CasparCG has a 'DeckLink consumer', a 'Bluefish consumer', and a 'screen consumer' -- the latter sends program content to a PC graphics card output. An excerpt from the FAQ:
What is a “producer?”

A technical term for the CasparCG components that render and play media such as video, animations, images and audio.
A producer listens for commands and data sent from a client controller application, loads and renders that media and then sends it to a "consumer" that displays that media in a variety of ways.

CasparCG currently ships with five producers:
1. “FFmpeg Producer” that can play all media that FFmpeg can play, which includes many QuickTime codecs such as Animation, PNG, PhotoJPEG, MotionJPEG, as well as H.264, WMV and several audio codecs, including uncompressed audio.

2. “Flash Producer” that uses Adobe’s Flash Player to play SWFs and FLVs, including full control over all dynamic content. You can even load multiple layers that can be controlled independently.

3. “Image Producer” that displays bitmaps with alpha channel support.

4. “Color Producer” that generates a solid RGB color.

5. “DeckLink Producer” that accepts input video and audio.

What is a “consumer?”

A technical term for the CasparCG components that receive playing media such as video, animations, images and audio from a “producer” and display that media on a specified output.

CasparCG currently ships with three consumers:
1. “Bluefish Consumer” that outputs the playing media on to video cards from Bluefish Technologies with full support for all PAL SD and HD resolutions, frame rates, pixel aspect-ratios, including support for separate fill + key + embedded audio.
All producers can output their media to the “Bluefish Consumer.”

2. “DeckLink Consumer” that outputs the playing media on to video cards from BlackMagic Design with full support for all PAL SD and HD resolutions, frame rates, pixel aspect-ratios, including support for separate fill + key + embedded audio.
All producers can output their media to the “DeckLink Consumer.”

3. “Screen Consumer” which displays the media in a standard Windows computer window, either windowed or fullscreen on one or several monitors.
You can read the whole thing here.
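One more practical note: you don't need to restart the server to go from monitor-only testing to the DeckLink output, because consumers can be added and removed at runtime over AMCP. A rough Python sketch, again assuming the default port; the channel number and DeckLink device number are placeholders:

Code:

# Sketch: add/remove CasparCG consumers at runtime over AMCP, so you can
# test on a normal monitor (screen consumer) and later switch to the
# DeckLink output feeding the ATEM. Port, channel, and device numbers
# are defaults/placeholders -- adjust to your setup.
import socket

def amcp(command, host="127.0.0.1", port=5250):
    with socket.create_connection((host, port)) as sock:
        sock.sendall((command + "\r\n").encode("utf-8"))
        return sock.recv(4096).decode("utf-8", errors="replace")

# While testing on the PC's own graphics card:
print(amcp("ADD 1 SCREEN"))

# Once the DeckLink card is in and cabled to the ATEM (device 1 here):
print(amcp("ADD 1 DECKLINK 1"))
print(amcp("REMOVE 1 SCREEN"))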

Regards,

-- Jeff

Baylink
Posts: 438
Joined: Wed Dec 15, 2010 3:21 pm

Re: Live Motion Graphics

Post by Baylink » Sun Jun 22, 2014 12:04 pm

Sounds to me like Caspar is based on GStreamer, which I think uses similar if not identical nouns.

Baylink
Posts: 438
Joined: Wed Dec 15, 2010 3:21 pm

Re: Live Motion Graphics

Post by Baylink » Thu Mar 05, 2015 9:46 am

More to the point, if you use a chroma- or luminance self-key off the signal, you can't do variable transparency in your shadows and the like; you can only do that with a separate key-signal channel, which I am pretty sure (without looking again) Caspar knows how to do.
