MultiDyne – openGear Live & Online Sept 29, 2020

Jesse Foster:
I have some use-case drawings to go through, and then I'll highlight the card I was referring to earlier on the panel, the OG-4600. It's quite the product, and we're solving a lot of problems out there with it currently.

So just to give everybody an overview of what we're up to: we're really strong in venues, stadiums, and outside broadcasts, and we're continuing to grow our presence in military and government installations, as well as streaming and IPTV. On the application side, this presentation will touch on OTT streaming codecs (HEVC monitoring is within that product line), PTZ/POV fiber systems and how we can integrate those with openGear, studio signal extension, which is a native capability of our openGear platform, as well as campus and facility interconnections. We can even do DWDM for metro and long-haul transport within our openGear line.

Here's a high-level view of our current offerings within openGear. On the fiber optic side, we have some embedder/de-embedder combo products. We have a quad 3G transport card set that has a crosspoint on it, so there are some failover capabilities; there's routing, and you could use it as a distribution amplifier. Then we have the OG-3600 and 4600 series. The 3600 is quad 3G, and the 4600 is quad 12G. So that's the one I've been referring to today so far.

On the compression side of products, we have an AVC encoder and a companion decoder. The decoder can be licensed to handle HEVC/H.265 monitoring applications. On the conversion side, we've got some infrastructure products for audio, A-to-D and D-to-A, and also a two-wire-to-four-wire / four-wire-to-two-wire converter card for intercom. Those cards were in play in the symphony rebuild I was referring to on the panel. Then there are the distribution amp products: we have a one-by-nine, a dual one-by-four with failover 3G product, and a quad one-by-four with failover capability, a 3G card that could be two one-by-eight DAs or, excuse me, a single one-by-16 DA on a single card.

Then we get down into analog distribution for genlock and composite analog signals. We also have a popular card on the openGear side of things: a dual one-by-four analog audio DA with remote gain, controllable from DashBoard. Then we have some one-by-eight AES balanced and unbalanced DAs. On the more sophisticated side of things, we have a dual test generator that you can upload a PNG file to for trouble-slide or slate generation. It has time code and audio outputs, and you can reference the time code to Network Time Protocol, so it's quite the cool device. It also has a bouncing-box motion pattern that you can insert to make sure your transmission path, whether it's satellite, terrestrial, or fiber, is active and there's not a frozen frame in line. Great for broadcast setup.

Then we have a multi-viewer card, a five-input auto-detect. I'll go ahead and jump to the next slide, because that card is highlighted in an application. What I'm showing here is multi-viewing at a remote site. It auto-detects from 1080p all the way down to NTSC or PAL composite analog. It's very low latency, high quality. It has closed caption decoding, on-screen decoding, 16 channels of audio meters, character generators, and time code readers, all on screen.

Then we can take the 1080p mosaic multi-view out of that card into our AVC encoder, which is low latency, for delivery over the internet or problematic networks, where we can use the SRT protocol or Zixi to recover any lost packets and give you seamless, broadcast-quality video on the output side, over the internet in this example here. Here's a companion decoder card in play: you get 3G-SDI and HDMI out simultaneously. There are some companion standalone products that are essentially standalone versions of these cards, so you can use a standalone on the remote side and openGear in the facility, or vice versa.

Another application drawing shows how the streaming encoder would be used with the 4400 fiber receiver card with the crosspoint to accept, in this scenario, four camera feeds. They're all genlocked upstream via our camera transport product, so they're synchronous; take them into this four-by-one router card here, come out coax 1080p, and feed it to the encoder. And this encoder can feed output over IP to multiple destinations simultaneously, so you can do Facebook, YouTube, Twitch, CDNs, Amazon Web Services, Akamai, Microsoft Azure. A very capable streaming encoding product, with native support for UDP, RTP, RTMP, RTSP, and HLS, using SRT or Zixi for that packet-loss protection capability.
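As a rough illustration of what that SRT contribution step looks like, here is a minimal sketch of building caller-mode SRT URLs for several destinations from one input. The host names and the helper function are hypothetical, not MultiDyne's actual API; the `latency` query parameter is the standard SRT setting that sizes the receive buffer the protocol uses to retransmit lost packets before handing video to the decoder.

```python
# Hypothetical sketch: fanning one encoder input out to several
# SRT destinations. Hosts, ports, and the helper name are
# illustrative; "mode" and "latency" are real SRT URL parameters.

def srt_url(host: str, port: int, latency_ms: int = 120, mode: str = "caller") -> str:
    """Build an SRT URL; latency_ms is the retransmission buffer depth."""
    return f"srt://{host}:{port}?mode={mode}&latency={latency_ms}"

# One synchronous 1080p input can feed multiple destinations at once.
destinations = [
    srt_url("cdn-ingest.example.com", 9000, latency_ms=200),
    srt_url("backup-ingest.example.com", 9001, latency_ms=200),
]
```

A larger latency buys more retransmission attempts on a lossy internet path at the cost of end-to-end delay, which is the trade-off the speaker alludes to when calling the encoder low latency but loss-protected.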

So now, moving into the fiber side of things, specifically the OG-4600 series. This card is scalable; it's an order-time consideration, what you want to be able to achieve with it. But I essentially have the extremes of those configurations represented here on this slide. The fully stuffed version is quad 12G, 6G, 3G, HD, ASI capable. All those signals are discrete and CWDM-muxed, so if you have any metadata, audio, or HDR, it's all non-destructive; it's just going to take that signal and transport it for you. That top product there also has CVBS for legacy teleprompter applications, or anything you might need composite analog for.

You see there's genlock, time code, eight-by-eight analog or AES audio. You have serial data, tally and GPIO, and gigabit ethernet. So this card was heavily used in that symphony rebuild I was referring to, where, just opportunistically over the fiber that was doing a good amount of 12G transport, we also got the two different locations connected with gigabit ethernet, and serial for in-band DashBoard. You know, you can just plug your openGear frame controller into the network and bring it back over that fiber as well.

So you see it's all bi-directional signal transport over that single fiber. Within this line, you can also get it video-only, all over the single fiber, and the video paths can be unidirectional or bi-directional. I just happen to be showing two going in each direction on this card here, but they could all be going one way or the other as well.

So, an application for this series of products: we have the ability to cascade and do CWDM multiplexing on board the cards, without needing any external mux. Whether it's an openGear card or a 1 RU standalone CWDM mux, we don't need those for this. We can actually take the output of the upstream card into the next card's expansion port and add four wavelengths each time we do that.
I happen to be showing 16 videos here. Those are all 12G, and you can also order this card as just a gigabit ethernet transport, so I'm showing a GigE mixed in with 16 12G-SDI videos. Those 12G-SDI videos could be bi-directional; I just happen to be showing unidirectional here. But you see that you have a very small footprint, high-density CWDM kit in openGear cards, with no other products needed to pull that off.
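To make the cascading arithmetic concrete, here is a small sketch of how four cascaded cards could each claim four channels from the ITU CWDM grid (20 nm spacing, 1271 to 1611 nm, per ITU-T G.694.2) to carry 16 signals on one fiber. The channel assignment per card is illustrative only, not MultiDyne's actual wavelength plan.

```python
# Sketch of cascaded CWDM allocation: each card muxes four wavelengths
# and feeds its output into the next card's expansion port.
# Grid values are the 18 ITU-T G.694.2 CWDM center wavelengths (nm);
# the per-card grouping below is hypothetical.

CWDM_GRID_NM = list(range(1271, 1612, 20))  # 1271, 1291, ..., 1611

def wavelengths_for_cards(num_cards: int, per_card: int = 4) -> list[list[int]]:
    """Assign per_card contiguous grid wavelengths to each cascaded card."""
    needed = num_cards * per_card
    if needed > len(CWDM_GRID_NM):
        raise ValueError("not enough channels on the CWDM grid")
    return [CWDM_GRID_NM[i * per_card:(i + 1) * per_card]
            for i in range(num_cards)]

# Four cascaded quad-12G cards -> 16 wavelengths on a single fiber.
plan = wavelengths_for_cards(4)
```

The 18-channel grid also shows the ceiling the speaker is working within: four cards of four wavelengths each fits, with two grid channels to spare for something like that GigE transport.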

So here is the OG-4600, the OG-4602 in particular. This part number indicates that you have your quad 12G transport and your eight audio, all the signal types I referenced earlier, and these are working in conjunction with that OG Comms two-wire/four-wire converter card. We support RTS or Clear-Com. That means you could take party line signals; with RTS there are two intercom channels in one party line, and with Clear-Com there's a single channel. And those are wet by design, meaning that they have power on them a lot of the time. So this card on the input side will take party line and give you four-wire, just line-level audio out, which we can then transport over fiber. Then on the far side, we can take it back to two-wire party line and power belt packs. Party line means you loop the signal through multiple belt packs, so each person has a belt pack and they can talk and listen on the party line. And that requires that wet signal to be recreated, which this product does.

This slide is showing the OG-4600 working in conjunction with our standalone products. A key point I wanted to make here is how openGear fits within our line. Number one, it's standards-based, so all of our partners can leverage our openGear solutions. The signals that we use … it's the ITU CWDM spectrum, which is defined by a standards body, and then we have SMPTE signals. So everything is open to working with others; there are no proprietary signal types being used here. We're very flexible, and we lend ourselves to other people's designs quite often.

What we're showing here, though, is openGear in a rack-mount configuration working with our standalone VB series of products, which are quarter-rack, third-rack, and half-rack width, with modular cards inside. And we can build up, for example, what you see right here: the perfect companion to the OG-4600, doing the mirror functionality on the far side to get a nice integrated system. The previous slide showed openGear to openGear; now this is openGear to VB.

This box right here is our Juice power supply, which takes a single-mode fiber, which the OG-4600 card works with, and converts it to a SMPTE hybrid fiber, which we can then power our VB boxes over. You could also just go single-mode to single-mode and power this locally, no problem. But once we're in this configuration, we can do a pigtail-powered PTZ camera leveraging this power connection, or we can do PoE++ for in-band powering on the ethernet connection to next-generation PTZ cameras. And what we're showing here is the standalone equivalent of the openGear OG Comms product: same functionality, but in a standalone form factor for the remote edge.

So, our Silverback 5 product: this is our latest generation of camera extenders for 4K and 8K cameras, and it's also been integrated with our openGear line, which I'll show you next. As far as cameras that we support, it's all about cinema cameras, say the Sony Venice or the Arri Amira. So large single-chip imagers, large-format cameras, not the three-chip, two-thirds-inch imager broadcast cameras. We support those as well, but being able to take our system, augment a cinema camera, and turn it into a television production workflow is what we're doing with our system here.

So openGear applies to that story here in that our base station has four openGear slots. Seeing the power and the flexibility of openGear, we allow for four cards to sit within the redundant power and cooling capability of the Silverback 5 base station.

You could add Raptor gateway products to this. You can put a multi-viewer in there, a test generator, additional fiber to relay the signal multiple kilometers, whatever you might need to do. So that's the story here; it's just a powerful value-add for that rack space that you're using for the base station. But what's coming out soon is a quad 12G on the camera adapter product line, which means it can then work in conjunction with the OG-4600 card.

So, it's a similar story to the VB on the far edge, but this is working with cinema/ENG-type cameras. Rookie mistake, sorry about that. So basically, the native capabilities of the OG-4600 are now available within the Silverback, so you can start using them together. And if you have this system, you actually get three camera chains of capacity in 2 RU. The dual-channel base station I just had on the previous slide is capable of two discrete camera chains; this will actually give you three camera chains up to 8K, using quad 12G per camera chain, and leave two slots open.
So it's a very dense system configuration for 8K, and it's future-protected. You can do single 12G today, and you can do quad 3G for your UHD cameras. So if you bought into this system today, you can then migrate to 8K all the way from quad 3G, from a previous camera like the Sony F55, for example.

So that is the major openGear side of things, but I [inaudible 00:16:17] all wanted to touch on. We see that DashBoard and openGear are extremely valuable approaches, like Kevin Ansilon was saying earlier; it's like a block in the football analogy, to then run your solutions in there alongside other partners. So DashBoard Connect has been enabled on our audio and signal monitoring products, and that's why they're getting a little shout-out here. Further to that, we have a MADI audio monitor, so you can see all 64 channels of your MADI signal. It can take in coaxial or optical MADI inputs, and it can also output one or the other.

So, if you come in optical, you can drop a coaxial copy, or vice versa, which is a nice little value-add there. The ability to listen to any channel and see them all is a very handy tool to have. And then, in DashBoard Connect, you have some controls and some basic signal status indications. This is all evolving, so we'll be continuing to add more monitoring capabilities via DashBoard. Then we have the standard 16-channel embedded audio monitors in 1 RU and 2 RU, with some discrete audio capacity here. This is a popular newer version, which integrates a confidence video monitor and the audio capabilities in one, and then there's a Dolby-enabled version. So, Cindy, I think I went faster than I had planned.

Cindy:
You're all good, you're all good. We do have a couple of questions, though. One of the questions was about streaming, and I think that goes back to your earlier slides: how many destinations can you stream to?

Jesse Foster:
This encoder product has two encoder blocks internally, and each encoder block is capable of streaming to 10 destinations, so 20 destinations from a single input. You can do different resolutions; there's some scaling capability in there. So, on the emission end of the product it's very powerful; you can go to 20 destinations.

Cindy:
And is that usually plenty? Are people wanting to go to more than 20 destinations? What's the typical application that you see out there?

Jesse Foster:
That's more than enough in my experience. If you're going to be going to any of these destinations that I'm showing here, whether it's Facebook or YouTube or Twitch, those are one-to-many by nature, because people log onto their servers and get a copy of it; they each see it discretely. So it gets copied; it could be millions of people looking at the stream. The CDNs are a similar story. You contribute once, and then they copy it out to whoever needs copies, at different protocols, different bit rates, different end-user requirements; that's what content distribution networks are for. So those servers in the middle would be utilized for those workflows, but 20 destinations from a single card is more than I've ever seen needed.

Cindy:
We have a question here from someone building a new studio, and they ask, "What type of 12G pitfalls should I look out for in my new studio build?"

Jesse Foster:
Okay, yeah. Mostly my mind goes to the distance limitations of that signal over coax. There really is a breaking point, whether it's 120 meters or some other figure depending on the capabilities of the different types of coax, or specialty coax that gets up to that level, before you start seeing jitter issues and whatnot. But I think it's better to err on the side of putting fiber in sooner rather than later. I mean, SMPTE 2082 is 12G; is it 2083 that's 24G? You know, so it's coming. Not anytime soon, but SMPTE's already defined it. So having fiber in there is your future protection. That's our attitude. Even if you put in fiber today for the 12G infrastructure, if you go IP at 25 gig, 40 gig, 100 gig, 400 gig, whatever, you know that fiber is going to be there to evolve with your needs.

Cindy:
You were talking about the cascading CWDM for multiplexing and how you can add four wavelengths as you go. I was curious about the applications you've seen for that, and who's using that type of setup right now.

Jesse Foster:
Yeah, it's really just a design criteria choice. I mean, because again, it's ITU-specified CWDM, so you could take independent wavelengths out of the card and mux them in a Ross CWDM multiplexer/demultiplexer set or another external one. But if you're in a very space-constrained environment, or you want fewer interconnections made with jumpers and whatnot, that's where this configuration shines, because you're just taking a single fiber from each card and connecting it to the downstream card. So it's a very small footprint and fewer points of failure; even though the muxes are passive, you could bump a cable or fiber jumper or something like that. It's a very clean and tidy way of doing it.

Cindy:
Another question we have here is, "We're putting in a system at a house of worship and thinking about fiber; what would you recommend?" It ties in a little bit with what you were just talking about.

Jesse Foster:
Sure, yeah. So house of worship has really taken off at the high end. You know, it's always been an AV sector, pulling off what you can with as little as possible, but we've seen a trend of those end users, house of worship customers, putting in something like the Sony FX9, a 4K camera that does 12G, or even the Venice for the main camera, and then FX9s around it, or VariCam LTs, Canon C700s.
Once you're doing that, you're running 12G or quad-link or dual 6G, and distances come into play pretty quickly at large-venue churches. And then they have campuses where they're going to connect to a master control room, for lack of a better term; they are a master control room, they're just emitting to the web as opposed to a transmitter site. In that case, you'd ideally use SMPTE 311 hybrid fiber, because then you can do camera power from our system to power these cameras that are in hard-to-reach areas up in … It could be PTZ, it could be a camera with an operator, right? So in any event, we'd recommend fiber, whether it's the single-mode infrastructure and the Juice boxes I showed, or just native 311 from our base stations to the cameras. That's what we would recommend.

Cindy:
I mean, who wouldn't do fiber at this point? Is there a downside to doing fiber? With everything you're talking about, it sounds like it's almost a no-brainer.

Jesse Foster:
Yeah. I mean, the costs have come down over the years, so it really is a marginal increase to do things over fiber, and then you get the inherent benefit of electromagnetic-interference-free transmission. There's no hum, and nothing can get induced into the copper. If you're running those SMPTE 311 fiber cables alongside traditional copper audio signals and whatnot, there are issues that could arise. Actually, we have a use case in Georgia, a church that has a studio on both sides of the street, and they were getting lightning strike issues. They had Neutrik-connectored SMPTE 311 hybrid fiber running between the two, and a lightning strike actually came in and zapped building B when building A got hit.

So now they're using our Hut product, which … it's a relative of that Juice, and they just have the optical connection going across the street, so it's lightning-isolated. Let's make act-of-God jokes…

But it's electrically isolated, and they just do the power on the far side using our Juice box, and then all the signal goes back and forth over fiber. So that's a real-life benefit of using optical transport.

Cindy:
I mean, how many facilities out there, whether it's house of worship or sports, have that situation where they've got multiple venues, multiple buildings? Is that something you see a lot?

Jesse Foster:
Yeah, more often than not, honestly. Whether it's a campus or … The term campus applies to, obviously, schools, large institutions of learning; that's a big customer base of ours, whether it's Duke University, I'm going to just name them, right? But then on the enterprise side of things, they have large campuses where they're sharing feeds between conference rooms and studios and whatnot. So that's our thing, you know, doing the campus extension. More often than not, you're going to need to share signals beyond the reach of copper limitations.

Cindy:
So, those kinds of non-broadcasty things that you're talking about … house of worship and, I guess, live sports, which sort of still fits in broadcast, depending. Are they also using your fly packs and some of the things that you showed on the drawings? Where do the fly packs come into play these days? [crosstalk 00:26:22]

Jesse Foster:
Yeah, anytime you need to move anything. I have other slides from other presentations that show outside broadcast trucks, but you can pretty much emulate what you do in there in a fly pack, right? So you have this modularity, the ability to deploy it where advantageous. But just to call back to my friend Kevin Ansilon earlier: he had mentioned that traditional terrestrial broadcast is almost dead and it's all about OTT and streaming now, right? So you can have all of your transmission, your terrestrial optical transmission, in the same rack frame as your final emission encoder, and you could hit the web all in one Pelican case. So again, that's another popular solution that we're working with people on.

Cindy:
When you were talking earlier, you showed an OTT backhaul with Zixi or SRT. Is everybody just moving to an OTT model in these cases, or…?

Jesse Foster:
Yeah. I mean, the cost savings, as opposed to getting a dedicated fiber line or ethernet path with deterministic performance; that's what the old model would be. You would go to your internet service provider or telco and get a provisioned path of bandwidth, whether it was fiber-based or IP. That's very expensive, and you have to be set up; sometimes they have customer premise equipment that they want to drop off at either end. Contracts, tariffs, all this craziness … big-money, heavy-iron stuff that you don't need anymore with these protocols, like RIST. RIST is coming into our platform soon as well. So those technologies allow you to leverage "over the top" of the existing internet to get what you need to get done, and get the performance that you would get over a dedicated line.
