At first... it looked like a House of Cards

Approaching 2110

Direct link: graphics/video/House of Cards.mp4

If autoplay didn’t start with sound, press play or unmute — modern browsers block autoplay with audio unless muted first.
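To show what that fallback looks like in practice, here is a minimal sketch of the pattern browsers expect. The `tryAutoplay` helper name is my own invention for illustration; any object with a promise-returning `play()` and a `muted` flag (such as an HTMLVideoElement) will work:

```javascript
// Hypothetical helper: attempt unmuted autoplay, fall back to muted playback.
// Browsers reject play() with sound unless the user has interacted with the
// page, but muted autoplay is generally permitted.
async function tryAutoplay(video) {
  try {
    await video.play();   // succeeds only if the autoplay policy allows sound
    return 'unmuted';
  } catch (err) {
    video.muted = true;   // mute, then retry, as the note above describes
    await video.play();
    return 'muted';
  }
}
```

Wiring this to `document.querySelector('video')` on page load reproduces the behavior described above.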

Why?

First off, I’m not presenting myself as an expert in SMPTE 2110. This website was originally geared toward broadcast engineers within a few years of retirement who are trying to get up to speed on 2110 technology, enough so that they can make it to retirement. I've talked to several people with my background who have this dilemma. But in talking to others, I am finding it might have much wider appeal.

About a year ago, I tried looking around on the internet in hopes of finding a self-study way to come up to speed, but aside from some expensive guided courses, I found nothing affordable, or at least nothing that seemed worthwhile to me. What brought me back to 2110 was returning to the Grass Valley story: to tell the latest part of the GV story properly, I felt you have to understand the 2110 foundation from which it evolved.

I already understood the ancient analog chains, traditional SDI workflows, compressed video food chains, and even basic IT command-and-control concepts. But ST 2110, layered on top of everything, felt like a gated community with an entrance exam I couldn't pass.

ST 2110 is closely related to modern platforms like Grass Valley's AMPP family. AMPP inherits the IP-centric mindset of SMPTE ST 2110—transporting uncompressed (or lightly compressed) media essences over IP networks, with full support for standards like ST 2110-20/30/40 and NMOS control. Yet AMPP itself abstracts much of the underlying plumbing (PTP timing, multicast routing, firewall rules, etc.) behind a cloud-native, software-defined interface that makes it more accessible and elastic.
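To make "uncompressed media essences over IP networks" concrete, here is a back-of-the-envelope sketch (my own illustration, not from any Grass Valley documentation) of why a single ST 2110-20 video flow demands serious network capacity:

```javascript
// Approximate active-picture payload rate for an uncompressed ST 2110-20 flow.
// Ignores RTP/UDP/IP header overhead, so real link budgets run a few percent higher.
function st2110VideoBitrate(width, height, bitsPerPixel, fps) {
  return width * height * bitsPerPixel * fps; // bits per second
}

// 1080p60 with 10-bit 4:2:2 sampling averages 20 bits per pixel.
const bps = st2110VideoBitrate(1920, 1080, 20, 60);
console.log((bps / 1e9).toFixed(2) + ' Gb/s'); // prints "2.49 Gb/s"
```

That single HD flow is roughly 2.5 Gb/s, which is why 2110 facilities are built around 10/25/100 GbE links rather than the 1 GbE that office IT is used to.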

Many of you know that Grass Valley’s AMPP (Agile Media Processing Platform) is the company’s cloud-native, software-defined response to the scalability and flexibility needs that SMPTE ST 2110 helped unlock in IP-based media production. Looking into AMPP quickly pointed back to 2110.

AMPP uses 2110-like packetized flows and timing models, wrapping them in software-defined control, orchestration, and elasticity; that is what lets AMPP spin resources up or down on demand. Building on the 2110 transport foundation, it shifts many production elements into the cloud or a hybrid IP domain.

2110 itself can work in both on-premises and off-premises setups. AMPP focuses on virtualized video and audio tasks like processing, switching, mixing, and playout, and is designed for data centers or public cloud environments. As someone other than me said, "AMPP is 2110’s philosophy taken to the cloud."

Alas, what I thought would be a week-or-two foray into the 2110 realm turned into a six-month trek to gain enough knowledge to write competently about it. I decided to document what I found, and from that came the idea for a course. While it isn't done, I will head back to the Grass Valley story. But I thought I would offer my musings to others and see whether anyone besides me finds this journey interesting.

I ended up talking to others in my situation. If you had not ridden the 2110 "wave" since it started shoaling in this century's "teens," you missed the time when everyone in the game was learning together. As with previous technological revolutions in the industry, you could check off the required boxes one by one as you learned with others. That wave "crashed" upon the scene just before 2020, and has long since retreated back into the sea. The community learning moment was over, and latecomers truly were left to self-train. IP media workflows became assumed knowledge.

At the beginning of this year, I came to know enough about ChatGPT to start asking questions about the subject: first, high-level questions. One important early question was the order in which to study the subject matter. From there I treated ChatGPT as my mentor and kept asking more detailed questions. It helped that I have an engineering background and many years in broadcast and general project engineering. I knew how to troubleshoot, and I had spent enough time around IT environments and people to basically understand the subject. I also knew how to ask questions that, over time, would link the IT world to the 2110 overhead that turns computer networks into real-time media engines. It helped me greatly that I already understood television and broadcast production workflows.

I had a decent grasp of HTML, CSS, and JavaScript (the client-side web stack). While I never considered myself a “web developer,” I knew enough to use the tools. When I waded into the AI world I discovered, like 100 million people before me, that I was wasting my time coding by hand. What took me a day or two to get a piece of code to beta took ten minutes of conversation with ChatGPT, and it brought much more finesse to the coding process than I could even dream of.

Now, those of you a continent ahead of me already know this: most of us who are not in the top few percent of coding brilliance (and I personally can see the bottom much more clearly than the top) will work in software in the future more as program managers than as software writers. This has been an ongoing engineering story for a long time. Your EE degree is good for 10 to 20 years. After that, most of us had better move up the food chain, from low-level circuit design to wider levels of system design and management. Yes, there are always a few who happen to be in the right place, with the right mindset, who get to stay in the basic design game longer.

So, the website I've put together to this point, with me as scribe and student, will hopefully bring a new tool to the 2110 world. I hope you find it useful. I plan to make it a living, evolving site. I’d like to think that what I brought was enough knowledge to know what questions to ask. Hopefully others will see it as a worthwhile endeavor. Oh yeah, it’s free! If you think I missed any relevant detail, let me know; I want it to help others.

If you spend any time on the site, you will see the second week of this "two-week" course is not complete. I will continue to work on that. You will not see any actual real-world setup or tools. I had AI mock up all the panels on the site. If anyone has real-world examples to share, that would be great. Or, if you can let me create them by giving me access to a lab, I’d really appreciate it!

Those of you who have been following my Grass Valley story (which I am getting back to now) know of Dr. Donald Hare. Early on, I asked ChatGPT what it knew about Dr. Hare. It said he was born in Los Angeles (actually, it was Fresno), went to school at UC Berkeley (no, Stanford), and worked on submarine development during the Second World War (actually, he worked on ways to detect subs). Pretty close to the truth. No?

When I asked again later, the answer was much closer to the truth. A legitimate concern might be: who's teaching whom? I believe that in this particular case it is me learning 2110.

Quite a long time ago, Microsoft did a study into the Nigerian Prince scam. It seems ludicrous that people fall for the trap, but the finding as to why the scam worked was simple: the scammers wanted the most naive of the naive to self-identify. They didn't want to waste time on people who had any doubt about what was being offered; they aimed for the most gullible, people who could be led to the desired result quickly and at a high rate.

Worried that maybe I had self-identified as a mark to AI, I did some homework.

It is found here ----> Grouchy Engineer


If you take issue with the approach, claims, or conclusions located here, please let me have it. My excuse is that AI got me here. That's my excuse, and I'm sticking to it!

In Conclusion - click to play audio
Yeah, I'm sticking to it!
Turn speaker on!